How much Arctic ice do you melt per year? New study has the answer

The average American adult generates enough carbon through their day-to-day activities to melt approximately 50 square meters (538 square feet) of Arctic sea ice per year, researchers from the Max Planck Institute for Meteorology in Hamburg, Germany, claim in a new study.

In fact, as lead author and climate scientist Dr. Dirk Notz and his colleagues reported Thursday in the journal Science, every metric ton of carbon dioxide released into the atmosphere by people is directly responsible for the loss of three square meters (32 square feet) of Arctic sea ice.

The study was designed to demonstrate the direct impact that individuals are having on the global climate, the Institute explained in a statement, and also demonstrates that driving a motor vehicle just 90 miles causes nearly a square foot of Arctic ice to melt, CBS News added.

Dr. Notz explained that his team used observations, statistics and 30 different computer models to calculate how much sea ice was lost per ton of CO2 emitted, then used mathematical formulas to determine the impact of the average individual’s actions on that sea ice when it is at its lowest point during the month of September, according to CBS and the Los Angeles Times.
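As a rough back-of-the-envelope illustration of that arithmetic, the sketch below multiplies the study's headline figure of roughly three square meters of September sea ice lost per metric ton of CO2 by some approximate emission figures; the per-person and per-flight numbers are illustrative assumptions chosen to reproduce the quantities quoted in this article, not values taken directly from the paper.

```python
# Back-of-the-envelope conversion from CO2 emissions to Arctic sea-ice loss,
# using the study's headline figure of ~3 square meters per metric ton of CO2.
ICE_LOSS_M2_PER_TONNE = 3.0

# Illustrative emission estimates (approximate assumptions, not study values).
us_person_tonnes_per_year = 16.5        # rough annual CO2 footprint of an average American
london_sf_return_flight_tonnes = 1.7    # rough per-seat estimate for a return flight

print(us_person_tonnes_per_year * ICE_LOSS_M2_PER_TONNE)       # ~50 m^2 per person per year
print(london_sf_return_flight_tonnes * ICE_LOSS_M2_PER_TONNE)  # ~5 m^2 per return flight
```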

Study reveals how individual actions contribute to climate change

“For us, this is really the first time that we do have an intuitive understanding of how our individual actions really contribute to global warming,” Dr. Notz, who authored the new study with Julienne Stroeve from the National Snow and Ice Data Center in Colorado and University College London, said in an interview with the Times.

“So far, when we talked about global warming, it was always these very big numbers, like billions of tons of carbon dioxide – or very small numbers, like 0.1 degree of temperature change,” he added. “But now suddenly, with this three-square-meter loss per ton of CO2, it gives a very, very concrete and intuitive understanding of how we all cause Arctic sea ice to melt.”

As a greenhouse gas, carbon dioxide traps heat in the planet’s atmosphere, causing temperatures to increase. This, in turn, causes ice around the poles to melt and the sea level to rise, resulting in even more pronounced changes to the climate, according to experts in the field. Human activity, primarily the burning of fossil fuels, is responsible for most of the CO2 emitted worldwide, but researchers had never calculated the impact of the average person – until now.

“Climate change has often felt like a rather abstract notion,” Stroeve said in a statement. “Our results allow us to overcome this perception. For example, it is now straight-forward to calculate that the carbon dioxide emissions for each seat on a return flight from, say, London to San Francisco causes about 5 square meters of Arctic sea ice to disappear.”

The study authors also examined the link between CO2 emissions and sea-ice loss. As Dr. Notz explained, “Put simply, for each ton of carbon dioxide emission, the climate warms a little bit. To compensate for this warming, the sea-ice edge moves northward to a region with less incoming solar radiation. This then causes the sea-ice area to shrink. Simple geometric reasons cause these processes to combine to the observed linearity.”

—–

Image credit: Thinkstock

Study claims your brain may be hard-wired for racism

The human brain may be hard-wired to pick up negative stereotypes regarding groups that tend to be portrayed unfavorably by the media, a neuroscientist from University College London and his colleagues reported in a recent edition of the Journal of Cognitive Neuroscience.

According to The Guardian and the Daily Mail, a group led by Dr. Hugo Spiers of UCL found that the brain has a stronger response to information about groups that are portrayed negatively, suggesting that unfavorable depictions of ethnic or religious minorities can lead to bias.

Dr. Spiers and his colleagues recruited 22 individuals, presented them with descriptions of fictional majority and minority social groups, and used functional MRI (fMRI) scans to monitor their brain activity as they learned about both of these groups.

The two main groups were secretly designated as being “good” or “bad,” with two-thirds of the information presented to the subjects fitting the stereotype and the other one-third running to the contrary. The fMRI scans revealed that, as the participants formed a view of each group, activity in a part of the brain called the anterior temporal pole reflected an acquired prejudice.

When subjects were exposed to enough stories to feel that one group was essentially good, activity in the anterior temporal pole rapidly began to decline, the researchers said. However, it continued to respond strongly to negative news regarding the so-called “bad” group.

Study could reveal the brain’s role in the formation of prejudices

By using this brain activity, Dr. Spiers told The Guardian, scientists could potentially “mathematically track prejudice second by second” to determine a person’s level of bias. Furthermore, the scans revealed that the brain had a different response to good and bad news.

“The negative groups become treated as more and more negative. Worse than the equivalent for the positive groups,” the UCL neuroscientist told the UK newspaper. “[But] whenever someone from a really bad group did something nice they were like, ‘Oh, weird.’”

While previous research has identified the parts of the brain involved in racial or gender-based stereotyping, the authors said that this study marks the first time that scientists have attempted to discover how the brain becomes conditioned to link undesirable traits with specific groups, thus laying the groundwork for the formation of prejudices over time.

“The newspapers are filled with ghastly things people do… You’re getting all these news stories and the negative ones stand out,” Dr. Spiers said to The Guardian. “When you look at Islam, for example, there’s so many more negative stories than positive ones and that will build up over time.”

He also told reporters that future research in the field may reveal whether or not differences in brain structure could explain why some individuals hold racist or sexist views, and could also explore how a person “unlearns” a stereotype – i.e., whether or not the anterior temporal pole is involved in that process as well.

—–

Image credit: Thinkstock

This prototype experiment could “sniff” for life on other planets

NASA researchers are currently working on a new laser-based tool, based on existing military technology, that could one day be used to “sniff” out life on Mars or other planets by detecting the signatures of oxygen, methane or other atmospheric gases from the surface.

According to Mashable and Washington Post reports, the prototype device is known as the Bio-Indicator Lidar Instrument (BILI) and is based on a sensing technique currently used by the US armed forces to monitor the air for potentially dangerous chemicals, toxins, and pathogens.

The device, which is unlikely to be put into use anytime soon, is unable to directly confirm the existence of biological life on other planets, but would be capable of searching for amino acids and other organic molecules that serve as the building blocks for such organisms, officials with NASA’s Goddard Space Flight Center in Greenbelt, Maryland explained.

While the agency has used instruments similar to BILI to detect chemicals in Earth’s atmosphere as part of its climate-based research, it has yet to utilize the technology on other worlds, Branimir Blagojevic, a technologist from Goddard now working on the project, said in a statement. “If the agency develops it,” he noted, “it will be the first of a kind.”

Device could operate autonomously, reduce sample contamination risk

BILI is a fluorescence-based lidar, meaning that, similar to radar, it would use remote-sensing instruments to detect and analyze particles. Instead of using radio waves, however, it would use light to detect organic bio-signatures from the surface of Mars or elsewhere in the cosmos.

As mentioned above, the device is based on military technology currently in use here on Earth – technology that Blagojevic had previously worked on before joining NASA. As the Washington Post said, it would emit ultraviolet laser pulses which would bounce back upon hitting particles, exciting their electrons and helping scientists determine their size and approximate age.
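For readers unfamiliar with lidar, the basic ranging idea is the same as radar: the delay between sending a pulse and seeing its return gives the distance to whatever scattered it. The short sketch below shows that standard calculation; it is a generic illustration of the principle, not a detail of BILI's design.

```python
# Generic lidar ranging: light travels out to the target and back, so the
# one-way distance is half the round-trip time multiplied by the speed of light.
C = 299_792_458.0  # speed of light in m/s

def range_from_return_time(round_trip_seconds: float) -> float:
    """Distance in meters to the particles that scattered the pulse."""
    return C * round_trip_seconds / 2.0

# A return arriving ~2 microseconds after the pulse corresponds to roughly 300 m,
# on the order of the "several hundred meters" detection range quoted for BILI.
print(range_from_return_time(2e-6))  # ~299.8 m
```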


An image showing how the detector might eventually work. (Credit: NASA)

The researchers said that they envision BILI as “a rover’s sense of smell,” which means that it would be positioned on the vehicle’s mast and would first scan the terrain for dust plumes. Upon detection, it would then use its UV lasers, causing particles contained in those plumes to glow. “If the bio-signatures are there,” Blagojevic said, “it could be detected in the dust.”

BILI would be of tremendous value because it would be able to detect small levels of complex organic molecules from several hundred meters away in real time, allowing it to autonomously search for signs of life in plumes located atop recurring slopes, which are difficult for rovers to traverse. Also, since it works from a distance, it would reduce the potential risk of contamination of the samples it collects, according to NASA.

“This makes our instrument an excellent complementary organic-detection instrument, which we could use in tandem with more sensitive, point sensor-type mass spectrometers that can only measure a small amount of material at once,” said Blagojevic. “BILI’s measurements do not require consumables other than electrical power and can be conducted quickly over a broad area. This is a survey instrument, with a nose for certain molecules.”

—–

Image credit: NASA

The soda industry funded studies to downplay negative health effects

Studies financially aided by the sugary drinks industry consistently fail to find a link between their products and problems such as obesity and diabetes, a University of California study has found.

The results of industry-funded studies were compared with independent studies as part of the research.

Dr. Dean Schillinger, professor at the University of California San Francisco and study lead author, believes that the potentially unreliable drinks industry studies are a reason why links between obesity and sugary drinks are considered to be controversial rather than definitive.

“That controversy has been manufactured by the industry itself, which is harnessing the science to its own ends,” said the professor.

The research by Schillinger and his colleagues used PubMed, an online compendium of scientific research, to look at the findings of 60 experimental studies which examined how metabolic outcomes were affected by the consumption of sugary drinks.

Of those studies, 34 positively associated sugary beverages with obesity, while 26 found no association. The 26 studies which found no association were all funded by the drinks industry; only one of the 34 with a positive link had a connection to the industry.

Companies such as Nestlé and PepsiCo provided funding for the studies, either through direct financial support or through payments to the authors as consultants.

“It’s interesting that they looked at experimental studies,” said Jennifer Harris, a social psychologist at the University of Connecticut Rudd Center for Food Policy and Obesity, who wasn’t involved in the study. “[They are] designed to show a causation effect. If you take out the studies that were industry-funded, there is no controversy. That makes this an important study.”
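Re-tabulating the counts quoted above makes the split stark; the short sketch below is just that bookkeeping, not a reconstruction of the authors' statistical analysis.

```python
# Counts as reported above: 60 experimental studies in total.
industry_funded = {"association found": 1, "no association": 26}
independent     = {"association found": 33, "no association": 0}

for label, counts in (("industry-funded", industry_funded), ("independent", independent)):
    total = sum(counts.values())
    share = counts["association found"] / total
    print(f"{label}: {counts['association found']}/{total} studies found an association ({share:.0%})")
```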

Susceptibility to bias

Adela Hruby, an expert in nutritional epidemiology who also wasn’t involved in the study, wants to know what has been omitted.

“Why only PubMed?” she asked. “And why are they limited to publications between 2001 and 2016?”

Schillinger’s research funding was provided by the National Institutes of Health. He has, however, revealed that he was previously paid by San Francisco County in connection with its ordinance regarding billboard advertising for sugar-sweetened drinks.

Hruby concedes that Schillinger’s study is bolstered by a study published in AIM in 2013. “I don’t think this study is hugely novel, but it does show our susceptibility to bias.”

“Susceptibility needs a closer look,” commented Schillinger, “to determine how and when biased results occur.”

“What was it about the conduct of the scientists that introduced the bias?” he asked. “We need to pay attention to conflicts of interest. And if we’re going to use science to influence consumers’ health decisions, we have to have them think more critically by asking who funded the study.”

—–

Image credit: Thinkstock

Chinese firm plans to build the world’s fastest commercial maglev

A Chinese train manufacturer that is among the largest such outfits on the planet is looking to develop a new type of magnetic levitation (maglev) train that will reportedly be able to travel at record-setting speeds of 600 km/hr (373 mph), various media outlets are reporting.

According to The Verge, the China Railway Rolling Stock Corporation (CRRC) has announced that it recently began work on an approximately three-mile (5 km) track on which to test out the new train, which would easily become the fastest-traveling maglev locomotive in operation.

Currently, the fastest such train being used commercially is the Shanghai Maglev, which Smart Rail World reports travels from China’s second most populous city to the Pudong International Airport at a velocity of 268 mph (431 km/hr). However, it should be noted that reports indicate that a noncommercial US Air Force maglev has achieved a speed of 633 mph (1019 km/hr).

Furthermore, Chinese press agency Xinhua reported that CRRC is also working on a relatively tame 124 mph (200 km/hr) maglev train, and that the projects are being undertaken with the goal of establishing new technological standards for medium and high-speed maglev trains which can be used in similar systems all over the world.

New project is part of China’s recent commitment to public transport

In addition to being home to the fastest commercial maglev train system, China can also boast the largest high-speed rail network on Earth, according to The Verge. The country currently has more than 12,400 miles of completed track thanks to $538 billion in government funding.

CRRC official Sun Bangcheng told Xinhua that the new train would consume 10% less energy than the 350 km/h (217 mph) bullet trains currently in use in China. No word yet as to when the company plans to have this ambitious project completed, or how much all of this will cost.

CRRC is also working on rail projects beyond its own national borders, as it is currently leading projects underway in Australia, Indonesia, Iran, Mexico, Russia, Turkey, Thailand and the UK. This comes just two years after China first unveiled plans for an ambitious maglev system in Changsha capable of traveling 3000 km/hr (nearly 1865 mph). While that line is now operational, it fell well short of that ambitious goal, topping out at a relatively modest 100 km/hr (62 mph).

Maglev trains, for those who may not know, are able to achieve far faster speeds than traditional trains because they replace the wheel-and-track system traditionally used by locomotives with an air cushion and electromagnets that simultaneously pull the front of the train and push it from the rear. Such systems are much faster and more sustainable, but also far more expensive, The Verge noted.

—–

Image credit: Thinkstock

‘Bionic spinach’ can be used to detect bombs

Long known to be a good source of iron and other minerals and vitamins, spinach is believed to lead to stronger muscles, improved eyesight, and healthy blood pressure – and in the right hands, it can be used as a bomb detector, according to a new Nature Materials study.

As BBC News and the Daily Mail are reporting, researchers from the Massachusetts Institute of Technology (MIT) implanted tiny tubes into the leaves of spinach plants which enabled them to pick up nitro-aromatics – chemicals which are found in landmines and buried explosives.

Specifically, they embedded nanoparticles and carbon nanotubes into the leaves. As the plants gather groundwater, scientists can tell in real time if it contains any nitro-aromatics. Shining a laser onto the leaves causes the implanted nanotubes to give off near-infrared fluorescent light, which can be detected wirelessly by a special camera outfitted to a handheld device.

That device, which can either be an inexpensive Raspberry Pi computer or a smartphone which has had its infrared filter removed, then generates an email to the user to notify him or her about the chemical levels detected in the groundwater. Study co-author professor Michael Strano said that the work was an important proof-of-concept that could lead to bigger, better things.
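As a sense of how simple the alerting step can be, here is a minimal sketch of a threshold-and-email script of the kind a Raspberry Pi could run; the threshold, the addresses, and the assumption of a local mail relay are placeholders for illustration, not details of the MIT team's actual setup.

```python
# Hypothetical sketch: compare a fluorescence reading against a threshold and
# email an alert. All values and addresses below are illustrative placeholders.
import smtplib
from email.message import EmailMessage

ALERT_THRESHOLD = 0.8  # arbitrary cutoff for a "significant" near-infrared signal

def send_alert(intensity: float) -> None:
    msg = EmailMessage()
    msg["Subject"] = "Nitro-aromatic signal detected in groundwater"
    msg["From"] = "spinach-sensor@example.org"   # placeholder sender
    msg["To"] = "researcher@example.org"         # placeholder recipient
    msg.set_content(f"Near-infrared fluorescence reading: {intensity:.2f}")
    with smtplib.SMTP("localhost") as server:    # assumes a local mail relay
        server.send_message(msg)

def check_reading(intensity: float) -> None:
    if intensity >= ALERT_THRESHOLD:
        send_alert(intensity)

check_reading(0.93)  # a reading above the threshold triggers the email
```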

‘Plant nanobionics’ can be used to predict drought, detect pollutants

In a statement, he called the new study “a novel demonstration of how we have overcome the plant/human communication barrier,” noting that plants could be altered to warn about drought conditions or pollutants. In fact, as Strano told BBC News, “our paper outlines how one could engineer plants like this to detect virtually anything.”

Previously, the professor’s lab successfully demonstrated how carbon nanotubes could be used as sensors to detect TNT, hydrogen peroxide, and even sarin nerve gas. They do so by changing the way they glow when exposed to a specific target molecule, demonstrating a new approach to engineering which the MIT researchers have dubbed “plant nanobionics.”

“The plants could be used for defense applications, but also to monitor public spaces for terrorism-related activities, since we show both water and airborne detection. Such plants could be used to monitor groundwater seepage from buried munitions or waste that contains nitro-aromatics,” the professor told BBC News.

With their current set-up, his team said that they can detect signals from the plant at a distance of about one meter, but noted that they are working to expand that distance. Furthermore, they think that the approach could also be adapted to engineer other types of sensors, or even to create other types of bionic plants, including those that could change colors or pick up radio signals.

“Plants are very environmentally responsive. They know that there is going to be a drought long before we do. They can detect small changes in the properties of soil and water potential,” Strano explained. “If we tap into those chemical signaling pathways, there is a wealth of information to access.”

—–

Image credit: Christine Daniloff/MIT

Children raised by gay parents aren’t any different from their peers, study finds

Children adopted by homosexual parents appear to be as well-adjusted through middle childhood as those raised by heterosexual mothers and fathers, according to research published last week by the American Psychological Association (APA) journal Developmental Psychology.

In the study, Dr. Rachel H. Farr, an assistant professor in the University of Kentucky Department of Psychology, recruited 96 adoptive families, half of which featured gay or lesbian parents and half of which featured heterosexual parents. She then assessed the wellbeing of the children at two key points in their lives: at preschool age, and again about five years later.

She used reports of behavior problems from parents and teachers to assess the children, as well as self-reported levels of parenting stress along with other input from the adoptive caregivers to evaluate parent outcomes and overall family functioning. The analysis revealed no difference in children based on the sexuality of their parents, according to the St. Louis Post-Dispatch.

“To the best of my knowledge, this is the first study that has followed children adopted by lesbian, gay and heterosexual parents over time from early to middle childhood,” Farr said in a statement. “Regardless of parental sexual orientation, children (in the study) had fewer behavior problems over time when their adoptive parents indicated experiencing less parenting stress.”

She added that her research revealed “no differences among (heterosexual and same-sex parent) family types” in terms of behavior problems, stress levels, family functionality and other factors, and that the findings “may be informative to legal, policy and practice realms.”

Stress levels, not sexual orientation, key to good behavioral outcomes

Specifically, Farr explained in an interview with MedicalResearch.com that her research found that adjustment levels among children, parents and couples, as well as overall family functioning, were essentially no different based on parental sexual orientation once the children reached school age. Rather, behavior problems in kids were predicted by earlier issues of child adjustment and parenting stress.

Furthermore, the study concluded that children tended to develop well over time and experience few behavior problems regardless of whether they were adopted by lesbian mothers, gay fathers or heterosexual couples, and that youngsters of elementary school-age tended to report relatively high levels of family functioning, regardless of the sexual orientation of their parents.

Farr told MedicalResearch.com that the findings were “consistent with and extend previous literature about families headed by LG parents, particularly those that have adopted children.” She noted that the results had “implications for advancing supportive policies, practices, and laws related to adoption and parenting by sexual minority adults” and “may also help to move public debate forward about parenting and child outcomes across a diversity of family forms.”

“Regardless of parental sexual orientation, children (in the study) had fewer behavior problems over time when their adoptive parents indicated experiencing less parenting stress,” the professor added in a statement. “Thus, in these adoptive families diverse in parental sexual orientation, as has been found in many other family types, family processes emerged as more important than family structure to longitudinal child outcomes and family functioning.”

—–

Image credit: Thinkstock

NASA’s ‘Intruder Alert’ system spots an asteroid heading toward Earth

While most of us were sleeping Sunday night, an asteroid zoomed past the Earth at a distance of just over 300,000 miles (about 483,000 km) – but don’t worry, because NASA’s new space rock detection system was tracking the object and knew there was virtually no chance of impact.

So what is this new technology keeping our planet safe from harm? According to NPR and other media outlets, it’s a computer program named Scout that is currently being tested at the space agency’s Jet Propulsion Laboratory (JPL) in Pasadena, California, and it constantly looks at data collected by observatories all over the world, searching for nearby asteroids.

If it discovers one of these so-called “near-Earth objects,” it then quickly calculates whether the planet is in any real danger, and if so, instructs telescopes to continue monitoring the space rock. In this case, it spotted the asteroid known as 2016 UR36 on October 25, giving NASA five days to prepare a response if needed – far more than they had in the past, said ScienceAlert.

“When a telescope first finds a moving object, all you know is it’s just a dot, moving on the sky,” JPL astronomer Paul Chodas told NPR. “You have no information about how far away it is.”

“The more telescopes you get pointed at an object, the more data you get, and the more sure you are of how big it is and which way it’s headed. But sometimes you don’t have a lot of time to make those observations,” he added. Thanks to Scout, however, they now have a bit longer to prepare for a potential asteroid impact and to come up with a way to mitigate the threat.

Scout, Sentry teaming up to keep Earth safe from dangerous space rocks

In the case of 2016 UR36, the asteroid was detected on October 25 by the Panoramic Survey Telescope and Rapid Response System (PAN-STARRS) array in Maui, Hawaii, Motherboard and ScienceAlert explained. Within 10 minutes, Scout was aware of the threat and calculated a number of different potential flight paths, some of which intersected with the Earth.

The program then proceeded to notify other observatories to conduct additional observations of the asteroid, and within a matter of hours, it had determined that the rock would fly by the planet at a distance of 310,000 miles (about 499,000 km) – meaning it would be further away than the moon. Furthermore, it determined that 2016 UR36 was between 5 and 25 meters across.

Scout, which is currently in the testing phase, could become fully operational before the end of the year, according to NPR. The main goal of the project, JPL’s Davide Farnocchia noted, is to “speed up the confirmation process” – and while five days’ notice may not seem like much, it is far more than the 19-hour window separating the detection of 2008 TC3 from its explosion over the Sudan in October 2008, which occurred 12 hours after it was declared a threat.

While Scout is used to detect smaller objects, NASA also uses a complementary program, known as Sentry, to keep tabs on larger potential impactors. Sentry tracks near-Earth objects larger than 140 meters in length, and has to date compiled a list of more than 650 with the potential to cause significant damage, should they reach the planet’s surface. By spotting such threats early, NASA hopes that they will have enough time to stop them and prevent catastrophic damage.

“If you know well in advance – and by well in advance I mean 10 years, 20 years, 30 years in advance, which is something we can do,” Ed Lu, CEO of an asteroid threat organization called B612, told NPR, “then you can divert such an asteroid by just giving it a tiny nudge when it’s many billions of miles from hitting the Earth.”

—–

Image credit: NASA

Brain region responsible for the ‘placebo effect’ identified

By using functional magnetic resonance brain imaging (fMRI) technology, researchers from the Rehabilitation Institute of Chicago (RIC) and Northwestern University have identified the part of the human brain responsible for the pain-relief phenomenon known as the placebo effect.

As reported earlier this week by Psych Central and the Pain News Network, Dr. Marwan Baliki, a research scientist at RIC as well as an assistant professor at Northwestern University Feinberg School of Medicine, and his colleagues identified a region in the front part of the brain (the mid frontal gyrus) responsible for providing real relief when patients take phony treatments.

The researchers conducted two separate trials involving 95 patients suffering from chronic pain due to osteoarthritis. In the first, they found that the mid frontal gyrus was better connected with other parts of the brain in approximately half of the patients, and that those patients were more likely to respond to the placebo effect. In the second trial, the initial findings were validated with 95% certainty.

“Given the enormous societal toll of chronic pain, being able to predict placebo responders in a chronic pain population could both help the design of personalized medicine and enhance the success of clinical trials,” Dr. Baliki said in a press release. His team’s findings were published Thursday in the open-access, peer-reviewed scientific journal PLOS Biology.

Findings could mark the end of trial-and-error pain treatments

Identifying the region of the brain responsible for the pain-killing placebo effect could result in the development of personalized treatment options for the millions of people in the US suffering from chronic pain, the researchers explained. It could also lead to more precise clinical trials for pain medications by disqualifying potential subjects that respond strongly to placebos.

As part of their work, Dr. Baliki and his colleagues used fMRI and a standard clinical trial design to find a brain-based neurological marker that could predict pain-relieving effects in patients who experienced chronic knee pain due to osteoarthritis. In fact, more than half of the participants reported significant pain relief, according to the study authors, and if future studies can expand on these results, they could provide a brain-based predictive option for individual patients.

That, the researchers said, would decrease the exposure of patients to ineffective therapies while also decreasing the time and magnitude of pain and opioid use. In short, it would prevent doctors from having to use a trial-and-error approach to treatments which might expose patients to drugs that could be ineffective and potentially dangerous.

“The new technology will allow physicians to see what part of the brain is activated during an individual’s pain and choose the specific drug to target this spot,” said co-author Dr. Vania Apkarian, a physiology professor at Northwestern. “It also will provide more evidence-based measurements. Physicians will be able to measure how the patient’s pain region is affected by the drug.”

—–

Image credit: Thinkstock

New Horizons finally finished transmitting its Pluto data

This week marked the end of an era for NASA researchers, as the space agency’s New Horizons mission transmitted the last data collected during its history-making July 2015 flyby of the dwarf planet Pluto via downlink to the Deep Space Network facility in Canberra, Australia.

“The Pluto system data that New Horizons collected has amazed us over and over again with the beauty and complexity of Pluto and its system of moons,” principal investigator Alan Stern from the Southwest Research Institute in Boulder, Colorado, explained Thursday in a press release.

“There’s a great deal of work ahead for us to understand the 400-plus scientific observations that have all been sent to Earth,” he added. “And that’s exactly what we’re going to do – after all, who knows when the next data from a spacecraft visiting Pluto will be sent?”

The final data downlink comes 15 months after New Horizons completed its flyby of Pluto and its moons – a mission that saw the spacecraft travel more than 3.1 billion miles. During its pass, it collected nearly 100 times more data than it could have sent home before moving on, as it was programmed to select high-priority data first before sending the rest starting in Sept. 2015.

So what was in the spacecraft’s final data transmission?

The last data transmission consisted of a digital observation sequence of Pluto and the largest of its moons, Charon, collected by the probe’s Ralph/LEISA imager on July 14, 2015, according to GeekWire. It was sent to the Johns Hopkins Applied Physics Laboratory (APL) in Maryland, via the Canberra Deep Space Network station at 5:48am EDT last Tuesday.

In total, New Horizons sent back well over 50 gigabits of data pertaining to the Pluto system, the website added. That doesn’t sound like a lot of data, so what took so long for the probe to send it all back to Earth? The answer, as it turns out, is that the spacecraft needed to transmit information at speeds of a mere 2,000 bits per second, or roughly the same speed as a 1980s modem.
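A quick back-of-the-envelope calculation shows why the downlink stretched over so many months; the sketch below assumes a constant 2,000 bit-per-second link running nonstop, which the real mission did not have, so the true figure is longer still.

```python
# Rough downlink-time estimate from the figures quoted above.
data_bits = 50e9        # "well over 50 gigabits" of Pluto-system data
link_bps = 2_000        # roughly the speed of a 1980s modem

seconds = data_bits / link_bps
print(seconds / 86_400)  # ~289 days of continuous transmission, before any overhead
```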

“We have our pot of gold,” said Alice Bowman, Mission Operations Manager from APL.

Although scientists will still need to fully analyze these newly-delivered observations, the Pluto chapter of New Horizons’ story is now officially closed. Next up for the spacecraft is a voyage to the Kuiper Belt and a flyby of a tiny, reddish-colored Kuiper Belt object known as 2014 MU69. That encounter is currently scheduled to begin on January 1, 2019.

“We’re excited about the exploration ahead for New Horizons, and also about what we are still discovering from Pluto flyby data,” Stern said in a statement earlier this month. “Now, with our spacecraft transmitting the last of its data from last summer’s flight through the Pluto system, we know that the next great exploration of Pluto will require another mission to be sent there.”

—–

Image credit: NASA

Stephen Hawking and Mark Zuckerberg team up to search for alien life

KIC 8462852 – also known as Tabby’s Star – has sparked a lot of excitement recently for its unusual dimming pattern.

Although the pattern is likely due to a swarm of comets or the remnants of a smashed planet blocking out light from the star, that hasn’t stopped people from being drawn to the highly remote possibility of an alien ‘megastructure’ being the cause of the blinking pattern.

Now, the University of California, Berkeley has announced plans to examine the curious star with the Green Bank radio telescope through its Breakthrough Listen project. The telescope is nearly 330 feet wide and located in West Virginia.

“The Breakthrough Listen program has the most powerful SETI (search for extraterrestrial intelligence) equipment on the planet, and access to the largest telescopes on the planet,” Andrew Siemion, director of the Berkeley SETI Research Center, said in a news release. “We can look at it with greater sensitivity and for a wider range of signal types than any other experiment in the world.”

Searching for Alien Life

The Breakthrough Listen project was established last year thanks to $100 million in funding over 10 years from the Breakthrough Prize Foundation, created by internet investor Yuri Milner. The project won’t be the first such program to scrutinize Tabby’s Star for signs of intelligent life.

“Everyone, every SETI program telescope, I mean every astronomer that has any kind of telescope in any wavelength that can see Tabby’s star has looked at it,” Milner said. “It’s been looked at with Hubble, it’s been looked at with Keck, it’s been looked at in the infrared and radio and high energy, and every possible thing you can imagine, including a whole range of SETI experiments. Nothing has been found.”

While the UC Berkeley team said they are skeptical the star’s odd pattern is an indication of a sophisticated alien civilization, they say it still deserves study. Observations of the star are slated for 8 hours a night for three nights over the next two months. The researchers have traveled to the Green Bank Observatory in rural West Virginia to begin the observations, and said they plan to accumulate around 1 petabyte of data over hundreds of millions of radio channels.

—–

Image credit: Thinkstock

Meet the bird that can fly non-stop for 10 months straight

Even the best distance runners and long-distance swimmers on the planet need a break every once in a while, but scientists have found a species of bird that can fly for exceptionally long periods during its annual migration without needing to stop and catch its breath.

According to NPR and National Geographic, researchers from Lund University in Sweden have monitored the flight patterns of the common swift (Apus apus) and found that the birds can spend nearly their entire 10-month nonbreeding period soaring through the skies without stopping.

Specifically, the husband-and-wife team of Anders Hedenström and Susanne Åkesson fitted 19 swifts with lightweight data loggers in 2013, which they used to identify the position of the birds and to record their body position, activity, and acceleration levels in order to determine whether they were flying or resting at any given time.
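To give a feel for how logger readings translate into a flying-or-resting call, here is a toy sketch; the threshold, the sample values, and the units are made up for illustration and are not the classification the researchers actually used.

```python
# Toy activity classifier: sustained accelerometer activity suggests flight,
# long flat stretches suggest rest. Threshold and sample values are invented.
def classify_activity(accel_samples, threshold=0.2):
    """Label a window of acceleration magnitudes as 'flying' or 'resting'."""
    mean_level = sum(abs(a) for a in accel_samples) / len(accel_samples)
    return "flying" if mean_level > threshold else "resting"

print(classify_activity([0.9, 1.1, 0.7, 1.3]))    # flying
print(classify_activity([0.01, 0.0, 0.02, 0.0]))  # resting
```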

This bird doesn’t stop to eat, drink, or maybe even sleep.

As they reported this week in the journal Current Biology, they found that the birds spent at most 0.64% of their migration on the ground, and that some of the birds were almost constantly active during this time. In fact, one of them rested for just four days total between September 2013 and April 2014 and stopped for just two hours during the same time period the following year.

“They feed in the air, they mate in the air, they get nest material in the air,” Åkesson explained to Nat Geo on Thursday. “They can land on nest boxes, branches, or houses, but they can’t really land on the ground” because their wings are too long and their legs too short to take flight from a flat surface, she added.

Study findings suggest that swifts sleep while in midair

Some of the swifts were forced to take breaks more frequently, Åkesson and Hedenström noted. They believe that this was because their tail and wing feathers had not been completely replaced following their annual molt, compromising their ability to remain airborne. Even so, not one of them spent even 1% of their 10-month nonbreeding period on the ground.

Common swifts are known to fly south from Europe to sub-Saharan Africa, NPR said, but they apparently do not land there, as scientists have yet to discover any roosting sites in that region of the world. The rarity of the stops, the authors told Nat Geo, suggests that they typically stop only for bad weather – indicating that the species may not need to land in order to sleep.

In fact, Åkesson and Hedenström wrote that it is unclear “when and to what extent swifts need to sleep,” and they’re not alone in that regard, according to reports. Earlier studies have discovered that alpine swifts (which are twice the size of common swifts) may have flown from Switzerland to Africa without stopping for 200 days. These lengthy nonstop flights indicate that swifts might be able to sleep while flying, much like frigate birds have been proven to do.

The researchers also found that the birds appeared to be less active during the day than they were at nighttime, which they explained was “most likely due to prolonged gliding episodes during the daytime when soaring in thermals.” They also wrote that the results “have important implications for understanding physiological adaptations to endure prolonged periods of flight, including the need to sleep while airborne.”

—–

Image credit: Stefan Berndtsson

First-ever fossilized dinosaur brain tissue discovered

Paleontologists have confirmed the discovery of the first ever fossilized dinosaur brain tissue, a 130-million-year-old specimen that was discovered in the UK and which resembled a tiny brown pebble, according to new research reported Thursday by the Geological Society of London.

According to the Guardian and the New York Times, the brain tissue was originally discovered in East Sussex, England by an amateur fossil hunter named Jamie Hiscocks in 2004. The fossil was a cast of the dinosaur’s brain cavity which appeared to contain mineral tissues, the reports said.

Forensic analysis of the specimen confirmed the presence of blood vessels and capillaries, tissues from the outer cortex, and the tough tissues that surround the brain itself and help keep it in place (meninges). The tissues were said to be similar to those found in modern birds and crocodiles.

“As we can’t see the lobes of the brain itself, we can’t say for sure how big this dinosaur’s brain was,” University of Cambridge paleontologist Dr. David Norman, who co-led the research with University of Oxford Professor Martin Brasier prior to Brasier’s untimely death in 2014, said in a statement.

“Of course, it’s entirely possible that dinosaurs had bigger brains than we give them credit for,” he added, “but we can’t tell from this specimen alone. What’s truly remarkable is that conditions were just right in order to allow preservation of the brain tissue – hopefully this is the first of many such discoveries.”

Creature’s brain tissue survived because it was ‘pickled’

The rather ordinary looking, brown, pebble-shaped fossil was discovered by Hiscocks near the seaside town of Bexhill in East Sussex 12 years ago. It is believed to have belonged to a species similar to Iguanodon – a large herbivore which lived during the Early Cretaceous Period.

Researchers were stunned by the discovery because finding any fossilized soft tissue is rare, and the odds of discovering preserved brain tissue are “incredibly small,” according to co-author and Cambridge scientist Dr. Alex Liu, a doctoral student of Brasier’s. Dr. Liu added that “the discovery of this specimen is astonishing.”

So exactly how were this creature’s brain tissues preserved? The study authors believe that the brain had essentially been “pickled” in a bog- or swamp-like body of water that had a high acid content and low amounts of oxygen. This would have allowed the soft brain tissues to become mineralized before completely decaying, preserving them all this time.

“What we think happened is that this particular dinosaur died in or near a body of water, and its head ended up partially buried in the sediment at the bottom,” Dr. Norman explained. “Since the water had little oxygen and was very acidic, the soft tissues of the brain were likely preserved and cast before the rest of its body was buried in the sediment.”

With the help of colleagues from the University of Western Australia, the UK researchers used scanning electron microscope (SEM) technology to identify the meninges and strands of collagen and blood vessels. They also discovered what they believe could be neural tissues and capillaries, and noted structural similarities to the tissues of modern-day dinosaur descendants.

“I have always believed I had something special… but it wasn’t until years later that its true significance came to be realized,” said Hiscocks, who was also a co-author on the newly published paper. “In his initial email to me, Martin asked if I’d ever heard of dinosaur brain cells being preserved in the fossil record. I knew exactly what he was getting at. I was amazed to hear this coming from a world renowned expert like him.”

—–

Image credit: Cambridge University

New HIV research debunks ‘Patient Zero’ myth

A Canadian flight attendant long known as “Patient Zero” and blamed for bringing HIV into the US has been exonerated by new research published this week in the journal Nature, which found that the AIDS-causing virus had reached the States earlier than previously believed.

A team led by University of Arizona evolutionary biologist Dr. Michael Worobey genetically sequenced samples collected from early HIV-infected individuals and found that the virus likely came to America from Haiti in 1970 or 1971, according to NPR and the Los Angeles Times.

The virus went undetected by doctors for years, the report indicates, arriving in New York City from the Caribbean and circulating there for at least half a decade before an infected individual traveled to San Francisco, bringing the virus with him to the west coast. By the late 1970s, Dr. Worobey and his colleagues wrote, there were thousands of people infected with HIV.

So when a CDC investigation into the sexual activities of gay and bisexual men uncovered, in March 1984, that Canadian-born Gaëtan Dugas was at the center of a network of sexual partners responsible for helping to spread the disease, the immunodeficiency virus had already been here for more than a decade. In short, he couldn’t have possibly been “Patient Zero.”

Disease was fairly widespread years before ‘Patient Zero’ diagnosis

Dr. Worobey and his colleagues conducted a genetic analysis of samples originally collected to test for hepatitis, which found that the most common strain of HIV traveled from the Caribbean to New York sometime around 1970 and circulated there for approximately five years before one infected man carried it to California, from where it spread to other parts of the US.

“There really is no question about the geographical direction of movement,” Dr. Worobey told the Times. He and his team found evidence of this spread by analyzing serum samples collected from gay men in New York and San Francisco in the late 1970s; the samples had been tested for hepatitis, which was prevalent among gay men at the time, but not for HIV, which was not yet known to exist. They found HIV-fighting antibodies present in the collected blood samples.

Specifically, 6.6% of the New York samples and 3.7% of the San Francisco ones contained HIV antibodies, and while that revealed the existence of the virus, it did not provide the scientists with any detailed information about the pathogen. So they searched for fragments of HIV RNA in the samples and managed to compile full virus genomes in eight of them, according to reports.

So while Dugas was instrumental in bringing HIV into the public consciousness, even flying to a CDC facility in Atlanta to donate blood and providing officials with a list of other men who may have been infected, according to NPR, he was not the man who brought the disease to the US. In fact, the new research reveals that by the late 1970s, as many as 7% of gay men in New York and 4% in San Francisco had already been infected – years before Dugas’s case came to light.

“To me, there’s something nice about going back and correcting the record. He has been blamed for things that no one should be blamed for,” Dr. Worobey told NPR. “Nobody should be blamed for the spread of a virus that nobody even knew about,” he added in an interview with the Times.

—–

Image credit: Thinkstock

Scientists discover the birth of a triple-star system

Using the Atacama Large Millimeter/submillimeter Array (ALMA) and the Karl G. Jansky Very Large Array (VLA), astronomers from the US and the Netherlands have captured observations of dust surrounding a young star splitting into a multiple-star system for the first time.

While scientists had long suspected that such a process took place, due largely to gravitational instability, this marks the first time they have witnessed it occurring. Their findings have been published in the Oct. 27 edition of the peer-reviewed journal Nature.

As lead author John Tobin from the University of Oklahoma and the Leiden Observatory said in a statement, “This new work directly supports the conclusion that there are two mechanisms that produce multiple star systems – fragmentation of circumstellar disks, such as we see here, and fragmentation of the larger cloud of gas and dust from which young stars are formed.”

Stars originally form inside giant gas and dust clouds, when the materials inside those clouds start to collapse into denser cores, drawing even more matter inward and causing that matter to form a rotating disk around the forming star. Once the protostar acquires enough mass, it generates the core pressure and temperature required for thermonuclear reactions.

Researchers believe they can find other examples of this phenomenon

Previous research has revealed that there are essentially two kinds of multiple star systems. In one, the companion stars are relatively close (within 500 times the distance between the Sun and the Earth). In the other, the stars are much farther apart (more than 1,000 times that distance).

The differences in these distances, astronomers have determined, are due to different mechanisms of formation. The star systems that are farther apart form when the dust cloud fragments through turbulence, while those that are closer were thought to form through the fragmentation of a smaller disk surrounding a young protostar. Until now, however, there was no evidence to support that notion.

Now, Tobin and his fellow researchers have used ALMA and the VLA to observe a young triple-star system named L1448 IRS3B, located approximately 750 light years from Earth in a cloud of gas in the constellation Perseus. The star in the center of this young system is separated from the other two by 61 times and 183 times the Earth-Sun distance, respectively, they explained, and all three are surrounded by a disk of material that is unstable due to its spiral shape.
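A quick check against the two regimes described above shows that both companions sit comfortably in the close-companion range attributed to disk fragmentation; the snippet below simply applies the separation thresholds quoted earlier.

```python
# Classify companion separations (in Earth-Sun distances, AU) using the
# thresholds quoted above: within ~500 AU for disk fragmentation, beyond
# ~1,000 AU for fragmentation of the larger gas cloud.
def companion_regime(separation_au: float) -> str:
    if separation_au <= 500:
        return "close (consistent with disk fragmentation)"
    if separation_au >= 1000:
        return "wide (consistent with cloud fragmentation)"
    return "intermediate"

for sep in (61, 183):  # separations reported for L1448 IRS3B's companions
    print(sep, companion_regime(sep))
```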

Based on their analysis, they have determined that the system is less than 150,000 years old, and that the star furthest from the others is no more than 20,000 years old. The system provides what the study authors are calling the first direct observational evidence that fragmentation in the disk is capable of producing young multiple-star systems, and Tobin said that they hope to find other examples of this process to see “how much it contributes to the population of multiple stars.”

—–

Image credit: NRAO/AUI/NSF

Earth could lose 2/3 of all wildlife by 2020

Global wildlife populations have decreased by more than half since 1970, and could have fallen by two-thirds from 1970 levels by the end of the decade, according to a new report released earlier this week by the WWF and the Zoological Society of London (ZSL).

According to BBC News and New Scientist, the groups’ Living Planet Index assessment found that overall populations of wildlife have fallen 58% over the past 46 years, and that most of those losses have involved species living in lakes, rivers, and wetlands. The primary causes, the WWF and ZSL said, are habitat loss, wildlife trade, pollution, and man-made climate change.

The assessment tracked 14,152 populations of 3,706 species of birds, fish, reptiles, amphibians and mammals from all over the world, the media outlets reported. The results show an average decrease in population numbers of 2% per year. Furthermore, by 2020, vertebrate populations may have declined by as much as 67% from 1970 levels in what the groups describe as a “mass extinction,” unless the damage caused by human activity is reversed, they added.
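The headline numbers hang together as simple compounding: a roughly 2% annual decline from 1970 reproduces both the 58% figure and the projected loss of about two-thirds by 2020. The sketch below is only that arithmetic, not the Living Planet Index methodology.

```python
# Compound a ~2% annual decline from 1970 and compare with the reported figures.
annual_decline = 0.02

loss_by_2016 = 1 - (1 - annual_decline) ** 46   # 1970-2016
loss_by_2020 = 1 - (1 - annual_decline) ** 50   # 1970-2020

print(f"{loss_by_2016:.0%}")  # ~61%, in line with the reported 58%
print(f"{loss_by_2020:.0%}")  # ~64%, approaching the projected two-thirds
```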

“For the first time since the demise of the dinosaurs 65 million years ago, we face a global mass extinction of wildlife,” Mike Barrett, WWF-UK’s science and policy director, said Thursday in a statement. “We ignore the decline of other species at our peril – for they are the barometer that reveals our impact on the world that sustains us.”

“Humanity’s misuse of natural resources is threatening habitats, pushing irreplaceable species to the brink and threatening the stability of our climate,” he added. “We know how to stop this. It requires governments, businesses and citizens to rethink how we produce, consume, measure success and value the natural environment. In the UK, this demands a serious plan to strengthen protection for habitats and species and new measures to fast track low-carbon growth.”

Study’s findings called into question by some experts

The report warned that unsustainable fishing and agriculture are increasingly affecting species, and as New Scientist reported, mining, pollution and climate change are also factors driving mass declines of creatures such as African elephants in Tanzania (which have been victims of poaching) and Brazil's maned wolves (which are losing habitat to farming).

Humans are also being victimized by their own activities, the report said, due to the increasing losses of plants needed for breathable air, as well as sources of drinkable water and food. Some of the news is good, however, as the study found a slight increase in grassland species over the last 12 years, which is being attributed to conservation efforts for some African mammals.

Terrestrial species have seen a population decrease of 40% since 1970, the study found. Avian populations are also on the decline, but freshwater species have seen an 80% decline during the last four-plus decades. Wetland wildlife species have seen an increase in numbers since 2005, New Scientist said, while marine species have reportedly stabilized since 1988.

“Human behavior continues to drive the decline of wildlife populations globally, with particular impact on freshwater habitats,” ZSL science director Professor Ken Norris said in a press release Thursday. “Importantly, however, these are declines – they are not yet extinctions,” he said, “and this should be a wake-up call to marshal efforts to promote the recovery of these populations.”

However, BBC News pointed out that the study had drawn criticism from some experts in the field, including Duke University conservation ecologist Stuart Pimm, who said that while some of the data was “sensible,” some of the numbers cited by the report were “very, very sketchy… They’re trying to pull this stuff in a blender and spew out a single number… It’s flawed.”

“If you look at where the data comes from… it is massively skewed towards western Europe,” Pimm told the British news outlet. “When you go elsewhere, not only do the data become far fewer, but in practice they become much, much sketchier… there is almost nothing from South America, from tropical Africa, there is not much from the tropics, period. Any time you are trying to mix stuff like that, it is very very hard to know what the numbers mean.”

—–

Image credit: Thinkstock

ExoMars mission continues to thrive despite loss of lander

Despite the apparent loss of the Schiaparelli lander, the other half of the ExoMars 2016 mission, the Trace Gas Orbiter (TGO), has successfully entered the Red Planet’s orbit and will continue to function as expected, officials from the European Space Agency (ESA) have confirmed.

According to Gizmodo and The Verge, ESA officials believe that the lander likely experienced a computer glitch during its descent which made it erroneously think it was closer to the surface of Mars than it actually was, disrupting the spacecraft’s landing sequence and causing it to crash.

Last week, Schiaparelli’s remains were spotted by NASA’s Mars Reconnaissance Orbiter (MRO), and as Nature World News noted, ESA officials believe that the lander fell from a height of between two and four kilometers, impacting at a speed of over 300 km/h and likely exploding on impact. However, the investigation into exactly what caused the incident continues.

As Andrea Accomazzo, head of solar and planetary missions for the ESA, explained, the cause can likely be traced back to a yet-unidentified glitch that gave the lander incorrect data about its position in space, causing landing procedures to execute as if it were at a much lower altitude.

“If confirmed, this would actually be good news, as software issues are much easier to correct than hardware problems,” Gizmodo said. “Researchers on the ExoMars team are confident in the integrity of Schiaparelli’s hardware, and they’re now hoping to replicate the software error using a simulation” so that they can design, implement and test a potential fix for the issue.

Orbiter ‘healthy,’ preparing to begin science operations next year

While much of the media attention regarding the ExoMars mission has involved Schiaparelli and its crash, ESA officials are quick to point out that the other spacecraft involved in the project, the TGO, has entered orbit around the Red Planet and is continuing to function as planned.

The orbiter is “looking well and healthy” and remains “well within the planned initial orbit,” said Nature World News. In March, it is scheduled to undergo a maneuver to correct its trajectory and bring it to a circular orbit at an altitude of 400 km (250 miles) above the Martian surface. Shortly thereafter, it will begin a two-year mission to identify and catalog atmospheric gases.

According to Chemical and Engineering News, TGO has been outfitted with spectrometers and cameras that it will use to trace nitrogen oxide, acetylene, and methane in the air surrounding the Red Planet. It will also be able to detect hydrogen, a potential indicator of water ice, at depths of up to one meter below the ground.

“Scientists are particularly interested in Mars’s production of methane,” the website said. This is because the majority of atmospheric methane on Earth is produced by microbial life, although it can also be produced by natural geological processes. A methane spike was detected by NASA’s Curiosity rover on Mars in 2014, and ESA scientists hope that the TGO will help determine the still-unknown source of those readings.

—–

Image credit: ESA

Religious people don’t fully understand the world, new study claims

New research certain to cause controversy among the religious faithful is claiming that men and women who believe in God tend to be worse at math and have an overall worse understanding of the world’s physical and biological phenomena, CNET and The Independent report.

As part of their research, Marjaana Lindeman and Annika Svedholm-Häkkinen of the University of Helsinki surveyed 258 individuals about their beliefs – specifically, whether they thought there was a God and whether or not they believed in paranormal phenomena. They also gave each participant a series of problem-solving tests that measured their ability to think scientifically. Their study is published in the journal Applied Cognitive Psychology.

The researchers found that people who believed in an all-powerful, omniscient deity, as well as those who believed in the supernatural, were comparable to those with autism spectrum disorders in that they struggled to understand the realities of the world in which they lived. Spiritual beliefs were also associated with a reduced ability to understand things like flowers, rocks and the wind without attributing human qualities to them, according to media reports of the study.

“The more the participants believed in religious or other paranormal phenomena, the lower their intuitive physics skills, mechanical and mental rotation abilities, school grades in mathematics and physics, and knowledge about physical and biological phenomena were… and the more they regarded inanimate targets as mental phenomena,” the authors told The Independent.

Belief in God and the paranormal compared to autism, small children

The authors defined “mental phenomena” as the inability to understand the physical world and the need to use supernatural explanations for natural processes, “resulting in belief in demons, gods, and other supernatural phenomena.” The same confusion between mental and physical qualities, Lindeman and Svedholm-Häkkinen continued, “has [also] been recognized mainly among ancient people and small children.”

Like autistic people, the researchers wrote, those with religious convictions and those believing in the paranormal have difficulty distinguishing between the mental and the physical, except that people with autism struggle in the opposite way, as they view the world as entirely physical and have difficulty accepting the mental attributes of others.

Lindeman and Svedholm-Häkkinen based their views on surveys of 258 Finnish people, each of whom was asked to report the degree to which they believed that “an all-powerful, all-knowing, loving God” existed and whether or not they believed in supernatural phenomena such as psychic powers or telepathy. They then matched those responses to exam results, test scores and answers on other surveys to draw their conclusions.

The scientists found that, overall, those who believe in God and the paranormal are more likely to be female and to base their actions on instinct instead of analysis or critical thinking. Previous research has found that religious men and women tend to have a lower IQ, while also tending to be happier, more generous and more trustworthy than non-believers, The Independent said.

—–

Image credit: Thinkstock

Is going vegan another way to fight fibromyalgia?

A number of studies have been conducted to specifically answer the question of whether a vegan diet will be helpful to those with fibromyalgia. However, most of these studies have focused on raw vegan diets. Let’s make some brief distinctions before moving forward in order to avoid confusion. A vegetarian diet is plant-based, but also includes the consumption of animal products such as milk, cheese, and eggs, while excluding the consumption of any flesh whatsoever. A vegan diet is entirely plant-based and excludes all animals and animal products. A raw vegan diet is the same as vegan, only the food is typically fresh, whole, and uncooked.

So what’s the answer to the question? Does a vegan diet help fight against fibromyalgia? While the scientific community is always hesitant to state the obvious, the research shows the answer is a rather definitive yes. One study that featured a raw vegan diet showed significant improvements in shoulder pain, flexibility, and performance on a six-minute walk test. In fact, after four months, patients reported a 46% improvement. The study participants ate raw fruits, nuts, grain products, salads, carrot juice, and more. A similar raw vegan diet study found that fibromyalgia patients significantly improved in the areas of pain, joint stiffness, quality of sleep, weight, and overall health.

It is no secret that the corporate agriculture operations whose products dominate your local grocer’s shelves spray an array of chemicals over their crops, which are almost exclusively genetically modified. Even a casual search for the effects these chemicals have on the workers who apply them turns up horrifying results. Symptoms run the gamut from nausea and dizziness to cancer, genetic mutations, and death. Think about that for a moment… and then imagine the effects of these chemicals on you, who suffer from fibromyalgia and already have a heightened sensitivity to pain, medications, and chemicals. So if you can’t or won’t give up meat, then at least consider upgrading to organic and non-GMO produce and products as often as possible. You may notice a significant difference in some of your symptoms from that alone.

Dr. Amy Myers, author of The Autoimmune Solution, says that “Dairy is one of the most inflammatory foods in our modern diet, second only to gluten,” causing bloating, gas, constipation, and diarrhea, to name just a few symptoms. These factors matter to those suffering from the debilitating effects of fibromyalgia because eating this way is like feeding your problem. Specifically, growth hormones coupled with dairy’s high fat content, consumed by someone who can’t be very active due to pain and fatigue, lead to rapid weight gain. This makes the pain worse, increases sluggishness, and often compounds depression. Additionally, inflammation can be a fibromyalgia patient’s worst nightmare.

Don’t stress about this. You have enough to deal with. But think for a moment about all the foods you already eat that are vegan: fruits, vegetables, hummus, salad, salsa and pico de gallo, seeds, nuts, and many breads. It’s easier than you think. Do your research and figure out what works best for your body. Try something for a few months and keep a journal or notebook of your progress. This will help determine what is most beneficial for you.

Earliest known evidence of right-handedness discovered

Statistics show that as many as nine out of 10 people are right-handed, and while it is difficult to say exactly when this particular preference first arose in humans, a new discovery by researchers from the University of Kansas reveals that it has been around for at least 1.8 million years.

Evidence to support the claim came in the form of a recently found upper jawbone belonging to a human ancestor known as Homo habilis, which National Geographic explained lived in eastern and southern Africa between 1.4 and 2.4 million years ago, and which was believed to have been a regular user of stone tools (as suggested by their presence around the fossil discovery site).

Writing last week in the Journal of Human Evolution, KU anthropology professor David Frayer and his colleagues reported that the jawbone still contained its teeth, and that diagonal scratches (or, more formally, labial striations) on these teeth were likely made when the individual nicked them while using a stone tool held in his or her right hand.

“We think that tells us something further about lateralization of the brain,” Frayer, who was the lead author of the recently-published paper, said last Thursday in a statement. “We already know that Homo habilis had brain lateralization and was more like us than like apes,” he added. “This extends it to handedness, which is key,” because as he pointed out to Nat Geo, right-handedness, brain lateralization and language are linked – they “all fit together in a package.”

Findings could provide clues about other areas of human development

Frayer’s team identified the labial striations on the lip side of anterior teeth located in the upper jaw of a fossil known as OH-65, which was found near Tanzania’s Olduvai Gorge. Most of the deep cut marks on the upper front teeth veer from the left down to the right, suggesting they were made with a tool held in the individual’s right hand.

More specifically, microscopic analysis of the scratches indicated that they were made while the individual used a tool held in the right hand to cut food held in the mouth, pulling on it with the left hand. The subject (who was possibly female) made the marks by occasionally missing the food and striking the labial face of the teeth, Frayer said.

Based on the direction of those marks, he added, it becomes apparent that this Homo habilis individual was right-handed, making this the oldest known evidence of a pre-Neanderthal with a dominant hand. Since the right hand is controlled by the brain’s left hemisphere, which is the same region of the brain that controls language, the discovery could provide some clues regarding the evolution of language in our ancestors.

“Handedness and language are controlled by different genetic systems, but there is a weak relationship between the two because both functions originate on the left side of the brain,” Frayer explained. “One specimen does not make an incontrovertible case, but as more research is done and more discoveries are made, we predict that right-handedness, cortical reorganization and language capacity will be shown to be important components in the origin of our genus.”

“We think we have the evidence for brain lateralization, handedness and possibly language, so maybe it all fits together in one picture,” he added. Scientists believe that it is likely that brain reorganization, tool use and dominant-handedness all occurred at an early point in our evolution, and the KU researchers may be close to finding the exact point where they occurred.

—–

Image credit: University of Kansas

This plant uses a quantum trick to get its beautiful blue leaves

A species of begonia living on the floor of tropical rainforests has blue leaves because of a trick of quantum mechanics that allows the plant to extract a greater amount of energy from dim light and prevents it from dying out, according to a new University of Bristol study.

In fact, as Dr. Heather Whitney and her colleagues explained in Monday’s edition of the journal Nature Plants, the Malaysian species Begonia pavonina relies on photonic nanostructures in its leaves that allow it to effectively harvest energy in low-light conditions and give the leaves their shiny cobalt color.

These blue leaves, Popular Mechanics and Gizmodo said, enable the begonias to wring a greater amount of energy from the slight amount of red-green light that travels through the forest canopy and makes it to the floor of the rainforest. The new research shows that the coloring of the plants is essential for their survival and is not simply for appearance’s sake, the authors noted.

“It’s actually quite brilliant,” Whitney told Popular Mechanics on Monday. “Plants have to cope with every obstacle that’s thrown at them without running away. Here we see evidence of a plant that’s actually evolved to physically manipulate the little light it receives. It’s quite amazing, and was an absolutely surprising discovery.”

Findings could be used to improve solar cells, increase crop yields

The Bristol team’s research solves a longstanding mystery surrounding these blue leaves, which some experts had hypothesized might be an adaptation to scare off potential predators or a way to prevent the plant from being exposed to more light than it could handle. Indeed, the study authors found that exposing the plants to more light caused the blue color to slowly fade.

“We discovered under the microscope, individual chloroplasts in these leaves reflected blue light brightly, almost like a mirror,” Matt Jacobs, a doctoral student at the Bristol School of Biological Sciences and the first author of the study, explained in a press release. “Looking in more detail by using a technique known as electron microscopy, we found a striking difference between the ‘blue’ chloroplasts found in the begonias… and those found in other plants.”

These chloroplasts, which Jacobs said are also called iridoplasts because of their blue coloration, were found to possess a photonic crystal structure comprised of uniform layers only a few hundred nanometers thick, or approximately 1,000th the width of a human hair. The Bristol team, knowing that these structures were small enough to interfere with blue light waves, began to explore a possible link between the chloroplasts and the blue color of the begonia leaves.
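
For a rough sense of why layers at that scale would reflect blue, a first-order Bragg-stack estimate relates the layer period and refractive index to the peak reflected wavelength. The sketch below is purely illustrative: the effective refractive index and the 170 nm period are assumed round numbers, not measurements reported by the Bristol team.

```python
# Minimal Bragg-stack sketch: a periodic multilayer of period d and effective
# refractive index n_eff reflects most strongly near lambda ~ 2 * n_eff * d.
# Both input values are assumed for illustration, not taken from the study.
n_eff = 1.4          # assumed effective refractive index of the layered stack
period_nm = 170.0    # assumed layer period, on the "few hundred nanometer" scale
peak_reflection_nm = 2 * n_eff * period_nm
print(f"Estimated peak reflection: ~{peak_reflection_nm:.0f} nm "
      "(blue light spans roughly 450-490 nm)")
```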

They found that the structures appeared to be similar to artificial constructs often used to make lasers and other light-controlling photonic devices, and sure enough, performing the exact same optical measurements used on those components provided a great deal of insight into the natural structures found in the plants. They found that the crystal structures reflect blue light and absorb green light, useful since the trees that make up the forest canopy above them tend to absorb blue light and leave nothing but green light for the plant life on the rainforest floor.

In short, the begonias essentially evolved to become green light scavengers, and while that part of the mystery is solved, the researchers noted that there are still many questions left to answer. For instance, could this same design be used by scientists to modify crops and increase yields, or to improve electronic devices? And just how common is this adaptation in nature?

Whitney told Gizmodo that she believes that this trait could be “more common than currently thought. Several of the Begonia species that we know have these photonic iridoplasts do not look visibly iridescent – and we know that a wide range of other plants produce similar structures, but haven’t been investigated yet.” She also noted that the research could help improve “solar energy capture” and “could serve as inspiration for future work” in that field.

—–

Image credit: University of Bristol

Is the universe’s expansion accelerating? These scientists say no.

Researchers who won the 2011 Nobel Prize in Physics for their groundbreaking discovery that the universe has been expanding at an accelerating rate may have been wrong after all, according to new research published last week in the peer-reviewed online journal Scientific Reports.

Using an analysis of Type Ia supernovae, a type of exploding star that is as heavy as the sun but smaller in size than the Earth, the two separate teams honored by the Nobel Committee reported that weaker-than-expected light detections were evidence that the expansion of the universe was accelerating, and that this acceleration was driven by a substance known as dark energy.
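
The logic connecting dimness to acceleration runs through the standard distance-modulus relation: a supernova that appears fainter than expected must be farther away than a non-accelerating model predicts. The sketch below is a generic illustration of that relation; the 0.25-magnitude dimming is an invented example value, not a figure from either the Nobel-winning surveys or the new study.

```python
# Distance modulus: m - M = 5 * log10(d / 10 pc). If a standard candle looks
# dimmer than expected by delta_m magnitudes, the inferred distance is larger
# by a factor of 10 ** (delta_m / 5). The 0.25 mag value is illustrative only.
delta_m = 0.25
distance_factor = 10 ** (delta_m / 5)
print(f"A {delta_m:.2f} mag dimming implies a distance about "
      f"{distance_factor:.2f} times larger than expected")
```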

Those observations found 50 Type Ia supernovae that appeared to be less luminous than expected, but as part of their new study, Oxford physicist Professor Subir Sarkar and his colleagues looked at 740 such supernovae and found that the evidence supporting the hypothesis that the universe’s expansion is accelerating may not be quite as strong as the authors of the earlier work thought.

As Professor Sarkar explained in a statement, he and his co-authors “found that the evidence for accelerated expansion is, at most, what physicists call ‘3 sigma.’ This is far short of the ‘5 sigma’ standard required to claim a discovery of fundamental significance.”

“An analogous example… would be the recent suggestion for a new particle weighing 750 GeV based on data from the Large Hadron Collider at CERN,” he added. “It initially had even higher significance – 3.9 and 3.4 sigma in December last year – and stimulated over 500 theoretical papers. However, it was announced in August that new data shows that the significance has dropped to less than 1 sigma. It was just a statistical fluctuation, and there is no such particle.”
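
To put the “3 sigma” and “5 sigma” thresholds in perspective, significance quoted in sigma corresponds to a tail probability of a standard normal distribution. The snippet below is a generic statistics illustration (assuming one-sided Gaussian tails), not a re-analysis of the supernova data.

```python
from scipy.stats import norm

# One-sided Gaussian tail probability for a given significance in sigma.
for n_sigma in (3, 5):
    p = norm.sf(n_sigma)   # P(Z > n_sigma) for a standard normal variable
    print(f"{n_sigma} sigma -> p ~ {p:.1e} (roughly 1 chance in {1 / p:,.0f})")
```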

Scientists may have been ‘misled’ by Nobel Prize-winning discovery

While the professor admits that there is other evidence supporting the notion that the universe’s expansion is accelerating, including observations of the cosmic microwave background collected by the Planck satellite, he cautions that this data is “indirect, carried out in the framework of an assumed model, and the cosmic microwave background is not directly affected by dark energy.”

“So it is quite possible that we are being misled, and that the apparent manifestation of dark energy is a consequence of analyzing the data in an oversimplified theoretical model — one that was in fact constructed in the 1930s, long before there was any real data,” Professor Sarkar said. Adopting a new framework which does not assume that the universe is homogeneous and that its contents behave like ideal gases could explain all of these observations without the need for dark energy, he added.

“Naturally, a lot of work will be necessary to convince the physics community of this, but our work serves to demonstrate that a key pillar of the standard cosmological model is rather shaky,” the Oxford physicist concluded. “Hopefully this will motivate better analyses of cosmological data, as well as inspiring theorists to investigate more nuanced cosmological models.”

So is the universe’s expansion accelerating or not? The definitive answer may have to wait until 2024, when construction of the European Extremely Large Telescope (E-ELT) observatory is expected to be completed. Located in Chile’s Atacama Desert, the ultrasensitive E-ELT will use a “laser comb” and adaptive optics to directly measure the expansion rate of the universe over a 10- to 15-year period, according to the research team.

—–

Image credit: Thinkstock

Oxygen deprivation: Can it cure jet lag?

Researchers from the Weizmann Institute of Science in Israel and the UK’s University of Bristol believe they have discovered a way to overcome jet lag, or any other disruption to the body’s circadian rhythms: temporarily reduce the amount of oxygen in the air that you breathe.

In research published last Thursday in the journal Cell Metabolism, assistant professor Dr. Gad Asher and his colleagues explained that mice breathing thinner air with 25%-33% lower oxygen content had an easier time adapting to a six-hour time change than those breathing normal air.

Scientists have already found that the human body has circadian clocks regulated by food intake and temperature, NPR and the Los Angeles Times explained, and the new study revealed that the rodents consumed greater amounts of oxygen in darkness (their active phase) and less when they were exposed to light (which is when they rest, since the creatures tend to be nocturnal).

While it remains unclear at this time whether altering the oxygen content of the air we breathe could be an effective way to overcome jet lag in humans, Dr. Asher’s team believes that airlines may want to experiment with adjusting the oxygen levels in the air that passengers breathe on their planes.

Less oxygen equals faster recovery – but will it work on people?

As part of their research, Dr. Asher and his colleagues cultured mouse cells in laboratory dishes and exposed those cells to various amounts of oxygen to determine which genes were expressed, the Times explained. They found that cultures exposed to air containing varied amounts of oxygen (between 5% and 8%) became synchronized to a new circadian rhythm.

The study authors then got their test subjects used to a cycle of 12 hours of light followed by 12 hours of darkness. Once the mice were acclimated, they made a one-time adjustment to speed up the cycle by six hours, mimicking the impact of a flight from Chicago to London. In most cases, the mice breathed air with 21% oxygen content (which is also what humans breathe in real-world conditions), but for the experiment, some of the subjects breathed air that was 16% oxygen.

The 16% oxygen mice were found to adapt to the new day-night schedule significantly quicker than their counterparts, the researchers found, based on observations of their eating, sleeping, and overall activity levels. Furthermore, reducing oxygen content to 14% for a two-hour time period helped the mice recover from their so-called “jet lag” even more quickly, the newspaper said.

As Dr. Asher explained to NPR, the next step is to determine whether this ability is exclusive to mice or applicable to humans and other organisms as well. If the latter is true, then the next step would be to determine if it would be best to tinker with the oxygen levels before, during or after a flight, and whether a single treatment would be sufficient or multiple ones would be required. Provided his team can find the answers, jet lag may soon become a relic of the past.

—–

Image credit: Thinkstock

Evidence of ongoing volcanic activity found on Venus

Six years after originally finding evidence suggesting that Venus was geologically active, new research presented earlier this week at the American Astronomical Society (AAS) Division for Planetary Sciences meeting indicates that lava may be flowing from one of its volcanoes.

According to ScienceNews and Astronomy Magazine, a team of planetary researchers from the German Aerospace Center (DLR) reviewed data from the ESA’s Venus Express spacecraft and found hotspots suggesting that the volcano known as Idunn Mons could currently be active.

Using its VIRTIS (Visible and Infrared Thermal Imaging Spectrometer) instrument, the orbiter mapped the planet’s southern hemisphere in the near infrared spectrum, and by using a numerical model to improve the limits of its data resolution, the DLR team analyzed anomalies detected on top of and in the eastern part of the 200 km volcano. They uncovered signs of volcanism.

“We could identify and map distinctive lava flows from the top and eastern flank of the volcano, which might have been recently active in terms of geologic time,” team member Piero D’Incecco of the DLR said in a statement. “With our new technique, we could combine the infrared data with much higher-resolution radar images from the NASA Magellan mission,” which orbited Venus from 1990 until 1994.

D’Incecco went on to explain that his team’s work represented “the first time that – combining the datasets from two different missions – we can perform a high-resolution geologic mapping of a recently active volcanic structure from the surface of a planet other than Earth.”

Discovery will likely help direct future Venus-bound missions

Venus has long been known as a hellish landscape, with far and away the hottest planetary surface temperatures in the solar system and atmospheric pressure 92 times that found on Earth. The first evidence that it may still be geologically active was gathered by Venus Express’ VIRTIS instrument in 2010.

At the time, the probe found several anomalies with elevated levels of emissivity (a measure of an object’s ability to emit infrared energy), ScienceNews and Astronomy Magazine reported, suggesting that magma may have been flowing beneath the planet’s surface. It also found little sign of weathering on the warmer rocks, indicating that they were relatively new, geologically speaking.

Using their numerical technique along with the VIRTIS model and the Magellan mission data, D’Incecco and his colleagues conducted a computer simulation to determine how Idunn Mons might have generated the hot spots detected around it. They concluded that five lava flows, one atop the mountain and four running down the sides of the volcano, were likely the source.

Their findings will be used to help shape future missions designed to explore Earth’s so-called “sister planet,” including NASA’s planned Discovery VERITAS mission and the ESA EnVision M5 project, which will combine high-resolution radar with near-infrared mapping, DLR pointed out. Both of those missions are likely to be launched sometime within the next decade.

—–

Image credit: ESA

NASA orbiter finds Schiaparelli lander — and it doesn’t look good

The NASA Mars Reconnaissance Orbiter has discovered the landing site of the ESA ExoMars mission’s Schiaparelli lander, which lost contact with ground control personnel moments before it was scheduled to touch down on the surface of the Red Planet late Wednesday morning.

Schiaparelli entered the Martian atmosphere shortly before 11am EDT on Oct. 19, after which it was to begin a six-minute automated descent. However, ESA mission personnel never received a signal from the spacecraft confirming that it had successfully made its way to the surface.

On Thursday, ESA officials revealed that they would review data recorded by its mothership, the Trace Gas Orbiter (TGO), to determine what might have happened during Schiaparelli’s descent, and on Friday, they revealed that the lander had been located thanks to images captured using the low-resolution CTX camera equipped on NASA’s Mars Reconnaissance Orbiter (MRO).

This image comparison shows NASA’s view of the crash site.

The images were taken as the MRO was flying over the expected landing site in a plain located near the Martian equator known as Meridiani Planum, as part of a planned imaging campaign, the ESA explained in a statement. Taken at a resolution of 6 meters per pixel, they clearly show two new features that were not present in images captured back in May.

Investigation into the incident continues

One of the new features is believed to be the 12-meter parachute used during the second part of the lander’s descent, following the initial heat-shield phase of its entry. The chute was to be jettisoned just before the spacecraft’s thrusters fired to slow it to a near-standstill a few meters above the surface during the final phase of the landing.

The second new feature, the ESA said, is a dark-colored fuzzy patch located about one kilometer north of the parachute. This feature, measuring roughly 15 by 40 meters, is believed to have been caused by the impact of Schiaparelli itself, indicating that the lander fell from a much greater altitude than planned, likely because its thrusters stopped working prematurely.

“Estimates are that Schiaparelli dropped from a height of between 2 and 4 kilometers, therefore impacting at a considerable speed, greater than 300 km/h. The relatively large size of the feature would then arise from disturbed surface material. It is also possible that the lander exploded on impact, as its thruster propellant tanks were likely still full,” the ESA explained.
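
As a rough consistency check on those figures, a drag-free fall under Martian surface gravity from 2-4 km already exceeds 300 km/h. The sketch below ignores the thin Martian atmosphere and any residual velocity the lander had when its thrusters cut out, so it is only a back-of-the-envelope estimate, not part of the ESA’s own analysis.

```python
import math

# Impact speed for a drag-free fall from height h under Martian gravity:
# v = sqrt(2 * g * h). Atmospheric drag and the lander's initial velocity
# are ignored, so this is only an order-of-magnitude check.
G_MARS = 3.71  # Martian surface gravity, m/s^2
for height_m in (2_000, 4_000):
    v_ms = math.sqrt(2 * G_MARS * height_m)
    print(f"Fall from {height_m / 1000:.0f} km -> ~{v_ms * 3.6:.0f} km/h")
```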

However, they noted that the analysis was ongoing, and that updates would be provided as new information is found. The features will be re-examined next week using HiRISE, the MRO’s higher-resolution camera, and they hope that those observations will reveal the whereabouts of the lander’s heat shield, which was ejected at a higher altitude. ESA officials noted that they are “confident” that they will be able to reconstruct the events leading up to impact, and they still hope to find out exactly what went wrong with the lander in the first place.

—–

Image credit: NASA

New long-necked dinosaur species discovered in Australia

The discovery of two sets of long-necked dinosaur fossils is shedding new light on how these massive creatures originally made their way to Australia approximately 100 million years ago, according to new research published online this week in the journal Scientific Reports.

Based on the remains, which were found in Queensland, the new species were classified as both sauropods (large plant eaters with long necks and tiny heads) and titanosaurs (making them some of the biggest dinosaurs ever to roam the Earth), BBC News and the Los Angeles Times said.

One of the two creatures, Savannasaurus elliottorum, is a previously undiscovered species that was named after the Elliott family, who discovered its fossils on their property while they were herding sheep. The creature’s skeleton was assembled from 17 pallets’ worth of bones encased in rock, and according to BBC News, the process took more than a decade to complete.

The other creature, Diamantinasaurus matildae, is the first Australian sauropod for which skull fragments have been discovered, the Times reported. Lead researcher Dr. Stephen Poropat of the Australian Age of Dinosaurs Museum and his colleagues said that the specimen’s discovery has allowed them to learn more about the creature’s skeletal anatomy.

Unique species key to plotting sauropods’ migratory path

Thanks to the discovery of these new fossils, Dr. Poropat’s team was able to better determine when and how titanosaurs and other sauropods made their way to Australia. As it turns out, they only arrived approximately 100 million years ago – far later than other dinosaurs arrived there – and they most likely traveled from South America, by land, across Antarctica.

According to National Geographic, the researchers determined that these sauropods’ place on the family tree “strongly suggests” that they were descended from South American ancestors. If so, they would have traveled to their new home by land, and the only route available at the time would have been Antarctica, which, thanks to a period of global warming, would have been ice-free.

Dr. Poropat with the dinosaur’s vertebrae.

“By plotting the evolution of these sauropods against changes in the positions of the continents, we’ve possibly been able to constrain when these titanosaurs migrated,” Dr. Poropat explained to Nat Geo. However, he added that more analysis was required to fully understand these dinosaurs and that they now plan to comprehensively describe the specimens and confirm their species.

Savannasaurus has garnered the most interest, partly because it is an entirely new species and partly because it is rather unusual among sauropods. As the authors explained, it was roughly 20 feet tall and weighed between 15 and 20 tons, but unlike other sauropods, it had very wide hips that probably gave it more stability. Furthermore, the creature’s bones were extremely thin in parts of its pelvis, and it likely had a sizable belly and complex digestive system.

—–

Image credit: Reconstruction by Travis R. Tischler / © Australian Age of Dinosaurs Museum of Natural History

How much time should kids spend on screens?

In a world dominated by smartphone games like Pokemon Go and powerful game consoles such as the PlayStation 4 and Xbox One, parents seeking tips on how much screen time they should let their children have were typically met with outdated recommendations.

Now, in an attempt to keep pace with an increasingly technology-dominated world, officials with the American Academy of Pediatrics released an updated set of guidelines during a conference in San Francisco held earlier this week, according to CNN.com and Los Angeles Times reports.

The new recommendations call for children under the age of 18 months to be permitted no screen time, and all other children under the age of 5 to be allowed a maximum of one hour per day, the media outlets said. Parents of older children are given greater leeway in determining the quantity of screen time that they are allowed, and some media uses do not count against that total.

“Children today are growing up in an era of highly personalized media use experiences,” a panel of AAP-approved experts wrote in guidelines to be published in the journal Pediatrics, according to the Times. “So parents must develop personalized media use plans for their children.”

“Families should proactively think about their children’s media use and talk with children about it, because too much media use can mean that children don’t have enough time during the day to play, study, talk, or sleep,” study author, Dr. Jenny Radesky, added in a statement. “What’s most important is that parents be their child’s ‘media mentor.’ That means teaching them how to use it as a tool to create, connect and learn.”

More Skyping, less TV viewing is ideal, recommendations say

The researchers recommended no screen time for very young children because they claim that it can be distracting to infants, potentially leading to sleep problems and a parent-child disconnect, according to CNN.com. They also recommend that breastfeeding mothers not use mobile devices or tablets while doing so, since it could cause the child to feel neglected.

For children between the ages of 2 and 5, the AAP recommends “creative, unplugged playtime” while noting that these youngsters can be introduced to screen time, but no more than one hour per day to begin with. They also suggest choosing shows like Sesame Street instead of cartoons on commercial networks, while giving a big thumbs-up to using technology like Skype to hold conversations with relatives, which they say can promote healthy development at this age.

Things get a little more complicated for children over the age of six. The guidelines state that a healthy child’s typical day should consist of school, homework, social contact, sleep, and at least one hour of physical activity. Whatever time is left can be spent in front of a screen, the recommendations said, but such activities should never replace education, physical fitness or much-needed sleep, and parents should be sure to discuss potential technology-related risks like cyber-bullying and sexting with their sons and daughters.
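
In practice, the guideline amounts to simple subtraction: whatever remains of the day after the essentials is the ceiling for screen time. The toy sketch below illustrates that bookkeeping with hypothetical hour figures of our own choosing; the AAP does not prescribe these specific numbers.

```python
# Toy "media time budget": subtract the essentials from a 24-hour day and
# treat the remainder as the ceiling for recreational screen time.
# All hour values below are hypothetical examples, not AAP recommendations.
essentials = {
    "sleep": 9.5,
    "school": 6.5,
    "homework": 1.0,
    "physical activity": 1.0,
    "meals, family and social time": 3.0,
}
leftover_hours = 24 - sum(essentials.values())
print(f"Screen time ceiling for this example day: {leftover_hours:.1f} hours")
```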

Interestingly enough, the Times pointed out that fewer children are watching two hours or more of television per day than were doing so two decades ago, but it remains unclear if this is due to moms and dads enforcing recommended limits or because of the increasing availability of other types of screens, such as handheld game systems, smartphones and tablet computers.

The latter seems more likely, as the newspaper reported that smartphone use has risen from 52% of children under the age of 8 in 2011 to 75% for that demographic in 2013. This comes despite a lack of evidence that using educational apps on such devices provides any benefits to youngsters under the age of 2. The guidelines are important, the AAP said, due to the health risks associated with excessive screen time, namely obesity. For that reason, they have lowered the recommended amount of television viewing from 2 hours per day to just 90 minutes.

—–

Image credit: Thinkstock

The ESA has lost contact with its latest Mars lander

While a spacecraft designed to detect atmospheric gasses has successfully entered orbit around Mars, the status of a lander that it transported to the Red Planet remains unknown after officials with the European Space Agency (ESA) revealed that they had lost contact with it.

According to Space.com, the Trace Gas Orbiter (TGO), one of the two vehicles that are part of the joint ESA-Russian ExoMars 2016 mission, successfully slipped into orbit around Mars late Wednesday morning after completing one final, critical engine burn. ExoMars Flight Operations Director Michel Denis later confirmed that it was “at the right place” and in “good” shape.

However, the fate of the Entry, Descent & Landing Demonstrator Module (EDM), Schiaparelli, remains unknown this morning. Reports indicate that the spacecraft hit the Martian atmosphere shortly before 11am EDT on Wednesday morning, but ground control never received a signal to confirm that it had successfully made it to the surface.

Communication with the lander stopped shortly before it was supposed to complete a six-minute final descent, ESA mission operations department chief Paolo Ferri explained to Space.com. “It’s clear that these are not good signs,” he said, “but we will need more information,” adding that he was “confident” that the agency would find out what happened sometime on Thursday.

ESA officials could have up to 10 days to locate the lander

In a statement, the ESA said that Schiaparelli was programmed to perform an automated landing sequence following its release, including the deployment of a parachute and the release of a front heat shield once it reached a height of between seven and 11 km. Next, the lander was to slam on the brakes by firing its retrorockets once it reached 1100 m above the surface.

The landing sequence, which was to conclude with a free-fall from an altitude of two meters, was being monitored by the Giant Metrewave Radio Telescope in India, the agency said. The GMRT lost contact with Schiaparelli at some point during the landing sequence, however, and a subsequent attempt to locate the probe using the Mars Express orbiter proved inconclusive.

ESA officials expect to learn more on Thursday, once they have time to hunt for signals from the lander using GMRT, Mars Express, and NASA’s Mars Reconnaissance Orbiter (MRO) and Mars Atmosphere and Volatile Evolution (MAVEN) probes, Space.com said. If the lander was able to make it to the surface safely, its batteries should last for at least three days and perhaps as long as 10 days, giving officials multiple opportunities to attempt to reestablish contact.

If Schiaparelli was able to land successfully, it would be the first ESA lander – and the first craft from any agency other than NASA – to operate successfully on the Martian surface. If it did not, then it would be the latest entry on a long list of failed attempts, which includes the ESA’s own Beagle 2, which touched down on the Red Planet’s surface in 2003 but failed to fully deploy its solar panels and antenna.

The TGO mission, which will not begin science operations until late next year due to a complex series of aerobraking maneuvers needed to correct its orbit, will be creating a detailed inventory of the various gasses in the Martian atmosphere. The Schiaparelli lander, if healthy, will monitor the planet’s weather, although it was mostly intended to demonstrate the technology that will be used on a future life-hunting mission to the Red Planet.

—–

Image credit: ESA

Planet Nine may be causing the entire solar system to tilt

The existence of Planet Nine, the massive unconfirmed world located at the outer edge of the solar system, may be causing the sun to become tilted slightly, forcing its entire planetary system to “wobble” out of alignment, according to a soon-to-be-published new study.

While all of the currently confirmed planets in the solar system orbit the sun in a flat plane, within a few degrees of one another, that plane is tilted by about six degrees with respect to the sun’s equator, Caltech planetary astronomy professor Mike Brown explained Wednesday in a statement.

This phenomenon makes it appear as though the sun itself is tilted at an unusual angle, he noted, something which scientists have thus far not been able to explain. However, if there is a massive planet about 10 times the size of Earth with an orbit an average of 20 times farther from the sun than Neptune located in the outer reaches of the solar system, it could well be responsible.

Brown and his colleague Elizabeth Bailey, a graduate student at Caltech and lead author of the new study, believe that such a massive world could drastically alter the solar system’s physics by traveling at an orbit approximately 30 degrees off the orbital plane of the other planets. By doing so, it could also influence the orbit of a large number of Kuiper Belt objects as well.

Size, distance from the sun could make Planet Nine a disruptive influence

“Because Planet Nine is so massive and has an orbit tilted compared to the other planets, the solar system has no choice but to slowly twist out of alignment,” said Bailey, whose research was presented Tuesday as part of the American Astronomical Society’s Division for Planetary Sciences annual meeting in Pasadena and will appear in the Astrophysical Journal.

In fact, the proposed world’s purported influence on the Kuiper Belt is how Brown and fellow Caltech researcher Konstantin Batygin originally stumbled upon evidence of its existence, the study authors explained. The angular momentum of this planet, which grows with both its mass and its distance from the sun, could be having a disruptive influence on the normally smooth-spinning solar system.
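
To see why a comparatively small but very distant planet could matter so much, note that for a roughly circular orbit the orbital angular momentum L = m·v·r grows with distance (since v = sqrt(G*M_sun/r), L scales as m·√r). The sketch below plugs in the round numbers quoted in the article (about 10 Earth masses at roughly 20 times Neptune’s distance, or ~600 AU) and compares the result with Jupiter; the circular-orbit assumption and the Jupiter comparison are our own simplifications, since the proposed Planet Nine orbit is actually eccentric and inclined.

```python
import math

# Orbital angular momentum for an assumed circular orbit: L = m * v * r,
# with v = sqrt(G * M_sun / r). The proposed Planet Nine orbit is eccentric
# and inclined, so this is only an order-of-magnitude illustration.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
M_EARTH = 5.972e24   # Earth mass, kg
AU = 1.496e11        # astronomical unit, m

def circular_orbit_angular_momentum(mass_kg: float, radius_m: float) -> float:
    velocity = math.sqrt(G * M_SUN / radius_m)
    return mass_kg * velocity * radius_m

planet_nine = circular_orbit_angular_momentum(10 * M_EARTH, 600 * AU)  # values quoted in the article
jupiter = circular_orbit_angular_momentum(318 * M_EARTH, 5.2 * AU)     # Jupiter, for comparison
print(f"Planet Nine would carry ~{planet_nine / jupiter:.2f}x Jupiter's orbital angular momentum")
```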

“Planet Nine is the first thing that has been proposed to tilt the solar system that doesn’t depend on early conditions,” Bailey told Space.com, “so if we find Planet Nine, we will be able to see if it’s the only thing responsible for the tilt, or if anything else may have played a role.”

Since the other planets exist along a flat plane, the researchers noted, their angular momentum keeps the solar system in balance. The unusual orbit of the quite large Planet Nine, however, would add a wobble to the system that plays out over several billion years, and based on its hypothesized size and distance from the sun, a six-degree tilt would be a perfect fit. The question that remains to be answered is: how would Planet Nine have acquired such an unusual orbit in the first place?

While additional research is needed to definitively solve that mystery, Batygin hypothesizes that Jupiter may have ejected it from the region of the solar system occupied by the gas giants, or that it could have been influenced by the gravitational pull of other objects in the distant past. For the time being, the Caltech researchers continue to comb the skies in search of evidence that the new planet is traveling along its projected path – a search that could take several years.

—–

Image credit: Caltech/R. Hurt (IPAC)

NASA announces the next target for New Horizons

Having just departed the reddish-looking dwarf planet Pluto, NASA’s New Horizons spacecraft is en route to another object that is similarly colored, officials at the US space agency announced earlier this week at an American Astronomical Society conference in Pasadena, California.

In fact, 2014 MU69 – one of 11 Kuiper Belt objects that the probe is scheduled to study over the next several years – is “even redder than Pluto” but “not quite as red as Mars,” according to New Horizons team member Amanda Zangari from the Southwest Research Institute in Colorado.

Zangari presented her team’s findings at the AAS Division for Planetary Sciences and European Planetary Science Congress on Tuesday. They determined its color using observations collected by the Hubble Space Telescope, Space.com said, and while the exact size of the estimated 13-25 mile (21-40 km) wide object is not yet determined, it is now the tiniest Kuiper Belt object to have its color determined in this way, the SwRI-led team revealed.

Knowing that 2014 MU69 is reddish in color is more than a novelty, Zangari explained during the conference – it “tells us the type of Kuiper Belt object 2014 MU69 is.” This newfound data “confirms” that when New Horizons completes a flyby of the object on January 1, 2019, it “will be looking at one of the ancient building blocks of the planets.”

Finding suggests that 2014 MU69 is part of the ‘cold classical’ region

During that flyby, the spacecraft will travel to within 1,860 miles (3,000 km) of 2014 MU69, Space.com said. Thanks to the discovery of its color, scientists now know that they will probably be looking at a member of the Kuiper Belt’s “cold classical” region, which is home to primordial bodies that have undergone little significant change since the birth of the solar system.

Like the red coloring discovered on Pluto and its moon Charon, the reddish hue observed at 2014 MU69 suggests the presence of tholin, Gizmodo noted. Tholin is a type of molecule which forms when organic compounds like methane and ethane are exposed to ultraviolet radiation. Although it does not form naturally on Earth, it is abundant on the icy objects found on the outskirts of the solar system, the website added.

2014 MU69 is located approximately 1 billion miles (1.6 billion km) past Pluto, while the New Horizons spacecraft is currently 340 million miles (540 million km) beyond the dwarf planet and 3.4 billion miles (5.5 billion km) from Earth. It is about 600 million miles (nearly one billion km) from its next research target, having covered approximately one-third the distance separating the new target from Pluto, NASA officials confirmed.

The spacecraft is speeding away from the center of the solar system at a velocity of nearly nine miles (14 km) per second, the agency added. To date, it has transmitted roughly 99% of the data from its Pluto encounter back to scientists here on Earth, including possible cloud sightings in the dwarf planet’s atmosphere. If those observations are confirmed, “it would mean the weather on Pluto is even more complex than we imagined,” said Alan Stern, the mission’s principal investigator at SwRI.

—–

Image credit: NASA

This year is the hottest year ever recorded, NASA says

By the narrowest of margins, last month was the hottest September in recorded history, NASA data has revealed, meaning that 11 of the last 12 months have seen record-breaking heat and all but assuring that 2016 will go down in the books as the warmest year since at least 1880.

According to a monthly analysis of global temperatures by scientists with the US space agency’s Goddard Institute for Space Studies (GISS) in New York, September 2016 edged out September 2014 by just 0.004 degrees Celsius, placing the two months in a statistical tie. The month was 0.91 degrees Celsius warmer than the 1951-1980 mean for September.

Since October 2015, 11 of the past 12 months have shattered monthly high-temperature records, with the lone exception being June 2016, according to GISS. While previous reports had claimed that June 2016 was the warmest such month in the 136-year history of modern record keeping, it was actually just the third hottest behind 2015 and 1998, based on updated climate data.

“Monthly rankings are sensitive to updates in the record, and our latest update to mid-winter readings from the South Pole has changed the ranking for June,” GISS director Gavin Schmidt explained in a statement. “We continue to stress that while monthly rankings are newsworthy, they are not nearly as important as long-term trends.”

Disagreement in monthly data, but annual record still likely to fall

However, as reported by the Huffington Post, Schmidt also tweeted that the data for September suggests that 2016 will almost certainly be the hottest ever recorded, as it has been approximately 1.25 degrees Celsius above the late 19th century mean through the year’s first nine months.

Should that happen – and barring a much-colder-than-average fourth quarter of the year, it will – it will be the third consecutive year that the record for hottest annual temperature has been shattered, according to CNN.com reports. Average temperatures for 2015 were 0.90 degrees Celsius higher than the 20th-century average, an anomaly roughly 20% larger than the record set the previous year.

Provided 2016 sets the record, it would mean that 16 of the 17 hottest years ever recorded will have come since 2000, with 1998 being the lone exception. The last time the planet shattered the record for the coldest year was 1911, the website noted. A third consecutive record year would “confirm the longer term trends of climate change,” The Guardian said.

However, not all climate-recording agencies are in agreement. In fact, the US National Oceanic and Atmospheric Administration (NOAA) is reporting that September 2016 was not the hottest month of its kind in recorded history. They claim that it was actually 0.04 degrees Celsius cooler than the previous September, which snapped a string of 16 straight record-setting months based on their data. Nonetheless, they say that 2016 is likely to be the hottest year ever.

“If each month from October through December matches the 1998 monthly values… 2016 would become the second warmest year on record, behind 2015 by 0.03°C (0.05°F),” NOAA wrote. But if those months matched “the 21st century monthly average,” they explained, “[then] 2016 would become the warmest year on record, surpassing 2015 by 0.01°C (0.02°F).”

—–

Image credit: NASA/NOAA

The Best New Medical Treatments for Fibromyalgia

Image credit: Shutterstock

Thanks to an overactive central nervous system, extra chemicals in your spinal fluid, or whatever the theory of the day happens to be, sensitivity to medications and fibromyalgia seem to go hand-in-hand. As if that weren’t enough, the medications often exacerbate the debilitating issues you’re trying to get rid of while alleviating only one or two of them.

To Medicate or Not to Medicate

Sensitivity to medications usually means that one is in for a frustrating process of trial and error, so it’s important to weigh the pros and cons. If the pain, fatigue, and litany of other symptoms are debilitating enough that you’re ready to bow down to the pharmaceutical overlords, then absolutely go for it, and be prepared for a combination of medications to treat a variety of symptoms. Remember, doctors know that fibromyalgia patients are sensitive to medications, so they generally start you out on a low dose and closely monitor you for worsening effects. Best case: the medications work well for you and bring much-needed relief so that you can function more like you wish to. Worst case: the medications don’t work and may make you feel worse temporarily. But you won’t know until you try, and the prospect of finding a reprieve may feel like enough of an incentive to give it a shot. So let’s take a look at some of the best medical treatments for fibromyalgia that Western medicine has to offer at this moment.

Lyrica, Cymbalta, and Savella! Oh My!

Known by their brand names, Lyrica, Cymbalta, and Savella are the most popular prescription medications used to treat fibromyalgia. The upside is that they have provided relief of symptoms for many people. The downsides are that 1.) they don’t work for everyone, 2.) the side effects can worsen other symptoms like IBS, fatigue, and depression, and 3.) no one knows exactly how they work.

Yeah, that last one is just downright creepy. But take comfort in knowing that these and similar medications have been studied for decades in order to enhance and perfect them. Furthermore, we may not know every last detail of how photosynthesis works, but we eat leafy greens nonetheless.

“The last downside? What about the second one?!” you shout at me from your monitor. Yes, there’s that. These medications can indeed worsen certain symptoms that are already debilitating aspects of fibromyalgia. Lyrica, for example, can help with pain management, but its side effects include sleepiness and trouble concentrating, among other things. Cymbalta can be used to treat depression and peripheral neuropathy, but it can also lead to nausea, sleepiness, constipation, and even increased depression. Savella, while it is actually the first drug used specifically for treating fibromyalgia, acts like an antidepressant, which means side effects can include nausea, sleepiness, insomnia, constipation, and more.

So for fibromyalgia patients dealing with constant fatigue, IBS, depression, and trouble concentrating (to name a few symptoms), the prospect of trying medications that could possibly exacerbate those problems may seem like a waste of time, or may cause anxiety at the very thought. But don’t panic. Side effects of prescription medications are nothing new. Again, weigh the pros and cons and see what is best for you. Only you can make that determination. For more information about these popular medications, visit the Food and Drug Administration website.

But You Said “New” Treatments in the Title

The National Fibromyalgia and Pain Association says to keep an eye out for a drug called Effirma, which is already used to treat chronic pain in other countries but is still in the trial phase in the United States; it is apparently between Phase II and Phase III of the approval process. The other drug they highlight is manufactured by the same company as Effirma, but works very differently and is currently known as TNX-102. An under-the-tongue tablet designed for rapid delivery, TNX-102 has been used as a muscle relaxant, but has also been effective against musculoskeletal pain, pain sensitivity, and fatigue. Learn more.

What If I’m Just Too Sensitive for Prescription Medications?

It seems like nearly every ailment on the planet can be mitigated by a little bit of exercise. “But I’m too tired to exercise,” you protest. “I couldn’t even get to the mailbox today.” That’s OK. Some days are better than others, so on those “better” days, the Mayo Clinic recommends moderate, low-impact aerobic activity, such as walking, to allay pain and stiffness. This is not a recommendation to become an ultra-marathoner. But sustained walking, yoga, tai chi, and other low-impact physical activity can often do a much better job than prescription drugs over the long term.

These are not blanket statements. Each body is a different body. Every person responds differently to different treatment, even identical twins. Try things. See what works for you and listen to others in your shoes in case they stumble across something you’ve never heard of. But just deal with things one moment at a time. The next moment will have its own issues.

8 amazing yoga poses for fibromyalgia relief

If you have talked to your doctor about fibromyalgia, then in all likelihood one of the first things they recommended was that you do your best to stay active and keep a regular exercise routine. But as anyone with fibromyalgia knows, exercise sucks! Your muscles are sore, your joints ache, and you’re just too fatigued to care about anything as miserable as exercising. Well, this dilemma can be solved by a simple four-letter word: yoga.

The most prominent symptoms of fibromyalgia, muscle pain and fatigue, are only exacerbated when you don’t find a way to exercise. The old expression “use it or lose it” is true: spending all day in bed leads to muscle atrophy and weight gain, which means it will only hurt all the more whenever you do try to move around.

However, finding the energy to hit the gym or go for a jog is beyond most fibro sufferers. This is where yoga comes in. The simple stretches of a light yoga routine are designed to relieve muscle tension and soreness, the exact problems that people with fibromyalgia face. Yoga is also known to be an excellent way to calm your mind and reduce stress, one of the leading causes of the eternally dreaded fibro-flares.

If you are suffering from fibromyalgia and haven’t yet tried yoga, we recommend that you give it a shot. While we are no substitute for a full yoga class or instructional video, these 8 easy, beginner yoga poses will allow you to experiment a little and decide what’s right for you.

Mountain Pose

Image: Shutterstock/fizkes

It seems simple, right? Hardly like stretching at all? Well, there is a lot going on behind the scenes of this basic pose. All of your attention is directed toward grounding yourself with the earth. Your shoulders, spine, and breathing are all in alignment. Feel your shoulders pull down and your spine lengthen as you calmly breathe in and out. Stress decreases and you feel your muscles and organs relax.

Learn more about Mountain Pose.

Standing Forward Fold

Image: Shutterstock/fizkes

This pose is known for its incredible calming effects, and it opens up the entire back of your body. It is also easy to modify depending on your level of pain and flexibility. If this pose is too challenging at first, try placing your hands against a wall for support while performing it.

Learn more about Standing Forward Fold.

Corpse Pose

Image: Shutterstock/Summersky

Morbid name aside, this pose is a great one to end any yoga session on, but it can also be done at any time you need to calm your mind or relax your body. Stretch your arms and legs out flat and focus your thoughts inward. The greatest strength of this posture is how it teaches us to ignore our surroundings and simply exist in the moment.

Learn more about Corpse Pose.

Warrior I

Image: Shutterstock/fizkes

The Warrior I pose puts the focus on muscle strength, specifically in the large muscles of the legs, back, and arms, while also allowing you to calm your mind. This pose is an excellent way to maintain your strength and is perfect for those with fibromyalgia.

Learn more about Warrior I.

Legs-Up-the-Wall

Image: Shutterstock/fizkes

This pose is known as an inversion and sets your body in the opposite direction of your typical posture. The muscles in your legs and hips are given a chance to stretch out and relax. Your inverted posture will also reverse the blood flow in your legs, which can reduce swelling and fatigue. If this pose is challenging at first, try placing a folded blanket underneath your hips to offer some support.

Learn more about Legs-Up-the-Wall.

Cobra

Image: Shutterstock/ShutterDivision

The Cobra pose opens the chest and front of the body while strengthening and stretching your back, two of the most sensitive areas for those with fibromyalgia. This pose should be eased into until you feel completely ready for it. Start by placing your palms beside your chest and breathe with your forehead still on the floor. Gradually lift yourself up, and only go as far as your body will allow.

Learn more about Cobra.

Child’s Pose

Image: Shutterstock/f9photos

This pose is meant to turn your thoughts inward and quiet your mind. By its very nature, outside stimulus is eliminated, allowing you to focus on your breathing. If you would like to include some light stretching in this pose, work on rounding your back, stretching out your shoulders, or reaching your arms forward.

Learn more about Child’s Pose.

Bound Angle Pose

Image: Shutterstock/fizkes

This pose is great for opening the hips and strengthening the knees and groin. Like many other poses, though, it is important to ease into it gradually when you are first beginning. Start with your legs extended in front of you and then slowly open your hips and bend your knees. With practice, you will be able to bend more and open up your muscles even further.

Learn more about Bound Angle Pose.

Team claims they’ve discovered secret chambers in the Great Pyramid of Giza

Using multiple different types of scanning technology, a group of researchers may (or may not) have discovered cavities in two separate areas of the Great Pyramid of Giza – presenting at least the possibility that previously undiscovered chambers exist in the ancient structure.

According to CNET and ScienceAlert, officials with the Scan Pyramids project, a multinational effort to study the Great Pyramid (also known as the Khufu Pyramid) involving scientists from a variety of different universities and institutions, used infrared thermography, 3D simulation, and muon radiography imaging (a scan similar to an X-ray) to analyze the monument.

They discovered anomalies at two separate locations in the Pyramid: one located behind a gate in its north face and the other roughly 345 feet (105 meters) above the ground in the northeast edge of the structure. Both purported cavities are located far from the tombs and the main pathways in the structure, and there is no evidence that they are linked in any way, the reports said.

In a press release issued this past weekend, researchers said that they were able to “confirm the existence” of a “void” behind the north face of the Pyramid, and that they were still investigating to determine its exact size, shape, and position. Likewise, they said that they were able to confirm that there was “an unknown cavity” along the monument’s northeast edge.

Officials overseeing the project calling for additional analysis

Their scans, they explained, revealed a “significant excess of muons” in the direction of the north face anomaly, and further analysis determined that this “excess of muons, which could be interpreted as a void, was not statistical fluctuation or noise.” They added that the readings were in “the shape of a straight line,” which “strongly suggests… one or several voids.”
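
Muon radiography works because muons produced by cosmic-ray showers are absorbed in proportion to the amount of rock they pass through, so a hidden void shows up as a surplus of muons arriving from that direction. The toy calculation below is only a rough illustration of that principle, not the Scan Pyramids team’s actual analysis: the 50-meter attenuation length and the path lengths are made-up numbers, and real muon tomography works with the integrated rock density along each sight line rather than a simple exponential.

```python
import math

def surviving_fraction(rock_thickness_m, attenuation_length_m=50.0):
    """Toy model: fraction of muons that survive a given thickness of rock.

    Real muon attenuation depends on the integrated density (opacity) along
    the path and is not strictly exponential; the attenuation length here is
    an illustrative assumption only.
    """
    return math.exp(-rock_thickness_m / attenuation_length_m)

total_path_m = 100.0  # hypothetical line of sight through the pyramid
void_m = 5.0          # hypothetical cavity somewhere along that line

solid = surviving_fraction(total_path_m)
with_void = surviving_fraction(total_path_m - void_m)

excess = (with_void - solid) / solid
print(f"Muon excess from a {void_m:.0f} m void: {excess:.1%}")
# With these toy numbers, the direction crossing the void records roughly
# 10% more muons than neighboring directions -- a localized surplus of the
# kind the researchers describe interpreting as a possible cavity.
```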

Similar techniques were used to detect the apparent cavity in the northeast edge of the Pyramid, which is also said to be of currently undetermined size and shape. While some might be tempted to suggest that the discovery is conclusive proof that the monument contains secret chambers, some experts are skeptical of the team’s findings, according to Live Science.

The results, the website reported on Monday, “are more ambiguous,” and even the man in charge of overseeing the project, former Egyptian antiquities minister Zahi Hawass, is apparently not yet totally convinced that the team has discovered actual cavities behind the monument’s walls. In actuality, he and his colleagues on the Scan Pyramids oversight team are urging caution, releasing a second press release to emphasize that additional work needs to be done to confirm the finding.

In that statement, they recommended extending the project by an additional year, and took care to refer to the discoveries as “anomalies” instead of “cavities” or “voids,” Live Science explained. Hawass went on to explain that the results could actually be caused by different-sized stones used to build the monument, and do not necessarily indicate the existence of a hidden, room-sized chamber in either location.

—–

Image credit: Thinkstock

Researchers accidentally discover how to make ethanol using CO2

Scientists at the Oak Ridge National Laboratory (ORNL) have apparently accidentally found an efficient and inexpensive way to produce ethanol by using a copper nanoparticle catalyst to turn the greenhouse gas carbon dioxide into the renewable, alcohol-based fuel source.

According to New Atlas and Popular Science, the Tennessee-based research team found that by taking their copper nanoparticle catalysts and tiny spikes of carbon, and applying low levels (1.2 volts) of electricity, they could convert CO2 dissolved in water into usable ethanol fuel.

Applying the voltage triggered a complex chemical reaction that effectively reversed the process of combustion at room temperature and with relatively little difficulty. Furthermore, their method resulted in the production of ethanol with an initial yield of 63%, the ORNL team reported.

“We discovered somewhat by accident that this material worked,” Adam Rondinone, lead author of a new paper detailing the process published in a recent edition of the journal ChemistrySelect, explained in a statement. “We were trying to study the first step of a proposed reaction when we realized that the catalyst was doing the entire reaction on its own.”

“We’re taking carbon dioxide, a waste product of combustion, and we’re pushing that combustion reaction backwards with very high selectivity to a useful fuel,” he added. “Ethanol was a surprise – it’s extremely difficult to go straight from carbon dioxide to ethanol with a single catalyst.”
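
To make “pushing the combustion reaction backwards” concrete: burning ethanol combines it with oxygen to produce carbon dioxide and water, and the overall transformation the ORNL cell drives is that same equation run in reverse, paid for by the applied voltage. The stoichiometry below is standard chemistry rather than anything taken from the paper itself; in the actual cell the CO2 is reduced at the catalyst surface through a series of proton- and electron-transfer steps (a 12-electron process overall), not in a single leap.

$$\mathrm{C_2H_5OH + 3\,O_2 \longrightarrow 2\,CO_2 + 3\,H_2O} \quad \text{(combustion, releases energy)}$$

$$\mathrm{2\,CO_2 + 3\,H_2O \longrightarrow C_2H_5OH + 3\,O_2} \quad \text{(electrochemical reversal, consumes energy)}$$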

Technique could boost efforts to switch to renewable fuel sources

In actuality, Rondinone and his colleagues were attempting to convert carbon dioxide that had been dissolved in water into methanol, a chemical that is naturally produced by volcanoes and by various types of microbes, according to Popular Science. Much to their surprise, however, they wound up producing ethanol, which is useful as a renewable source of engine fuel.

One of the most important aspects of the process, the researchers explained, was the fact that it uses relatively common materials rather than exotic ones such as rare metals, focusing instead on arranging them into a specific nanoscale structure that limits unwanted side reactions. Since their technique uses low-cost materials and can be completed at room temperature, the authors believe that it could be scaled up for use in various industrial applications.

For instance, the method could help store excess energy produced by wind or solar power units, capturing it when too much is generated and allowing it to be burned later when there is no wind or sunlight. “A process like this,” Rondinone said, “would allow you to consume extra electricity when it’s available… [and] help to balance a grid supplied by intermittent renewable sources.”

He and his ORNL colleagues Yang Song, Rui Peng, Dale Hensley, Peter Bonnesen, Liangbo Liang, Zili Wu, Harry Meyer III, Miaofang Chi, Cheng Ma and Bobby Sumpter are now on the lookout for ways to improve the ethanol production rate and to learn more about the behaviors and properties of their copper/carbon catalyst, according to New Atlas.

—–

Image credit: Thinkstock

Problems with Juno’s engines change Jupiter mission plans

A maneuver designed to reduce the Juno spacecraft’s orbit around Jupiter from 53.4 days to 14 days has been postponed due to issues with a pair of helium check valves in the probe’s primary engine which are part of its fuel pressurization system, NASA officials have announced.

According to Engadget and Ars Technica, mission control personnel at the US space agency’s Jet Propulsion Laboratory (JPL) in Pasadena, California entered a series of commands late last week when they discovered via telemetry that the valves were not functioning correctly.

“Telemetry indicates that two helium check valves that play an important role in the firing of the spacecraft’s main engine did not operate as expected during a command sequence that was initiated yesterday,” Rick Nybakken, Juno project manager at JPL, explained in a statement. “The valves should have opened in a few seconds, but it took several minutes. We need to better understand this issue before moving forward with a burn of the main engine.”

As a result, the orbital period reduction maneuver that had been scheduled to take place on Oct. 19 has been postponed for one orbital cycle, meaning that the maneuver will not take place until at least Dec. 11 – which likely means that Juno will not be able to complete the 36 flybys of the gas giant that it was originally scheduled to perform over the next 20 months.

Scientific observations should not be affected, according to NASA

The maneuver’s postponement has also spurred NASA officials to make other changes to the spacecraft’s schedule. While mission planners originally intended to limit the number of science instruments being used during the Oct. 19 flyby of Jupiter, the plan now is to have the entire suite gathering data as it orbits the gas giant, JPL noted.

“It is important to note that the orbital period does not affect the quality of the science that takes place during one of Juno’s close flybys of Jupiter,” said Scott Bolton, Juno principal investigator at the San Antonio-based Southwest Research Institute (SwRI). “The mission is very flexible that way. The data we collected during our first flyby on August 27th was a revelation, and I fully anticipate a similar result from Juno’s October 19th flyby.”

The period reduction maneuver is scheduled to be the final firing of the spacecraft’s main Leros 1b engine, which propelled it from its Aug. 5, 2011 launch through its arrival at Jupiter on July 4. Once the maneuver is complete, Juno will be powered by small, onboard thrusters for the rest of its mission, during which scientists hope to better understand the origins of the gas giant, map its magnetic field, observe its auroras and search for a potentially solid planetary core.

In late August, the probe successfully executed the first of its planned orbital flybys, capturing and beaming back the first-ever images of Jupiter’s north pole. Those images, NASA explained in a press release issued in September, revealed that the region was home to storm systems and weather activity unlike anything ever seen around a gas giant in our solar system.

“First glimpse of Jupiter’s north pole, and it looks like nothing we have seen or imagined before,” Bolton said at the time. “It’s bluer in color up there than other parts of the planet, and there are a lot of storms. There is no sign of the latitudinal bands or zones and belts that we are used to – this image is hardly recognizable as Jupiter. We’re seeing signs that the clouds have shadows, possibly indicating that the clouds are at a higher altitude than other features.”

—–

Image credit: NASA

Has the Great Barrier Reef died? Not quite.

Literary giant Mark Twain once said, “The reports of my death have been greatly exaggerated,” and according to scientists, the same is true when it comes to the 1,400-mile-long Great Barrier Reef in Australia – despite the recent publication of its obituary by Outside Online.

In that piece, author Rowan Jacobsen mourns the loss of the 25-million-year-old community of coral reefs in the Pacific Ocean, which he referred to as “one of the most spectacular features on the planet,” to the effects of climate change and ocean acidification following “a long illness.”

“For most of its life, the reef was the world’s largest living structure, and the only one visible from space,” Jacobsen wrote in the piece. “In total area, it was larger than the United Kingdom, and it contained more biodiversity than all of Europe combined… [and] among its many other achievements, the reef was home to one of the world’s largest populations of dugong and the largest breeding ground of green turtles.”

Despite being declared a UNESCO World Heritage Site in 1981, he added, it experienced mass bleaching events regularly, and by the turn of the century, the waters surrounding it had become so acidic that they began to dissolve the reef itself. Ultimately, Jacobsen noted, it succumbed to these events – only, according to other scientists, the Reef isn’t quite dead yet.

‘Not too late’ to save the Reef, despite recent bleaching events

“It’s not too late for the Great Barrier Reef, and people who think that have a really profound misconception about what we know and don’t know about coral resilience,” Dr. Kim Cobb from the Georgia Tech School of Earth and Atmospheric Sciences told the Los Angeles Times Friday.

“We just had a massive bleaching event, but we know from past research that corals are able to recover from the brink of death,” she said. Coral, the professor explained, is an animal that lives in a symbiotic relationship with photosynthetic algae. When the water becomes too warm, that algae becomes “chemically destructive” to the coral.

This causes the coral to spit out the algae in order to protect itself, causing the transparent coral to lose all of its color in the process. “So you are not necessarily seeing dead coral, you’re really just seeing clear coral without its algae,” Dr. Cobb said. However, she warned that bleaching is still harmful to the coral, as losing its food source could cause it to “starve to death.” Preventing that requires the water temperature to cool down, and to do so relatively quickly.

Russell Brainard, the head of the Coral Reef Ecosystem Program at the NOAA’s Pacific Islands Fisheries Science Center, told the Huffington Post that he believed that the article was written so that it would drive home the urgent danger facing the Reef, but that there was the risk that some people would “take it at face value that the Great Barrier Reef is dead.”

Earlier this year, researchers from the Australian Research Council’s Centre of Excellence for Coral Reef Studies reported that a recent bleaching event had caused serious damage to 93% of the Reef, and on Thursday, scientists found that more than one-fifth of the coral died as a result of that incident. While Brainard said that the bleaching event was a “severe blow” to the Reef’s health, he added that “we’re very far from an obituary” for the Heritage site.

—–

Image credit: Public Domain

Family finds rare fossil of 100-million-year-old swordfish in Australia

Two families visiting a free fossil-finding site in northwestern Queensland, Australia while on vacation have discovered “extremely rare” fossils belonging to a swordfish-like creature which lived approximately 100 million years ago, various media outlets reported this week.

According to the North Queensland Register, the complete snout of the creature, identified as an Australopachycormus hurleyi, a nearly 10 foot (3 meter) long ray-finned fish, was discovered at the hunting ground by the Johnston family. Then, one week later, another family discovered the complete skull, teeth, vertebrae, and front fins of the creature at the same location.

“At first we thought [the snout] was a tooth from some giant reptile, since it was so large and cone shaped,” Mirjam Johnston told the newspaper. “It wasn’t until… we showed the bone to a fossil enthusiast at our camp site that we realized it was the tip of a very pointy fish nose.”

“I wasn’t expecting to find something so complete. I remember pulling up the layers of rock and realizing there was bone poking out everywhere,” added Tony Amos, who discovered the rest of the creature’s remains along with his wife, Gail. The Amos family got in contact with officials at Kronosaurus Korner, a local fossil museum that helped identify the specimen.

Discovery highlights the importance of amateur fossil-hunters

Dr. Patrick Smith of Kronosaurus Korner told BBC News that Australopachycormus hurleyi was “a high-tier carnivore” that “ate other large fast-moving fish, a bit like marlin do today… Because it does fit that swordfish-like shape we know he probably lived in that same ecological niche.”

He told the Register that the creature likely used its pointed snout to slash or stun the fish that it hunted, and that despite its similarity in appearance to swordfish, it was actually a member of an extinct genus of ray-finned, Jurassic-period fish known as a pachycormid. The fossils, which are now on display at the museum, are “special” because they are “so complete,” he added.

“Fossils of Australopachycormus are exceptionally rare, which is demonstrated by the fact that the species was only discovered less than a decade ago,” Dr. Smith told ABC News. “Previous to this find we had no near-complete remains of the animal in our museum,” he noted, emphasizing that without the help of the two families which found the fossils, “specimen such as this… could easily [have] been lost or destroyed.”

In light of that fact, the Kronosaurus Korner curator encouraged other citizen paleontologists to visit the fossil-hunting site where the remains were unearthed, which is located about 1,000 miles (1,700 kilometers) from Brisbane, north of Richmond, Queensland. The region, which has been called Australia’s Dinosaur Trail, has been home to fossil discoveries for more than 80 years, he told BBC News.

—–

Image credit: Kronosaurus Korner

Scientists make plans for ‘Asgardia’ – the first nation in space

With a contentious Presidential election currently ongoing in the US and the looming threat of terrorism in all parts of the world, sometimes it might seem as though being a citizen of Earth is highly overrated, but short of discovering a new habitable planet, what are the options?

Well, according to BBC News, you might soon be able to become a citizen of Asgardia, which seeks to become an officially-recognized new pacifist nation in orbit. The unusual project is the brainchild of a Russian scientist and businessman named Dr. Igor Ashurbeyli, who is overseeing the project as head of the Vienna-based Aerospace International Research Center.

The proposed space nation, which was named in honor of the Norse realm home to the Aesir tribe of gods, intends to launch its first satellite in 2017 and is hoping to receive official recognition as a sovereign country by the UN, the British media outlet noted. Nearly 100,000 men and women have reportedly already signed up to be citizens of the new nation.

In a speech earlier this week, Dr. Ashurbeyli called his proposed endeavor “a global, unifying and humanitarian project,” adding that it would be “a fully-fledged and independent nation, and a future member of the United Nations – with all the attributes this status entails: a government and embassies, a flag, a national anthem and insignia, and so on.”

Furthermore, he explained that part of the mission would be to create a country in space so that Earth-based conflicts and wars would not affect it; that it would look to safeguard “the peaceful use of space” and protect the planet below from threats such as sun storms, flares, asteroids and comets; and that it would be “a demilitarized and free scientific base of knowledge in space.”

How feasible is the Asgardia project, and is it even legal?

The catch, Dr. Ashurbeyli explained in an interview with The Guardian, is that the residents of Asgardia would actually still live on Earth. Physically, they would reside in the same countries which they always called home, while being “citizens” of Asgardia at the same time. Once they receive more than 100,000 applicants, they will apply to the UN for state recognition.

The scientist said that he is well aware that the whole thing sounds far-fetched, and according to BBC News, he even joked that he would not be surprised if the media labeled him a “crazy” man who is talking “utter nonsense.” Nonetheless, he remains committed to the project – his company is currently even holding competitions to design a flag and find an official national anthem.

One problem, experts have pointed out, is that international law prohibits claims of sovereignty in space, meaning that it is unlikely that Asgardia would ever be officially recognized by the UN. As London Institute of Space Policy and Law director Professor Sa’id Mosteshar explained to BBC News, the globally-ratified Outer Space Treaty clearly says that “no part of outer space can be appropriated by any state.”

While Christopher Newman, a space law expert from the University of Sunderland, said that the plan was “an exciting development in many ways,” he told The Guardian that there were “formidable obstacles in international space law for them to overcome. What they are actually advocating is a complete re-visitation of the current space law framework.”

—–

Image credit: Asgardia

There are ten times more galaxies in the universe than we thought

Scientists have long believed that there were approximately 200 billion galaxies in the known universe, but as it turns out, they were wrong – new surveys compiled using the Hubble Space Telescope have revealed that there are at least 10 times more than previously thought!

According to CNN and the Los Angeles Times, researchers from the University of Nottingham led by astrophysicist Christopher Conselice compared older Hubble photos with new ones, and used mathematical models to infer how many galaxies there are that we are unable to see using our existing observational equipment.

While images captured with the space telescope 20 years ago suggested that there must be at least 100 billion galaxies in the universe, and around 200 billion by some estimates, the research team has determined that there are approximately two trillion galaxies out there, 90% of which are too faint or far away to be seen with modern-day telescopes.

“We now know that there are at least 10 times more galaxies in the universe than we had thought for the last 20 years, and before that we didn’t really have any idea,” Conselice, lead author of a new study accepted for publication in The Astrophysical Journal, explained to the Times. “So the more we learn about the universe… the more interesting it becomes.”

90% of all galaxies have yet to be studied, according to the authors

The researchers used Hubble’s Wide Field Camera 3, which was able to observe the universe in near-infrared wavelengths to create a 3D map of space dating back nearly 13 billion years. They found that the early universe was filled with a tremendous amount of smaller galaxies which had lower masses than the larger galaxies of today.

Over time, those galaxies merged into larger galaxies and their overall population density started to dwindle, the study authors explained. Their findings suggest that, while the overall mass of the universe remains unchanged, galaxies themselves have not been evenly distributed throughout the history of the cosmos.

“These results are powerful evidence that a significant galaxy evolution has taken place throughout the universe’s history, which dramatically reduced the number of galaxies through mergers between them – thus reducing their total number,” Conselice said in a statement. “This gives us a verification of the so-called top-down formation of structure in the universe.”

“It boggles the mind that over 90 percent of the galaxies in the universe have yet to be studied,” he added. “Who knows what interesting properties we will find when we discover these galaxies with future generations of telescopes? In the near future, the James Webb Space Telescope will be able to study these ultra-faint galaxies.” In fact, Conselice told the Times, the Webb telescope will “more than double the number of galaxies that we can see today.”

Furthermore, their research helps explain a phenomenon known as Olbers’ paradox, which asks why the sky is dark at night if the universe is home to an infinite number of stars. The authors of the new study have determined that, while every part of the sky does contain part of a galaxy, the light from those galaxies is invisible to the human eye due to its absorption by intergalactic dust and gas, and other factors that make the sky appear dark.

—–

Image credit: NASA

Moon impacts are more common than we thought, study finds

The moon might be one of the most well-studied objects in space, but that doesn’t mean that we know all of its secrets just yet, as demonstrated by a newly-published research paper that reveals the lunar surface contains far more craters than researchers had previously predicted.

Comparing before-and-after images of the same basic region of the moon captured at different times by the Lunar Reconnaissance Orbiter (LRO), senior researcher Emerson Speyerer and his Arizona State University colleagues discovered 33% more craters than had been detected earlier, according to New Scientist and Engadget reports published on Wednesday.

An impact formed between October 2012 and April 2013. Credit: NASA/GSFC/Arizona State University

What that means, Speyerer’s team explained, is that the small meteors that constantly bombard the moon are continually forming new craters and impact basins, which could pose a serious threat to any human settlements or refueling stations constructed there in the foreseeable future.

“If you are an astronaut sitting on the surface, you don’t necessarily have to worry about being directly hit by a meteorite,” he told New Scientist. “But you would have to worry about all these secondaries, that are coming from kilometers and kilometers away.” The LRO researchers have published their findings in this week’s edition of the peer-reviewed journal Nature.

Observing the moon’s regolith evolve in real-time

Speyerer and his colleagues compared 14,000 images collected by the orbiter at different times and counted 222 new impact craters at least 10 meters in width. They also found 47,000 new splatter-like changes or “splotches” in reflectance on the lunar surface – changes that are the result of dust and rock aftershocks that follow the initial crater-forming impact.

“That’s unique to the moon. That’s forming from the primary impact throwing out this ejecta, and that’s what makes these images really unique,” the study author told New Scientist. He added that the largest new crater was 43 meters in diameter and the smallest were 10 meters in diameter (which is the smallest that can be detected using the LRO’s instruments).

Based on these observations, the researchers now believe that the top inch of the moon’s surface layer changes once every 80,000 years, not once every million years as previously believed, said Engadget. While such activity would not pose a threat to lunar settlers, it could change how they date samples collected by satellites and could impact any future plans to mine the moon.

The findings also suggest that the moon is hit by meteorites more frequently than researchers had previously thought, said Kathleen Mandt from the Southwest Research Institute. “I like it when theories are proven wrong, or exciting new things come up,” she told New Scientist, pointing out that the LRO mission “is starting to show there’s a lot we don’t know about the moon.”

Speyerer told Space.com that he was “excited” that he was able to see the lunar surface “evolve and churn – a process that was believed to take hundreds of thousands to millions of years to occur – in images acquired over the past several years… As the mission continues, the odds increase of finding larger impacts that occur more infrequently on the moon. Such discoveries will enable us to further refine the impact rate and investigate the most important process that shapes planetary bodies across the solar system.”

—–

Image credit: Thinkstock

Researchers discover oldest-known dinosaur ‘squawk box’

For the first time, researchers have discovered the bird-like vocal organ of a fossil dinosaur related to modern-day ducks and geese, and the discovery suggests that its nonavian cousins lacked the ability to produce noises similar to the bird calls we hear today.

Formally known as a syrinx, the voice box in question was discovered in an Antarctic fossil belonging to a Vegavis iaai specimen that lived in the Cretaceous period roughly 66 million years ago. This bird-like creature was discovered on Antarctica’s Vega Island in 1992 and identified more than a decade later, but its vocal organ was not discovered until 2013.

“This finding helps explain why no such organ has been preserved in a non-bird dinosaur or crocodile relative,” Julia Clarke, a paleontologist at the University of Texas at Austin Jackson School of Geosciences and the person who made the discovery, said in a statement.

She added that the find was “another important step to figuring out what dinosaurs sounded like as well as giving us insight into the evolution of birds.” Clarke and her colleagues reported their breakthrough in this week’s edition of the international, peer-reviewed journal Nature.

Analysis of the organ could provide new insight into vocalizations

The discovery of the syrinx, coupled with its apparent absence in nonavian dinosaurs which lived during the same era, suggests that the organ may have developed late in the evolution of birds, and that other dinosaurs may have been unable to produce bird-like calls, the researchers noted.

Made from stiff cartilage rings, the syrinx supports the soft tissues that vibrate to produce each of the sounds used to create complex modern bird calls and songs. While cartilage typically does not fossilize as well as bone or other hard tissues, its high mineral content sometimes allows it to be preserved, which was the case in this particular set of Vegavis iaai remains.

Credit: J. Clarke/UT Austin.

Clarke, who was the first scientist to describe the species 11 years ago, first discovered this so-called “squawk box” in 2013 and spent much of the next two years combing the dinosaur record for evidence of other, similar discoveries. To date, none have been found. The researchers also scanned the syrinxes of a dozen living birds, as well as the next-oldest fossilized syrinx, to compare with the Vegavis iaai organ, hoping to learn more about what early bird calls sounded like.

“Here, we begin to outline how fossilizable characteristics of the syrinx may inform us about sound features, but we need a lot more data on living birds,” study co-author Franz Goller, a physiologist from the University of Utah, explained. “Remarkably, prior to this work, there is almost no discussion of these important questions.”

Combined with previous research led by Clarke, which found that some dinosaurs likely made closed-mouth vocalizations that did not require a syrinx, the research provides new insights into the evolution of sound production throughout the lifespan of the dinosaurs. The development of vocal organs could also provide new insight into the development of other features, including larger brains, she added.

—–

Image credit: Nicole Fuller/Sayo Art for UT Austin.

President Obama reinforces commitment to manned Mars mission

In an op-ed published by CNN.com on Tuesday, US President Barack Obama reconfirmed his commitment to send Americans to Mars within the next few decades while also emphasizing a desire to see humanity travel beyond the Red Planet sometime in the foreseeable future.

“One of my earliest memories is sitting on my grandfather’s shoulders, waving a flag as our astronauts returned to Hawaii,” the President wrote. “I still have the same sense of wonder about our space program that I did as a child. It represents an essential part of our character – curiosity and exploration, innovation, and ingenuity, pushing the boundaries of what’s possible and doing it before anybody else.”

“We have set a clear goal vital to the next chapter of America’s story in space: sending humans to Mars by the 2030s and returning them safely to Earth, with the ultimate ambition to one day remain there for an extended time,” he added. “The next step is to reach beyond the bounds of Earth’s orbit. I’m excited to announce that we are working with our commercial partners to build new habitats that can sustain and transport astronauts on long-duration missions in deep space.”

Making it to Mars, President Obama explained, would “require continued cooperation between government and private innovators.” Within the next two years, he wrote, private-sector firms would be sending astronauts to the International Space Station (ISS), and NASA would also be partnering with non-governmental companies to create new habitats for long-duration missions (a possible allusion to the Next Space Technologies for Exploration Partnerships or NextSTEP program, according to Space.com).

His column came just days before he is expected to meet with some of the country’s leading scientists and engineers at the White House Frontiers Conference, which begins on Thursday in Pittsburgh. The goal of the conference, the President noted, will be to discover ways to improve scientific research and technological innovation in all corners of the US.

NASA chief details how the agency plans to accomplish the goal

In response to the President’s op-ed, NASA administrator Charles Bolden and chief White House science and technology advisor John Holdren announced a pair of new initiatives which they said would help the US space agency “build on the President’s vision” and take advantage of “public-private partnerships to enable humans to live and work in space in a sustainable way.”

The first involved the advancement of the NextSTEP program, an initiative which invited private companies to submit advanced propulsion, satellite and habitat concepts to be considered for use in future NASA projects, to Phase 2. The work was “promising,” Bolden and Holdren wrote, and in August, six companies were selected to create prototypes for deep space habitat modules.

The second initiative involved NASA’s invitation to private sector companies to come up with ideas for how to use an available docking port on the ISS. One potential use of said port, Bolden and Holdren wrote, would be to prepare future commercial stations to take over for the ISS when the station’s mission ends sometime in the 2020s.

Those companies “responded enthusiastically, and those responses indicated a strong desire by US companies to attach a commercial module to the ISS that could meet the needs of NASA as well as those of private entrepreneurs,” they said. “As a result… this fall, NASA will start the process of providing companies with a potential opportunity to add their own modules and other capabilities to the International Space Station.”

“Make no mistake, the Journey to Mars will be challenging, but it is underway and with each one of these steps, we are pushing the boundaries of exploration and imagination,” wrote Bolden and Holdren. Such programs ideally will bring NASA closer to fulfilling the President’s desire to one day be the man hoisting his grandchild on his shoulders, looking up towards the sky, “but instead of eagerly awaiting the return of our intrepid explorers, we’ll know that because of the choices we make now, they’ve gone to space not just to visit, but to stay.”

—–

Image credit: NASA/Hubble

Can ADHD Medication Relieve Fibromyalgia Symptoms?

Image: pathdoc/Shutterstock

Some people might say that having ADHD or any attention deficit disorder leads to a stressful life, one in which there is no “off” switch for your brain. Not everyone understands the complexities of the condition or even believes it really exists, not much different from the way people with fibromyalgia are sometimes treated. ADHD works in the opposite direction from fibromyalgia, sending the brain into overload until it “short circuits,” yet it causes forgetfulness much like “fibro fog.”

Patients have discovered, either on their own or through a doctor’s prescription, that the same medications that treat ADHD can help alleviate symptoms of fatigue, memory fog, and mood swings, providing people with fibromyalgia an unlikely source of relief. But self-medicating is a dangerous game, and the use of a medication like Adderall or the less-effective Ritalin may not be the best option for you. Here are a few things to think about when considering asking for ADHD medication to treat your fibro:

What Adderall is and what it does

Adderall is a central nervous system (CNS) stimulant that is a combination of amphetamine and dextroamphetamine, which affect chemicals in the brain and nerves that contribute to hyperactivity and impulse control. The drug is also used to treat narcolepsy, a chronic sleep disorder characterized by overwhelming daytime drowsiness and sudden episodes of sleepiness, which can put a damper on your lifestyle and affect your ability to function during the day. Drowsiness and fatigue are among the most burdensome symptoms that fibromyalgia and ADHD share. While Adderall may feel like a godsend to some people with fibromyalgia, the drug does not work for everyone, and it takes a doctor’s careful attention to prescribe the correct dosage and combination of medications to avoid the “crash” that often follows its use.

Finding a doctor to prescribe Adderall

Not all doctors agree on which medications do and don’t provide the best results, and part of the problem is that our bodies react differently to substances which alter our biochemistry. A dosage of 30 mg of Adderall might be the perfect amount for one person and dangerous, even fatal, for a person with heart trouble or high blood pressure. It is because of these severe possible side effects and the drug’s high rate of abuse that it is available by prescription only. But not all doctors think of mentioning Adderall to a patient with fibromyalgia, and some have even laughed it off as a ridiculous option. Finding a doctor you trust and who respects patient rights is the first step to any successful medical encounter, and it may be the key to getting an honest opinion of whether or not Adderall is right for you.

Knowing when you should stop

Too much of a good thing is never good for anyone, and with an addiction-prone drug like Adderall or its cousin Ritalin, too much can be life-threatening. If someone other than your doctor offers you a pill such as Adderall, refuse it, especially if you have fibromyalgia. One overlooked symptom of fibro that could complicate things is poor circulation, something that can cause problems when taking ADHD medication. While Adderall is likely to help relieve some of your fatigue and sleeplessness, it is not a cure-all and shouldn’t be treated as such. You will probably have a list of medications to take along with the Adderall to cover the bases of fibromyalgia symptoms, but a careful prescription and moderate usage to start might put you on the right path to managing your fibro.

Researchers discover new dwarf planet in our solar system

Researchers from the University of Michigan have discovered a new dwarf planet in our solar system – a world that is 330 miles across, located approximately 8.5 billion miles from the sun, and takes 1,100 years to complete a single orbit, various media outlets are reporting.

The new world is currently known as 2014 UZ224, and according to Space.com and Astronomy Magazine, it is smaller than Pluto’s largest moon, Charon, and is now the third farthest object in the solar system. It is located in an area of the Kuiper Belt beyond the gravitational influence of Neptune and was located using an instrument called the Dark Energy Camera (DECam).
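
As a quick plausibility check on those figures (our arithmetic, not the Michigan team’s), Kepler’s third law ties an orbital period to an orbit’s average distance: P² ≈ a³ with P in years and a in astronomical units. A 1,100-year period implies a semi-major axis of roughly 107 AU, or about 10 billion miles – consistent with the quoted 8.5 billion miles being the object’s current distance on an elongated orbit rather than its average distance. A minimal sketch:

```python
AU_IN_MILES = 92.96e6  # one astronomical unit, in miles

period_years = 1_100.0           # orbital period quoted for 2014 UZ224
current_distance_miles = 8.5e9   # current distance from the sun quoted in reports

# Kepler's third law for a sun-dominated orbit: P^2 = a^3 (P in years, a in AU)
semi_major_axis_au = period_years ** (2.0 / 3.0)
semi_major_axis_miles = semi_major_axis_au * AU_IN_MILES

print(f"Implied semi-major axis: {semi_major_axis_au:.0f} AU "
      f"(~{semi_major_axis_miles / 1e9:.1f} billion miles)")
print(f"Quoted current distance: {current_distance_miles / AU_IN_MILES:.0f} AU "
      f"({current_distance_miles / 1e9:.1f} billion miles)")
```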

The discovery, which was made by a team of students led by physics and astronomy professor David Gerdes, is somewhat unusual for the way that the dwarf planet was found. Typically, new worlds are discovered when scientists make observations across consecutive nights. However, in 2014 UZ224’s case, detections of the object came far more sporadically.

“Objects in the solar system, when you observe them at one instant and then a little while later, they appear to be in a different place in the sky,” Gerdes told NPR. In this case, however, he and his colleagues “often just have a single observation of the thing, on one night. And then two weeks later one observation, and then five nights later another observation, and four months later another observation. So the connecting-the-dots problem is much more challenging.”

Is it actually a dwarf planet, and did they also find Planet Nine?

Despite the difficulty, however, the UM researchers were able to use computer software capable of discerning the basic orbit of the newfound object (although its exact path of movement around the sun remains somewhat unclear). It is also uncertain if the object is actually a dwarf planet.

As Space.com explains, the smallest confirmed dwarf planet discovered to date is Ceres, which is 590 miles across, or 260 miles wider than 2014 UZ224. The world discovered by Gerdes’ team may be too small to actually earn the title of dwarf planet, according to reports. The decision will ultimately be up to the International Astronomical Union, but if confirmed, it would become the fifth officially recognized dwarf planet in our solar system.

In addition to the discovery of 2014 UZ224, Gerdes told NPR that his team may have also gotten an image of the so-called Planet Nine, an as-yet-unconfirmed planet astronomers believe exists in the outer edges of the solar system and which is believed to be 10 times more massive than Earth. To date, there have been no actual sightings of this hypothesized new world.

“I’m excited about our chances of finding [Planet Nine]. I’m excited about the chances of the people in this room finding it,” he told the media outlet during an interview conducted last month. “Of course I’m happy for humanity if someone else finds it. It would be the most exciting astronomical discovery in our lifetime, I think.” While no astronomers have been successful thus far, Gerdes told NPR that “the hunt” for the undiscovered world “is on.”

—–

Image credit: NASA/JPL-Caltech/T. Pyle (SS

ExoMars mission prepares to reach Mars early next week

A mission designed to analyze gases in the Martian atmosphere and send a lander to the surface of the Red Planet is scheduled to arrive at its destination this week, officials with the European Space Agency (ESA) confirmed in a recently-released media advisory.

The ExoMars mission, a collaboration between the ESA and the Russian-based Roscosmos State Corporation for Space Activities, is composed of two spacecraft: the Trace Gas Orbiter (TGO), a probe which will make a detailed inventory of the different gases in the planet’s atmosphere, and the Schiaparelli lander, which will complete its descent three days after arrival.

The two ExoMars spacecraft launched together on March 14, and according to Space.com, they will begin their separation upon arriving at Mars on October 19. Upon their arrival, the Schiaparelli lander will detach from the orbiter (which is carrying it to the Red Planet), and three days later, it will descend and land in the region informally known as Meridiani Planum.

Schiaparelli’s descent, which will bring the lander close to Mars’ equator, will reportedly occur at a speed of approximately 13,000 mph (21,000 km/h), and the spacecraft will have around six minutes to slow to a safe landing velocity. It will accomplish this by using sensors to monitor its altitude, starting when it reaches a height of four miles (seven km) above the ground.

Once the lander reaches a height of roughly 6 1/2 feet (two meters), it will briefly hover before cutting its thrusters and falling to the ground. The process, which is detailed in a video released by the ESA, will be followed by a planned test of Schiaparelli’s science instruments, which agency officials noted is scheduled to last for at least two days.
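
Taken together, the quoted entry figures give a feel for how hard the lander has to brake: shedding roughly 21,000 km/h in about six minutes works out to an average deceleration on the order of 16 m/s², or around 1.7 times Earth gravity. That is only a rough average (our arithmetic, not an ESA figure), since most of the braking happens during the heatshield and parachute phases, with the thrusters handling just the final kilometers.

```python
entry_speed_kmh = 21_000.0   # approximate atmospheric entry speed quoted above
descent_time_s = 6 * 60      # roughly six minutes from entry to touchdown

entry_speed_ms = entry_speed_kmh * 1000.0 / 3600.0  # convert km/h to m/s
avg_deceleration = entry_speed_ms / descent_time_s  # treat touchdown speed as ~0

print(f"Entry speed: {entry_speed_ms:.0f} m/s")
print(f"Average deceleration: {avg_deceleration:.1f} m/s^2 "
      f"(~{avg_deceleration / 9.81:.1f} g)")
```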

Lander to study surface conditions; orbiter to seek source of rare gases

A large part of the Schiaparelli lander’s mission is to test out several new technologies in order to prepare for future missions to the Red Planet, including the upcoming ExoMars 2020 project. Among those technologies are a heatshield, parachute and propulsion system, the ESA said.

Furthermore, the lander is equipped with a small suite of instruments that will record the wind speed, humidity, pressure and temperature near its landing site. It will also be collecting the first-ever measurements of electric fields on the surface of Mars – something that researchers believe could help them learn more about the origins of the dust storms that take place there.

What will we learn about the Red Planet? (Credit: ESA/MediaLab)

Last week, ESA controllers uploaded a series of time-tagged commands to make sure that the lander would be able to carry out its mission, even if they lost contact with it. Those commands, which were uploaded in two separate batches, will also ensure that Schiaparelli is able to wake up from its power-saving sleep mode in time to communicate with researchers on Earth.

As for the TGO probe, it was designed to collect information about the gases in the atmosphere of the Red Planet, with a particular interest in rare gases such as methane, in order to determine if there is an active source somewhere on or beneath the surface. The goal is to determine if the gas is originating from a geological source or a biological one, according to the agency.

The orbiter will undergo a series of complex aerobraking maneuvers to correct its orbit upon its arrival, the ESA noted. That process will take nearly a full year, meaning that TGO will not start its science observations until late 2017. The probe will also be serving as a relay for the ExoMars 2020 mission, which will feature a rover and surface science platform, they added.

—–

Image credit: ESA/Videolab

Climate change doubled forest fire destruction in western United States

While scientists have long hypothesized that climate change could cause wildfires in the western US to worsen, they had no way of quantifying the increase in damage – until now, thanks to a new study published Monday in the Proceedings of the National Academy of Sciences.

Now, researchers have determined human-driven climate change has doubled the amount of land burned by wildfires over the last three decades – an increase of 16,000 square miles, or about the same size as Massachusetts and Connecticut combined, according to the Los Angeles Times.

“No matter how hard we try, the fires are going to keep getting bigger, and the reason is really clear,” study coauthor Park Williams, a bioclimatologist at Columbia University in New York, explained in a statement. “Climate is really running the show in terms of what burns. We should be getting ready for bigger fire years than those familiar to previous generations.”

“A lot of people are throwing around the words climate change and fire – specifically, last year fire chiefs and the governor of California started calling this the ‘new normal,’” lead author John Abatzoglou, a professor of geography from the University of Idaho, added. “We wanted to put some numbers on it.”

Trend expected to worsen over the next ‘three to four decades’

Abatzoglou, Williams and their colleagues used wildfire data, large-scale climate models and eight established techniques for measuring forest aridity, the Times explained. They determined that between 1979 and 2015, climate change increased the dryness of this forest land by 55%, and that half as much land would have burned if these changes had never taken place.
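
The article’s headline numbers fit together with simple arithmetic: if human-driven drying roughly doubled the burned area and the climate-attributable increase is about 16,000 square miles, then the no-climate-change baseline over the same period would also be about 16,000 square miles, for an actual total of roughly 32,000. The sketch below only restates the figures quoted above; it is not an independent estimate.

```python
# Figures quoted in the article (cumulative since the mid-1980s)
attributable_increase_sq_mi = 16_000  # extra burned area tied to climate change
doubling_factor = 2                   # "doubled the amount of land burned"

# If actual = doubling_factor * baseline, the increase equals the baseline
baseline_burn = attributable_increase_sq_mi / (doubling_factor - 1)
actual_burn = baseline_burn * doubling_factor

print(f"Burned area without climate change: ~{baseline_burn:,.0f} sq mi")
print(f"Burned area with climate change:    ~{actual_burn:,.0f} sq mi")
```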

Furthermore, the researchers discovered that the role of climate in increasing the aridity of wildfire fuel has grown since 2000 and would continue to do so. In short, as the Washington Post reported, they found a strong link between increased dryness and how much land was impacted by forest fires, with more than 75% of the changes in burned area over the past 30 years attributable to dryness.

Forest fires in California have been a huge problem for the state as droughts increase in frequency and intensity (Credit: Thinkstock)

They then turned their attention to what percentage of those changes were the direct result of human-caused climate change. They found that warming caused by our activities accounted for slightly more than half of the observed increase in fuel dryness, and that the other half could be attributed to natural climate variations, according to the Post. Nonetheless, their findings indicate that humans were responsible for 16,000-plus square miles of additional forestland being burned since 1984.

This summer, approximately three million acres of forest have burned throughout the US, with most of that taking place in the western states. While that’s a significant amount, the authors are quick to note that it is not a record – although some scientists believe that the worst may be yet to come, as in some locations, the most dangerous conditions take place during the last four months of the year, when desert winds interact with increasingly dry environments.

Some experts believe that eventually forests will become too few and far between for wildfires to spread easily. However, Williams believes that “there’s no hint we’re even getting close to that yet. I’d expect increases to proceed exponentially for at least the next few decades.” In fact, he told the Post that there was “a very high likelihood” that wildfires “in the next three to four decades” would be “dwarfing the fires that we see today.”

—–

Image credit: Thinkstock

New lung cancer treatment described as ‘brilliant’

Researchers have developed an antibody that pushes the immune system to fight tumor cells in a manner more effective than customary chemotherapy for individuals with advanced non-small-cell lung cancer (NSCLC), according to a new study presented at the European Society for Medical Oncology conference this weekend in Copenhagen, Denmark.

In a phase 3 clinical trial, previously untreated patients given the drug, known as pembrolizumab, responded more frequently, lived longer, and showed fewer indications of disease progression after 10 months than patients undergoing chemotherapy. More specifically, the drug was found to cut the chance of disease progression in half and lower overall deaths by 40 percent, as opposed to treatment with chemotherapy alone.

Amazing Outcomes

The outcomes were so brilliant that scientists paused the trial so all patients under their care could switch to the drug. The findings of the trial have also been published in the New England Journal of Medicine.

“Remember this day. It’s a new day for lung cancer treatment,” Stefan Zimmermann of the University Hospital in Lausanne, Switzerland told reporters at the European conference.

Pembrolizumab, developed by the company Merck under the brand name Keytruda, has been approved as a second-line medical treatment for particular advanced head-and-neck cancers by the FDA, but its value as a first-line treatment had not yet been proven. In an earlier trial, a comparable drug called Opdivo did not hit its expected target, with outcomes revealing it was no more effective than chemotherapy for treating lung cancer.

The new trial, however, was designed for patients with a specific kind of NSCLC, in which the majority of the cancer cells express PD-L1 – a protein that blocks white blood cells from needlessly killing off healthy cells but can also help cancer cells avoid destruction. According to the research team, about a quarter of advanced NSCLC cases meet this criterion.

For these patients, Keytruda decreased the risk of death during the study by around 40 percent as opposed to conventional medical care. A smaller trial of Keytruda discovered that combined treatment with chemotherapy proved even more successful than either therapy by itself.

—–

Image credit: Thinkstock

Changing human ‘flavor’ to mosquitoes could eradicate malaria

From malaria to Zika, the West Nile virus to yellow fever, mosquitoes transmit some of the most potent and potentially dangerous diseases on the planet, but research published earlier this month in the journal Nature Communications has identified a novel way to combat some of them.

In the study, Dr. Christopher Potter, an assistant professor of neuroscience at the Johns Hopkins University School of Medicine, and his colleagues reported on the identification of a specialized region of the insect’s brain that combines smells and tastes to create its favorite flavors.

Using this knowledge, they propose the possibility that a substance could be used to make the “flavor” of humans repulsive to malaria-bearing mosquito species. By making us taste bad, the researchers explained, they could discourage the insects from biting people, thus preventing the spread of a disease that kills a reported 450,000 men and women annually.

“All mosquitoes, including the one that transmits malaria, use their sense of smell to find a host for a blood meal,” Dr. Potter said Monday in a statement. “Our goal is to let the mosquitoes tell us what smells they find repulsive and use those to keep them from biting us.”

A mosquito with modified olfactory neurons (Olena Riabinina and Courtney Akitake, Johns Hopkins Medicine)

New technique identifies the neurons that process smell, taste

Smell is essential to mosquito survival, the study authors explained. For this reason, each of the insects uses three distinct sets of organs to detect odors: two antennae, two maxillary palps (thick appendages protruding from the bottom of their heads) and two labella (regions that contain both olfactory and taste-sensing neurons and are located at the tip of the proboscis).

Dr. Potter and his team focused their studies on the female Anopheles gambiae mosquito, which is the species of insect responsible for spreading malaria by transmitting infectious parasites, by learning more about how the creature receives and processes olfactory data through each of these different sensory regions, and which regions of the brain this information travels to.

Using a powerful genetic technique that the researchers claim had never been used successfully in mosquitoes before, they made the neurons which detect complex odors through a type of protein known as an odorant receptor (OR) glow green. These OR neurons, Dr. Potter’s group explained, have been shown to distinguish humans from other types of warm-blooded mammals in the Zika-carrying Aedes aegypti mosquito.

“This is the first time researchers managed to specifically target sensory neurons in mosquitoes. Previously, we had to use flies as a proxy for all insects, but now we can directly study the sense of smell in the insects that spread malaria,” explained lead author Dr. Olena Riabinina, currently a postdoctoral fellow at Imperial College London. “We were pleasantly surprised by how well our genetic technique worked and how easy it is now to see the smell-detecting neurons.”

Discovery could lead to development of new type of insect repellant

The researchers discovered that OR neurons from the antennae and maxillary palps traveled to the antennal lobes, a symmetrical part of the brain which serves a similar function in flies. Much to their surprise, however, they found that OR neurons from the labella were sent to a part of the brain previously unassociated with smell known as the subesophageal zone.

Since the subesophageal zone had previously been linked to the sense of taste, Dr. Potter said that the new discovery “suggests that perhaps mosquitoes don’t just like our smell, but also our flavor. It’s likely that the odorants coming off our skin are picked up by the labella and influence the preferred taste of our skin, especially when the mosquito is looking for a place to bite.”

This discovery could provide scientists with a new way to repel these disease-carrying insects. Since mosquitoes use their labella to directly probe a person’s skin before biting them, the new research suggests that some substance could be used to target the labellar neurons, causing the insects to be disgusted by the smell or taste of our skin before actually breaking it.

Dr. Potter believes that such a repellant could be combined with another substance that keeps mosquitoes at bay by targeting the antennal neurons, and that the genetic system his team came up with could make it easier to find odorants that are “safe and pleasant-smelling” for humans, but which are “strongly repellant to mosquitoes at very low concentrations.”

They also plan to further analyze the neurons of these mosquitoes to learn how signals from all three of the insects’ olfactory receptors interact to influence their behavior. “We’d like to figure out what regions and neurons in the brain lead to this combined effect,” Dr. Potter noted. “If we can identify them, perhaps we could also stop them from working.”

—–

Image credit: Thinkstock

Six-year investigation leads to discovery of new ichthyosaur species

A six-year investigation into a large collection of marine reptile fossils has paid off in a big way for a group of British and American paleontologists, who this week announced the discovery of two previously unknown species of Jurassic Era sea-dwellers known as ichthyosaurs.

With their discovery, the total number of species of these so-called “sea dragons,” which lived approximately 200 million years ago and grew to lengths of up to 15 meters (over 49 feet), has now reached six, BBC News science reporter Helen Briggs wrote in a Thursday article.

Manchester University paleontologist Dean Lomax and his colleague Judy Massare at the New York-based College at Brockport spent six years examining several hundred ichthyosaur fossils in both Europe and North America, including some which had not been seen in decades.

One of the new species was identified using fossils that had been on display at the University of Bristol’s School of Earth Sciences for many years, while the other was part of a fossil collection that was sent to the US and later donated to the Academy of Natural Sciences in Philadelphia.

As Lomax, who along with Massare detailed the findings in the journal Papers in Palaeontology, explained earlier this week in a statement, “It is our hope that other similar fossils will be found in uninspected collections and brought to the attention of paleontologists… Who knows what else is waiting to be (re)discovered?”

Newfound specimens were essentially hiding in plain sight

Both sets of fossils used to describe the new species originated from Somerset in southwestern England, the researchers told BBC News. The first has been on display at Bristol University, in direct view of countless students and faculty members, for several decades, and has been named Ichthyosaurus larkini in honor of UK paleontologist Nigel Larkin.

The second specimen is believed to have been collected from a quarry in Glastonbury, Somerset, in the 1840s by a researcher named Edward Wilson. Wilson likely sent the fossils to Delaware so that his brother, Dr. Thomas Wilson, could study them. In 1847, Thomas donated the remains to the Academy of Natural Sciences, where it quietly remained in storage for decades.

The researchers decided to name the Philadelphia species Ichthyosaurus somersetensis in honor of Somerset county, which Lomax explained has been the site of many “sea dragon” fossil finds over the years. Lomax went on to call this particular specimen “in my opinion… the best example of Ichthyosaurus collected to date.” Long kept in storage, it will now be put on display.

“These are two new species – brand new species to science,” Lomax told Briggs. “They show that during the early Jurassic – around 200 million years ago – the ichthyosaur, and specifically this particular type, was a lot more diverse than previously thought.”

—–

Image credit: Wikimedia Commons