NOAA Releases Whale-Locating App for iPhone & iPad

Piloting a ship and trying to avoid colliding with a group of endangered whales? There’s an app for that.

According to Reuters reporter Ros Krasny, the iOS app is known as “Whale Alert” and is available as a free download to all iPhone 3 and iPad users. It was created by the National Oceanic and Atmospheric Administration (NOAA), along with the National Park Service (NPS), the Coast Guard, and other government organizations, educational institutions, and conservation groups.

Designed to protect the North Atlantic right whale, of which an estimated 350 to 550 remain alive in the world, the app uses global positioning system (GPS) data and other technology to transmit the latest information on where the whales are gathering, Krasny said. That information is then overlaid on NOAA digital charts and displayed on the user’s smartphone or tablet, allowing captains to change course to avoid the massive mammals, or to slow down in the hope that the whales will move on.
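
At its core, that workflow is a proximity check between the ship’s GPS fix and reported whale locations. A minimal sketch of such a check in Python (not the app’s actual code; the sighting coordinates and 20 km alert radius below are invented for the example):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def whales_in_range(ship_pos, sightings, radius_km=20.0):
    """Return the reported sightings within radius_km of the ship's GPS fix."""
    lat, lon = ship_pos
    return [s for s in sightings
            if haversine_km(lat, lon, s["lat"], s["lon"]) <= radius_km]

# Hypothetical sighting reports near the approach to Boston Harbor
sightings = [{"id": "buoy-3", "lat": 42.33, "lon": -70.55},
             {"id": "buoy-7", "lat": 41.80, "lon": -69.90}]
alerts = whales_in_range((42.30, -70.60), sightings, radius_km=20.0)
```

Any sighting returned by `whales_in_range` would trigger an on-chart alert; the real app layers these results over NOAA digital charts rather than returning a list.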

“Whale Alert represents an innovative collaboration to protect this critically endangered species,” David Wiley, NOAA’s Stellwagen Bank National Marine Sanctuary research coordinator and project lead, said in a statement Wednesday. “Whale conservation is greater than any one organization and this project shows how many organizations can unite for a good cause.”

The app, which took 18 months to develop, was designed specifically with cruise and shipping vessels traveling along the eastern coast of North America in mind, Wired’s Alexandra Chang said.

Prior to the release of Whale Alert, captains who wished to receive whale conservation data had to rely on radio, email, or faxes to learn where the animals were, and determining the whales’ location relative to the requesting ship took considerable time. However, as Patrick Ramage, the Global Whale Program Director for the International Fund for Animal Welfare (IFAW), one of the organizations that worked on the app, told Chang, the new app “integrates all of this information and puts it in real time.”

“iPad technology is changing ‘I can’t’ to ‘I can,'” he added.

Whale Alert was co-developed by EarthNC, a Florida-based company specializing in spatial mapping systems and the integration of data like weather feeds in mobile apps, and Gaia GPS, an iOS and Android software developer focusing on topographical mapping apps and route-planning applications.

Whale Alert can be downloaded free of charge from the App store. More information on Whale Alert and the groups responsible for its development can be found at http://stellwagen.noaa.gov/protect/whalealert.html.

Image 1: North Atlantic right whales are one of the world’s rarest large animals and are on the brink of extinction. Credit: NOAA, taken under NOAA Fisheries Permit #605-1904

Image 2: For the first time, mariners operating along the U.S. east coast can receive a visual display of all relevant right whale management initiatives and warnings via their iPad or iPhone, including Seasonal and Dynamic Management Areas, Mandatory Ship Reporting areas, recommended routes, and automatic whale alerts triggered by acoustic detection buoys. A GPS system in the iPad shows the ship’s location relative to the management measures, simplifying mariner compliance. Clicking on a screen or icon activates a pop-up window with additional information. The accompanying image shows the iPad display as seen by a mariner approaching Boston Harbor. Credit: NOAA

Space Archive, Supernova Named in Honor of Maryland Senator

[ Watch the Video ]

One of the world’s largest astronomy archives and a supernova have both been named in honor of the US Senator currently serving as chairwoman of the Appropriations Subcommittee on Commerce, Justice, Science and Related Agencies, various media outlets have reported.

According to a Thursday UPI article, a facility which is home to astronomical observations of various stars, planets, and galaxies originating from the Hubble Space Telescope and 15 other NASA space science missions will be named the Barbara A. Mikulski Archive for Space Telescopes (MAST) in honor of the Senator, a Democrat from Maryland.

“In celebration of Sen. Mikulski’s career-long achievements, and particularly this year, becoming the longest-serving woman in U.S. Congressional history, we sought NASA’s permission to establish the Senator’s permanent legacy to science by naming the optical and ultraviolet data archive housed here at the Institute in her honor,” Matt Mountain, director of the Space Telescope Science Institute (STScI), the organization which maintains the archives, said in a statement.

MAST is currently home to an estimated 200 terabytes of data, which according to STScI is nearly the same amount of information contained in the U.S. Library of Congress. Among the content housed there are all of NASA’s optical and ultraviolet light observations from the past three decades, added Space.com.

Furthermore, an exploding star first spotted by Hubble on January 25, 2012, has also been named for the Maryland Senator. The newly christened Supernova Mikulski, which is 7.4 billion light-years away, has a mass eight times that of our Sun, according to an STScI press release. The star was named by Nobel Laureate Dr. Adam Riess and the team of supernova hunters currently working with him.

“I’m humbled and honored to be recognized by our nation’s top scientists and innovators as a fighter for science and research,” Mikulski said in a statement. “I believe in American exceptionalism; not just because we say we are, but because of our investment in innovation. Through innovation, America has led the way in scientific breakthroughs and discoveries, which inspire future scientists, inventors, and entrepreneurs.”

“I am proud to be the namesake of the archive at the Space Telescope Science Institute, which is the enduring legacy of Hubble, and will allow us to peer even further into the origins of the universe after the launch of the James Webb Space Telescope,” she added.

Image 1: This is a view of the many computers that are part of the Barbara A. Mikulski Archive for Space Telescopes (MAST), located at the Space Telescope Science Institute in Baltimore, Md. The archive is named in honor of the United States Senator from Maryland for her career-long achievements and for becoming the longest-serving woman in U.S. Congressional history. MAST is NASA’s repository for all of its optical and ultraviolet-light observations, some of which date to the early 1970s. The archive holds data from 16 NASA telescopes, including current missions such as the Hubble Space Telescope. Senator Mikulski is at picture center, STScI Director Matt Mountain at her right, and STScI Deputy Director Kathryn Flanagan at her left. The plaque is a photo of Supernova Mikulski, an exploding star that the Hubble Space Telescope spotted on Jan. 25, 2012. It was named in honor of the Senator by Nobel Laureate Adam Riess and the supernova search team with which he is currently working. Credit: NASA, ESA, STScI, and J. Coyle (Coyle Studios)

Image 2: Supernova Mikulski. Credit: NASA, ESA, S. Faber (UCSC), A. Riess (JHU/STScI), S. Rodney (JHU), and the CANDELS team

Report Reveals Foods Most Likely To Be Fraudulent

An analysis of the first known public database collecting reports on food fraud and economically inspired adulteration in the industry has revealed the ingredients most likely to be at the center of such scams, the US Pharmacopeial Convention (USP), the organization that created the database, announced in an April 5 press release.

According to the USP’s review of scholarly journal reports, the full results of which were published in the April issue of the Journal of Food Science, the seven ingredients most involved in cases of food fraud are: olive oil, milk, honey, saffron, orange juice, coffee, and apple juice.

A Huffington Post report listed some of the primary adulterants in each of those seven ingredients:
– Olive oil – deodorants, corn oil, hazelnut oil and palm oil
– Milk – whey, bovine milk protein, melamine and cane sugar
– Honey – high-fructose corn syrup, glucose, fructose, and more
– Saffron – sandalwood dust, starch, yellow dye, gelatin threads, and more
– Orange juice – fungicide, grapefruit juice, marigold flower extract, corn sugar and paprika extract
– Coffee – chicory, roasted corn, caramel, malt, glucose, leguminous plants and maltodextrins
– Apple juice – arsenic, high-fructose corn syrup, raisin sweetener and synthetic malic acid

MSNBC.com’s Rob Neill notes that the study was commissioned by the Department of Homeland Security.

For the purposes of the study, food fraud is defined as a “collective term that encompasses the deliberate substitution, addition, tampering or misrepresentation of food, food ingredients or food packaging, or false or misleading statements made about a product for economic gain,” the Huffington Post added.

“This database is a critical step in protecting consumers,” Dr. John Spink of Michigan State University, one of the researchers involved in the study, said in a statement. “Food fraud and economically motivated adulteration have not received the warranted attention given the potential danger they present.”

“We recently defined these terms and now we are defining the scope and scale,” he continued. “As many do not believe a concept or risk exists if it does not appear in a scholarly journal, we believe that publication of this paper in the Journal of Food Science will allow us to advance the science of food fraud prevention.”

“Well-designed compendial testing approaches can be a very powerful tool for guarding against food fraud,” added lead author Dr. Jeffrey C. Moore. “Their potential to detect both unknown and known adulterants is a significant benefit in an environment where no one knows what harmful adulterants criminals will use to create the next generation of fake food ingredients.”

The USP Food Fraud Database can be viewed online at www.foodfraud.org.

Scientists Developing Future Of Robotics

Lee Rannals for RedOrbit.com

Researchers are trying to create a platform that would allow individuals to create and customize easy-to-use robotic devices.

The team hopes to develop a project that would automate the process of producing functional 3D devices and allow individuals to design and build functional robots from materials like a sheet of paper.

“This research envisions a whole new way of thinking about the design and manufacturing of robots, and could have a profound impact on society,” MIT Professor Daniela Rus, leader of the project and a principal investigator at the MIT Computer Science and Artificial Intelligence Lab (CSAIL), said in a recent press release. “We believe that it has the potential to transform manufacturing and to democratize access to robots.”

The project, known as “An Expedition in Computing for Compiling Printable Programmable Machines,” brings together researchers from MIT, the University of Pennsylvania and Harvard University.

The team hopes to create a platform that would allow individuals to identify household needs, then head to a local print shop to select a blueprint and customize a robotic device to solve the problem.

“This project aims to dramatically reduce the development time for a variety of useful robots, opening the doors to potential applications in manufacturing, education, personalized healthcare, and even disaster relief,” Rob Wood, an associate professor at Harvard University, said.

The team is developing an application programming interface for simple function specification and design, and also writing algorithms to allow control of the assembly of a device and its operations.

“Our goal is to develop technology that enables anyone to manufacture their own customized robot. This is truly a game changer,” Professor Vijay Kumar, who is leading the team from the University of Pennsylvania, said. “It could allow for the rapid design and manufacture of customized goods, and change the way we teach science and technology in high schools.”

The team is also creating an easy-to-use programming language environment so any individual could understand it.

They have two prototyped machines for designing, printing and programming, including an insect-like robot that could be used for exploring a contaminated area, and a gripper that could help people who have limited mobility.

“It’s really exciting to think about the kind of impact this work could have on the general population — beyond just a few select people who work in robotics,” Associate Professor Wojciech Matusik, also a principal investigator at CSAIL, said.

In an overview of the project on their website, the researchers wrote that they aim to transform manufacturing as dramatically as personal computers have transformed how we communicate.

“Rescuers engaged in humanitarian aid and disaster reliefs in remote locations could minimize their logistic needs on-site,” they wrote. “Warehouses of spare and replacement parts that may never be used could be replaced by storing only their designs digitally, not the physical parts themselves.”

Image Caption: An insect-like robot designed and printed using the new process, which is being developed to revolutionize the way robots are made. The robot could be used for exploring areas inaccessible to humans. Photo By Jason Dorfman, CSAIL/MIT

Were Easter Eggs Inspired By Dinosaurs?

A new study comparing the eggs of various biological species has determined that some types of Easter eggs purchased in the marketplace may actually have been inspired by dinosaurs, not birds.

According to an Asian News International (ANI) report, research conducted by paleontologists in Spain and the UK analyzed fossil eggs estimated to be 70 million years old that were discovered in the Pyrenees. Their goal was to determine whether or not the eggs in question had been laid by birds or dinosaurs.

Building upon that initial study, a team from the University of Leicester in England compared the shapes of various Easter eggs to both actual bird and actual dinosaur eggs, in order to determine which species inspired their shapes. Their findings have been published in the most recent edition of the journal Palaeontology.

“We found that different species have different shaped eggs, and that the eggs of dinosaurs are not the same shape as the eggs of birds,” Enric Vicens of the Universitat Autonoma of Barcelona, co-author of the first study comparing dinosaur and modern bird eggs, said in a statement Wednesday.

To make that comparison possible, Vicens and co-author Nieves Lopez-Martinez of the Universidad Complutense of Madrid developed a mathematical formula to describe all possible egg shapes. They then applied their formula to actual eggs and determined that dinosaur eggs “tend to be more elongate and less rounded than bird eggs,” as well as “more symmetrical with less distinction between the blunt and the more pointed end.”
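
The kind of comparison described above can be made concrete with a small Python sketch. The parametric model and sample dimensions below are purely illustrative, not the formula from the paper: `asym = 0` gives a symmetric, ellipse-like profile, while larger values make one end blunter, the trait the study associated with bird eggs.

```python
import math

def egg_profile(t, length, width, asym):
    """One simple parametric egg model (illustrative, not the published formula).

    t runs from 0 to 2*pi; asym = 0 yields a symmetric ellipse,
    asym > 0 skews the profile so one end is blunter than the other.
    """
    x = (length / 2) * math.cos(t)
    y = (width / 2) * math.sin(t) * (1 + asym * math.cos(t))
    return x, y

def elongation(length, width):
    """Length-to-width ratio; dinosaur eggs in the study tended to score higher."""
    return length / width

# Illustrative numbers only: a hen-like egg vs. a more elongate shape
hen_like = elongation(5.7, 4.4)    # roughly 1.3
dino_like = elongation(18.0, 10.0)  # 1.8
```

Plotting `egg_profile` over a grid of `asym` and `elongation` values is one way to build the kind of shape space the researchers compared real and chocolate eggs against.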

Building upon Vicens and Lopez-Martinez’s work, Mark Purnell, a professor at the University of Leicester’s Department of Geology and a council member of the Palaeontological Association, attempted to discover which types of eggs provided inspiration for the Easter eggs commonly sold at shops around the world (and delivered to children by the Easter Bunny). He determined that there was a far greater diversity in the shapes of the eggs than there would be if a single species served as the universal source of them all.

“Many of the smaller eggs to be found commonly on the UK High Street are very similar in shape to hen’s eggs, providing strong clues to their original source. Others are more similar in shape to condor eggs,” he said. “Perhaps more surprisingly a few eggs are closer in shape to those of dinosaurs, with one in particular being the same shape as the 70 million year old dinosaur egg, Sankofa pyrenaica, described by the Spanish team.”

Professor Darla Zelenitsky of the University of Calgary, an expert on dinosaurs and their eggs who did not work on the study, said, “It is really exciting to find these additional links between extinct dinosaurs and living birds — birds are living dinosaurs so it makes perfect sense that their eggs share such similarities… Paleontologists have long suggested that small early mammals might have raided the nests of dinosaurs. Generally, the idea is that they stole the eggs for food, but if the evidence of this Easter egg research is reliable, perhaps early mammals had more playful and colorful motives.”

Image 2: The pale gray eggs are from birds, the darker gray eggs are from dinosaurs. Most Easter eggs, as shown on the right, are similar in shape to bird’s eggs, but some are closer to the eggs of dinosaurs. The Easter egg on the left is particularly close to the newly described egg Sankofa. Credit: Mark Purnell, University of Leicester

NASA Extends Kepler, Spitzer, Planck Missions

NASA is extending three missions affiliated with the Jet Propulsion Laboratory in Pasadena, Calif. — Kepler, the Spitzer Space Telescope and the U.S. portion of the European Space Agency’s Planck mission — as a result of the 2012 Senior Review of Astrophysics Missions.

The 2012 NASA Senior Review report, which includes these three missions and six others also being extended, is available at: http://science.nasa.gov/astrophysics/2012-senior-review/ .

“This means scientists can continue using the three spacecraft to study everything from the birth of the universe with Planck, and galaxies, stars, planets, comets and asteroids with Spitzer, while Kepler is determining what percentage of sun-like stars host potentially habitable Earth-like planets,” said Michael Werner, the chief scientist for astronomy and physics at JPL.

Kepler has been approved for extension through fiscal year 2016, which ends Sept. 30, 2016. All fiscal year 2015 and 2016 decisions are for planning purposes and will be revisited in the 2014 Senior Review. The extension provides four additional years to find Earth-size planets in the habitable zone — the region in a planetary system where liquid water could exist on the surface of the orbiting planet — around sun-like stars in our galaxy.
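
The habitable-zone boundaries described above scale with a star’s brightness, since the flux a planet receives falls off with the square of its distance. A rough back-of-the-envelope sketch in Python (the 0.95 and 1.37 AU boundaries for a Sun-like star are commonly cited approximations, not figures from the mission announcement):

```python
import math

# Approximate habitable-zone boundaries for a Sun-like star, in AU.
# These particular numbers are illustrative assumptions.
INNER_AU = 0.95
OUTER_AU = 1.37

def habitable_zone_au(luminosity_solar):
    """Scale the Sun's habitable-zone boundaries by sqrt(L / L_sun).

    Received flux ~ L / d^2, so the distance at which a planet gets
    Earth-like flux grows with the square root of luminosity.
    """
    scale = math.sqrt(luminosity_solar)
    return INNER_AU * scale, OUTER_AU * scale

sun_hz = habitable_zone_au(1.0)        # (0.95, 1.37) AU
dim_star_hz = habitable_zone_au(0.25)  # a quarter as bright: zone at half the distance
```

For a star a quarter as luminous as the Sun, the zone sits at roughly half the Sun’s distances, which is why Kepler can find potentially habitable planets on much shorter orbits around dimmer stars.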

Spitzer, launched in 2003, continues to provide the astronomical community with its unique infrared images. It has continued to explore the cosmos since running out of coolant, as expected, in 2009. Among its many duties during its warm mission, the observatory is probing the atmospheres of planets beyond our sun and investigating the glow of some of the most distant galaxies known. As requested by the project, Spitzer received two additional years of operations. Like other NASA missions, the Spitzer team will be able to apply for a further extension in 2014.

NASA will fund one additional year of U.S. participation in the European Space Agency’s Planck mission, for the U.S. Planck data center and for operations of Planck’s Low Frequency Instrument. Planck, launched in 2009, is gathering data from the very early universe, shortly after its explosive birth in a big bang. Planck’s observations are yielding insight into the origin, evolution and fate of our universe. The U.S. Planck team will apply for additional funding after a third data release has been approved by the European consortiums.

Ames Research Center, Moffett Field, Calif., manages Kepler’s ground system development, mission operations and science data analysis. JPL managed the Kepler mission’s development. Ball Aerospace & Technologies Corp. in Boulder, Colo., developed the Kepler flight system and supports mission operations with the Laboratory for Atmospheric and Space Physics at the University of Colorado in Boulder. The Space Telescope Science Institute in Baltimore archives, hosts and distributes Kepler science data. Kepler is NASA’s 10th Discovery mission and is funded by NASA’s Science Mission Directorate at the agency’s headquarters in Washington. For more information about the Kepler mission, visit: http://www.nasa.gov/kepler and http://kepler.nasa.gov .

JPL manages the Spitzer Space Telescope mission for NASA’s Science Mission Directorate, Washington. Science operations are conducted at the Spitzer Science Center at the California Institute of Technology in Pasadena. Data are archived at the Infrared Science Archive housed at the Infrared Processing and Analysis Center at Caltech. For more information about Spitzer, visit: http://spitzer.caltech.edu and http://www.nasa.gov/spitzer .

Planck is a European Space Agency mission, with significant participation from NASA. NASA’s Planck Project Office is based at JPL. JPL contributed mission-enabling technology for both of Planck’s science instruments. European, Canadian and U.S. Planck scientists will work together to analyze the Planck data. More information is online at: http://www.nasa.gov/planck and http://www.esa.int/planck .

The California Institute of Technology in Pasadena, Calif., manages JPL for NASA.

Image Caption: From left to right, artist’s concepts of the Spitzer, Planck and Kepler space telescopes. NASA extended Spitzer and Kepler for two additional years; and the U.S. portion of Planck, a European Space Agency mission, for one year. The relative sizes of the artist’s concepts are not to scale. Image credit: NASA/JPL-Caltech

New Research Links Gene Mutations To Autism

Researchers reported in the journal Nature on Wednesday that about 15 percent of autism cases in families with a single autistic child are associated with spontaneous mutations that occur in sex cells.

Several papers published in the April 4 issue of the journal show how gene mutations may contribute to the development of autism.

For the studies, the researchers sequenced the DNA of about 1,000 families, each of which had an autistic child.

The gene mutations identified in the study were discovered with a new genomics technology known as exome sequencing.

One group of researchers from Yale University said that with further characterization of the genes and sequencing of genes in thousands of families, they will be able to develop novel therapeutics and preventive strategies for autism.

“We now have a good sense of the large number of genes involved in autism and have discovered about 10 percent of them,” Joseph Buxbaum, PhD, Director of the Seaver Autism Center, said in a statement. “We need to study many more parents and their affected children if we are to uncover the genes important in ASD. As these genes are further characterized, this will lead to earlier diagnosis and novel drug development.”

The studies all found that de novo mutations account for a substantial fraction of the risk for autism. These mutations appear in affected children for the first time, arising from errors during the production of sperm or egg cells.
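
In simplified form, identifying a candidate de novo variant amounts to comparing a child’s genotype with both parents’ genotypes at the same site. The sketch below is a toy illustration assuming clean genotype calls; real exome pipelines also model sequencing error, coverage, and inheritance patterns:

```python
def de_novo_alleles(child, mother, father):
    """Return alleles present in the child but in neither parent.

    Genotypes are modeled as sets of alleles at one site, e.g. {"A", "G"}.
    An allele in the child that neither parent carries is a candidate
    de novo mutation.
    """
    return child - (mother | father)

# Toy genotypes at a single site
child = {"A", "T"}
mother = {"A"}
father = {"A", "G"}
novel = de_novo_alleles(child, mother, father)  # {"T"} is the candidate
```

Exome sequencing makes this comparison feasible at scale because it reads only the protein-coding regions, where the studies looked for protein-altering de novo point mutations.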

“When the same mutations are found in multiple affected children and none are found in children without autism, we believe that we have identified mutations that collectively affect a higher proportion of individuals with autism,” Buxbaum said in a press release.

Researchers found that less than half of the autism cases studied carried a potentially protein-altering de novo point mutation.

“These data suggest that there is a role for de novo point mutations in the coding region of the genome for autism, but they do not constitute a sufficient cause,” Benjamin Neale, a research affiliate at the Broad Institute and an assistant in genetics at MGH, said in a press release. “That is to say, most de novo variants do not fully explain the disorder in an individual.”

The researchers were also able to link variations in three specific genes to a markedly increased risk for autism.

“Prior to the advent of new DNA sequencing technology, we were largely wandering in the dark searching for autism genes,” Matthew State, senior author of one of the papers, said in a press release. “Now we are getting a clear view of the genetic landscape and finally have the tools in hand to find a large proportion of the many genes contributing to autism.”

Twin studies also help reveal a strong genetic component to autism, but a large number of cases occur in families with no history of the condition.

The Yale study found that de novo mutations occurred more frequently in children born to older fathers, offering a partial explanation for the increased risk for autism in children of older parents.

The team believes that the percentage of autism cases linked to these de novo mutations will increase.

“With every new gene we discover, we learn more about potential treatments for patients with autism,” Stephan Sanders, lead author of one of the papers in Nature, said in a press release.

ESA, NASA Join Forces To Measure Arctic Sea Ice

Marking another remarkable collaborative effort, ESA and NASA aircraft met up over the Arctic Ocean this week to perform carefully coordinated flights directly under CryoSat orbiting above. The data gathered help ensure the accuracy of ESA’s ice mission.

The aim of this large-scale campaign was to record sea-ice thickness and the condition of the ice exactly along the line traced by ESA’s CryoSat satellite orbiting high above. A range of sensors installed on the different aircraft was used to gather complementary information.

These airborne instruments included simple cameras to provide a visual record of the sea ice, laser scanners to map the height of the ice, an ice-thickness sensor called EM-Bird, ESA’s sophisticated radar altimeter called ASIRAS, and NASA’s snow and Ku-band radars, which mimic CryoSat’s measurements but at a higher resolution.

In orbit for two years, CryoSat carries the first radar altimeter of its kind to monitor changes in the thickness of ice.

As with any Earth observation mission, it is important to validate the readings acquired from space. This involves comparing the satellite data with measurements taken in situ, usually on the ground and from the air.

The teams of scientists from Europe, the US and Canada expect that by pooling flight time and results they will achieve much-improved accuracy in the global ice-thickness trends measured by CryoSat and NASA’s ICESat.

This will, in turn, lead to a better understanding of the impact of climate change on the Arctic environment.

Rene Forsberg, from the Technical University of Denmark’s National Space Institute, said, “As a scientist I value the collaboration very much.

“Data from a particular instrument provides one piece of the puzzle. Through experience in combining gravity and altimetry measurements over ice sheets, I’ve found that by combining measurements from different instruments you can solve the puzzle more easily and move forward.”

Coordinated campaign activities in these extremely cold and remote locations pose numerous challenges. The most obvious is the extreme weather. While much of Europe and North America is now enjoying spring weather, temperatures in the high Arctic still often dip below -30°C.

These cold temperatures present challenges in running the aircraft and the complex scientific instruments on board, and, of course, for the participants.

Distance and time zones are another challenge, because the NASA team is located in Thule, Greenland, and the ESA team is in Alert, Canada. Last, but not least, ESA’s satellite operations, such as orbit maneuvers and instrument settings, need to be coordinated with field activities to maximize the scientific return.

Despite these and many other challenges, the joint flights proved a resounding success.

On two occasions during the past week, as the CryoSat satellite came over the horizon on the far side of the Arctic Ocean and raced across the frozen sea at over 6 km per second, the ESA and NASA planes met up along the coast and headed out over the frozen water within meters of each other to follow the line traced by CryoSat.

NASA’s IceBridge project scientist, Michael Studinger, said, “The joint ESA/NASA campaign has been incredibly successful again.

“It would be easy to view such a success as ordinary and lose sight of how difficult this whole undertaking really is. The skill and experience of all the teams involved is the foundation for safety of operations and success in such an extreme environment.”

Malcolm Davidson, ESA’s CryoSat Validation Manager, added, “By joining forces and pooling their efforts, ESA and NASA are able to achieve much more than each agency could separately.

“The joint activities this week provide a vivid illustration of the many synergies that such a collaboration brings.”

Image Caption: A view of Arctic sea ice from NASA’s P-3 aircraft as it joins ESA to validate measurements of the ice taken from space by CryoSat. On 2 April, ESA and NASA planes flew together across the Arctic Ocean, exactly under CryoSat orbiting 700 km above. Credits: NASA/M. Studinger

Gladstone Scientists Find Increased ApoE Protein Levels May Promote Alzheimer’s Disease

Discovery challenges current thinking and points to new therapies

Scientists at the Gladstone Institutes have enhanced our understanding of how a protein linked to Alzheimer’s disease keeps young brains healthy, but can damage them later in life, suggesting new research avenues for treating this devastating disease.

In the Journal of Neuroscience, available online today, researchers in the laboratory of Yadong Huang, MD, PhD, have uncovered the distinct roles that the apoE protein plays in young vs. aging brains. These findings, which could inform the future of Alzheimer’s drug development, come at a time of unprecedented challenge and need.

“By the year 2030, more than 60 million people worldwide will likely be diagnosed with Alzheimer’s, but we are still grappling with the disease’s underlying biological mechanisms,” said Dr. Huang, an Alzheimer’s expert at Gladstone, an independent and nonprofit biomedical-research organization. “However, with this research we’ve shed new light on these complex processes, and how we could modify these processes to fight this disease.”

The molecular mechanisms behind Alzheimer’s have long evaded scientists. Early studies found that different types, or variants, of the apoE gene, including apoE3 and apoE4, influence one’s genetic risk for developing the disease. The apoE4 variant is the major genetic risk factor for the disease, while the apoE3 variant is less risky and far more common. From among these variants, everyone inherits two, one from each parent, that provide a blueprint for making the protein known simply as apoE. Previous findings revealed a complicated interplay between apoE and another protein called amyloid-beta (Aβ), which is present in increased quantities in the brains of Alzheimer’s patients, but the exact nature of this complex relationship remains unclear.

Recent research by another group found that a drug that boosted apoE protein levels also reversed the build-up of Aβ in mice genetically modified to mimic Alzheimer’s. So some scientists have theorized that boosting apoE levels could be beneficial in slowing the disease’s progression in humans, and several groups have begun to explore this therapeutic strategy.

In this study, Dr. Huang and his team tested this idea. They genetically modified mice to have either human apoE3 or apoE4 and then monitored them for any subsequent build-up of toxic Aβ in their brains as they aged.

“We thought a straightforward relationship existed between apoE protein levels and Aβ, and that boosting apoE levels in these mice would promote, not halt, the build-up of Aβ,” explained Gladstone Postdoctoral Fellow and lead author Nga Bien-Ly, PhD.

The team’s experiments revealed both surprising and intricate roles for apoE. In young mice, apoE proteins produced by all variants of the apoE gene–even the risky apoE4 variant–are essential as the protein they build helps clear away excessive amounts of Aβ. But as the mice aged, this process began to malfunction–especially in those mice with two copies of the apoE4 gene but also in mice with two copies of apoE3. As apoE protein levels rose, Aβ began to accumulate. But in mice mutated to have only one copy of the apoE gene–either apoE3 or apoE4–apoE protein levels dropped by half and Aβ build-up was reduced. These results indicated that Aβ build-up isn’t associated only with a specific apoE variant, but instead is also related to the overall amount of apoE protein produced as the brain ages.

“Our findings suggest that reducing levels of proteins produced by either apoE3 or apoE4–rather than raising them–could be key to lowering Aβ build-up in the brain,” said Dr. Huang, who is also an associate professor of neurology at the University of California, San Francisco, with which Gladstone is affiliated. “We hope that our research could spur new therapies that successfully combat Alzheimer’s at the molecular level–putting us one step ahead of this deadly disease.”

Is Fertilizer To Blame For Global Warming?

Chemists from the University of California, Berkeley have discovered evidence of a link between increased fertilizer use and a rise in atmospheric nitrous oxide, a major greenhouse gas.

Climate scientists have long assumed that rising nitrous oxide levels in the atmosphere were due in part to nitrogen-based fertilizers, which stimulate microbes in the soil to convert nitrogen to nitrous oxide at a faster rate.

Published in the April issue of the journal Nature Geoscience, the new study uses nitrogen isotope data to place the blame directly on these nitrogen-rich fertilizers.

“Our study is the first to show empirically from the data at hand alone that the nitrogen isotope ratio in the atmosphere and how it has changed over time is a fingerprint of fertilizer use,” said study leader Kristie Boering, a UC Berkeley professor of chemistry and of earth and planetary science, according to a Berkeley press release.

“We are not vilifying fertilizer. We can’t just stop using fertilizer,” she added. “But we hope this study will contribute to changes in fertilizer use and agricultural practices that will help to mitigate the release of nitrous oxide into the atmosphere.”

Nitrous oxide is among the most potent of the greenhouse gases, trapping heat and contributing to global warming. It has also been found to destroy stratospheric ozone, which protects the Earth from ultraviolet rays.

Atmospheric nitrous oxide levels have increased dramatically in the past 50 years, according to the study. Part of this increase coincided with the Green Revolution of the 1960s: as the global population grew, cheap synthetic fertilizers became widely available, boosting food production all over the world.

Boering and her colleagues, former UC Berkeley graduate students Sunyoung Park and Phillip Croteau, gathered samples of “firn air” (air trapped in consolidated Antarctic snow) dating from 1940 to 2005, as well as samples from an atmospheric monitoring station at Cape Grim in Tasmania.

Analysis of this data confirmed a previously discovered seasonal cycle in nitrous oxide concentration. What surprised the researchers, however, were measurements made with a sensitive isotope ratio mass spectrometer, which revealed a seasonal cycle in the isotopic composition of nitrous oxide as well, something that had not been detected before. At Cape Grim, the seasonal cycle is due in part to the circulation of air returning from the stratosphere, where nitrous oxide is destroyed after an average of 120 years, and in part to seasonal changes in the ocean.

“The fact that the isotopic composition of N2O shows a coherent signal in space and time is exciting, because now you have a way to differentiate agricultural N2O from natural ocean N2O from Amazon forest emissions from N2O returning from the stratosphere,” Boering said. “In addition, you also now have a way to check whether your international neighbors are abiding by agreements they’ve made to mitigate N2O emissions.”

Based on these results, the UC Berkeley researchers suggest that limiting nitrous oxide emissions, such as those associated with fertilizer use, could be a first step in reducing greenhouse gases and the effects of global warming.

According to a Berkeley press release about these findings, Boering said, “Limiting N2O emissions can buy us a little more time in figuring out how to reduce CO2 emissions.”

Image 1: Law Dome, Antarctica. Air trapped in the consolidated snow from this region provides historical air samples going back to 1940.

Image 2: The Cape Grim Baseline Air Pollution Station in Tasmania, where air samples have been collected since 1978. These samples show a long-term trend in isotopic composition that confirms that nitrogen-based fertilizer is largely responsible for the 20 percent increase in atmospheric nitrous oxide since the Industrial Revolution. Photo courtesy of CSIRO.

New Study Examines Teen Alcohol And Illicit Drug Use

Most US teenagers have used alcohol and drugs by the time they reach adulthood, and more than 15 percent of them meet the criteria for substance abuse, according to a new survey published in the April issue of the Archives of General Psychiatry.

The survey of more than 10,000 US teens found that roughly 4 out of 5 (78.2 percent) had tried alcohol before the age of 18. The results also showed that some 18 percent of adults meet the criteria for “lifetime abuse” of alcohol and 11 percent meet the criteria for drug abuse, with onset of abuse starting in the teenage years for many.

“It’s in adolescence that the onset of substance abuse disorders occurs for most individuals,” lead author Joel Swendsen, director of research at the National Center of Scientific Research in Bordeaux, France, told Reuters. “That’s where the roots take place.”

For the study, Swendsen and colleagues examined the frequency, age of onset, and socio-demographic factors related to alcohol and drug use and abuse by US teens. The cross-sectional survey included a nationally representative sample of 10,123 adolescents ages 13 to 18. The study and survey were conducted between February 2001 and January 2004.

Researchers found that the median age at onset was 14 for regular alcohol use with or without dependence, 14 for drug abuse with dependence, and 15 for drug abuse without dependence.

“Because the early onset of substance use is a significant predictor of substance use behavior and disorders in a lifespan, the public health implications of the current findings are far reaching,” the study authors noted.

Based on the 3,700 teens in the survey who were between the ages of 13 and 14, the team found that roughly one in ten had consumed alcohol on a regular basis, defined as 12 drinks within a year. That number jumped to about one in five among the 2,300 17- to 18-year-olds surveyed.

Swendsen and colleagues said nearly one in three of the regular users in the oldest age group met the criteria for lifetime alcohol abuse.

Sixty percent of the teens surveyed said they had the opportunity to use illicit drugs, such as marijuana, cocaine, stimulants and painkillers. About one in ten 13- to 14-year-olds said they had used at least one such drug, and that figure increased to 40 percent in the oldest age group. Marijuana was the most common drug used, followed by prescription drugs.

Swendsen and colleagues noted that while the probability of alcohol and drug use increased with age, the rates were almost always lowest among black and other racial/ethnic groups compared with white or Hispanic adolescents.

“The reason we worry about [drug and alcohol use] is that the earlier [teens] use these substances, the earlier they become addicted,” Susan Foster, vice president and director of policy research and analysis at the National Center on Addiction and Substance Abuse at Columbia University in New York, told Reuters.

Foster, who was not part of the study, said using such drugs at an early age is especially dangerous because the brain is still developing. “There’s really a type of rewiring that goes on with continued use that can result in an increased interest in using and an inability to stop using,” she added.

Foster, whose organization published a comprehensive report on substance abuse in adolescents last year, said the findings of the latest study are consistent with that research. “We’ve had spikes and declines of abuse across the population,” she said.

Swendsen and colleagues also noted that prevention strategies need to target adolescents, but must also take into account the different factors that influence drug and alcohol abuse.

“We don’t need to bombard them with information that’s beyond their stage of development, but don’t think a 13-year-old doesn’t know what cannabis is,” Swendsen told Reuters.

“The prevention of both alcohol and illicit drug abuse requires strategies that target early adolescence and take into account the highly differential influence that population-based factors may exert by stage of substance use,” the authors concluded.

Need Relief From Job Stress? Take Your Dog To Work!

[ Watch the Video ]

Google on Sunday unveiled the process behind its Canine Staffing Team, or “Dooglers,” program. According to the company website, the team looks for canine applicants with a “steady paw” who can demonstrate a history of “great teamwork within a business kennel environment.”

Of course, Sunday was also April Fools’ Day, but Google is one of many Fortune 500 companies that allow employees to bring their furry companions to work, and a recent study shows that the policy may help boost job satisfaction while lowering stress levels.

New research from a Virginia Commonwealth University team, published in the International Journal of Workplace Health Management, reinforces previous studies that link the presence of pets to less stress in humans. Animals in hospitals and nursing homes have been shown to produce measurable positive results, including lower blood pressure and faster recovery times for patients.

In the VCU study, researchers compared three different groups of workers: employees who bring their dogs to work, employees who do not bring their dogs to work, and employees who do not own dogs.

“Although preliminary, this study provides the first quantitative study of the effects of employees´ pet dogs in the workplace setting on employee stress, job satisfaction, support and commitment,” said principal investigator Randolph T. Barker, Ph.D., professor of management in the VCU School of Business.

Employees at Replacements Ltd., a large china and silverware retailer, have been allowed to bring their dogs to work for several years, and the company was chosen as the site of the VCU study. Its 400-plus employees can bring man’s best friend into the office, the warehouse, and even the showroom, according to CBS News. About 20 to 30 dogs are on the company premises on any given day.

The study showed that during the course of the work day, self-reported stress declined for employees with their dogs present and increased for non-pet owners and for dog owners who did not bring their dogs to work. Stress rose notably on days when owners left their dogs at home compared with days they brought them to work. The researchers did not observe a difference among the employee groups with respect to stress hormone levels, which were measured via a saliva sample taken each morning.

According to Barker, unexpected coworker interactions occurred during the study that may contribute to employee performance and satisfaction. For example, employees without a dog were seen asking to take a co-worker’s dog out on a break. These short exchanges, which often ended with the dog being taken outside, could be seen as building positive coworker relationships and increasing the amount of stress-reducing exercise employees get.

“Pet presence may serve as a low-cost, wellness intervention readily available to many organizations and may enhance organizational satisfaction and perceptions of support. Of course, it is important to have policies in place to ensure only friendly, clean and well-behaved pets are present in the workplace,” Barker said.

Besides Replacements Ltd. and Google, Amazon, Build-a-Bear, and Clif Bar are among the many other companies that allow dogs in the workplace.

Algorithm Paves Way For Future ‘Smart Sand’

Lee Rannals for RedOrbit.com

Researchers from MIT have developed new algorithms that could enable “Smart Sand” to form into any shape.

By using the algorithm, individual Smart Sand grains would be able to communicate with one another, forming a three-dimensional object.

As messages are passed back and forth between the grains, they selectively attach to each other to form an object, while the unused grains simply fall away.

Once the object has served its purpose, the smart sand grains would detach from each other, falling back into their original state, ready to form another shape.

The researchers had to develop an algorithm that would give every grain of sand the ability to store a digital map of the object to be assembled.

During a video demonstration of the algorithm in a two-dimensional format, the grains work by first passing messages to one another to determine which of them have missing neighbors, revealing empty areas in the structure.

The empty area defines the shape to be built, and the grains that border it pass the message along so that other grains know which positions to occupy.

Once the shape is formed, the empty area is filled in, and the object formed by the Smart Sand remains.

The researchers used cubes, or “smart pebbles”, to test their algorithm on a simplified two-dimensional system.

Four faces of each cube carry electropermanent magnets, materials that can be magnetized or demagnetized with a single electric pulse.

The magnets on these cubes can be turned on and off, and do not require a constant current to maintain their magnetism.

The pebbles use the magnets not only to connect to each other, but also to communicate and share power.

Each pebble has a microprocessor that can store 32 kilobytes of program code but has only two kilobytes of working memory.

Kyle Gilpin, who worked on the project at MIT along with his professor Daniela Rus, told RedOrbit in an email that “while the Robot Pebbles aren’t going to turn into Smart Sand overnight,” it will eventually happen.

“It may be 10 years before we see modules capable of forming shapes whose resolution surpasses that of alternative fabrication methods, but we’ll see incremental improvements along the way,” Gilpin added.

“Consider how rapidly and dramatically computers have been miniaturized over the last 50 years. What used to occupy an entire room now fits on a small fraction of a fingernail. We’ll see the same advances applied to programmable matter systems as well.”

He said the original inspiration for the concept of Smart Sand came from the way a sculptor shapes stone.

“Just like a sculptor removes material from a block of stone, we remove the extra Pebble modules from the system to reveal the shape underneath,” he told RedOrbit. “In many ways, it’s much easier to remove material than it is to add it.”
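The sculpting analogy can be captured in a short toy simulation. This is a hypothetical sketch, not the MIT team's actual distributed algorithm: the function name, the grid model, and the flood-fill message relay below are illustrative assumptions. A block of modules spreads the shape map by neighbor-to-neighbor messages, then every module outside the shape detaches, leaving the object behind.

```python
# Hypothetical, simplified 2-D simulation of the subtractive "smart sand" idea.
from collections import deque

def sculpt(width, height, shape, seed=(0, 0)):
    """Return the set of modules that stay bonded.

    shape -- set of (x, y) cells that belong to the target object
    seed  -- the module that initially holds the shape map
    """
    # The map spreads outward from the seed module via neighbor messages,
    # modeled here as a breadth-first flood fill over the grid.
    informed = {seed}
    queue = deque([seed])
    while queue:
        x, y = queue.popleft()
        for nbr in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nbr
            if 0 <= nx < width and 0 <= ny < height and nbr not in informed:
                informed.add(nbr)
                queue.append(nbr)
    # Each informed module keeps its bonds only if it lies inside the shape;
    # the rest "fall away" like excess sand around a sculpture.
    return {cell for cell in informed if cell in shape}

# Sculpt an L-shape out of a 4x4 block of modules:
target = {(0, 0), (0, 1), (0, 2), (1, 0), (2, 0)}
print(sorted(sculpt(4, 4, target)))
```

In the real pebbles the "detach" step is a magnet switching off rather than a set operation, but the control flow (distribute the map, then release everything outside it) is the same subtractive idea.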

The researchers will present a paper describing their algorithms at the IEEE International Conference on Robotics and Automation in May.

Image 2: To test their algorithm, the researchers designed and built a system of ‘smart pebbles’ – cubes about 10 millimeters to an edge, with processors and magnets built in. Photo: M. Scott Brauer

Seeing Beyond The Visual Cortex


[ Watch the Video ]

Research could lead to new rehabilitative therapies when visual cortex is damaged

It’s a chilling thought–losing the sense of sight because of severe injury or damage to the brain’s visual cortex. But, is it possible to train a damaged or injured brain to “see” again after such a catastrophic injury? Yes, according to Tony Ro, a neuroscientist at the City College of New York, who is artificially recreating a condition called blindsight in his lab.

“Blindsight is a condition that some patients experience after having damage to the primary visual cortex in the back of their brains. What happens in these patients is they go cortically blind, yet they can still discriminate visual information, albeit without any awareness,” explains Ro.

While no one is ever going to say blindsight is 20/20, Ro says it holds tantalizing clues to the architecture of the brain. “There are a lot of areas in the brain that are involved with processing visual information, but without any visual awareness,” he points out. “These other parts of the brain receive input from the eyes, but they’re not allowing us to access it consciously.”

With support from the National Science Foundation’s (NSF) Directorate for Social, Behavioral and Economic Sciences, Ro is developing a clearer picture of how other parts of the brain, besides the visual cortex, respond to visual stimuli.

In order to recreate blindsight, Ro must find a volunteer who is willing to temporarily be blinded by having a powerful magnetic pulse shot right into their visual cortex. The magnetic blast disables the visual cortex and blinds the person for a split second. “That blindness occurs very shortly and very rapidly–on the order of one twentieth of a second or so,” says Ro.

On the day of Science Nation’s visit to Ro’s lab in the Hamilton Heights section of Manhattan, volunteer Lei Ai is seated in a small booth in front of a computer with instructions to keep his eyes on the screen. A round device is placed on the back of Ai’s head. Then, the booth is filled with the sound of consistent clicks, about two seconds apart. Each click is a magnetic pulse disrupting the activity in his visual cortex, blinding him. Just as the pulse blinds him, a shape, such as a diamond or a square, flashes onto a computer screen in front of him.

Ro says that 60 to nearly 100 percent of the time, test subjects report back the shape correctly. “They’ll be significantly above chance levels at discriminating those shapes, even though they’re unaware of them. Sometimes they’re nearly perfect at it,” he adds.

Ro observes what happens to other areas of Ai’s brain during the instant he is blinded and a shape is flashed on the screen. While the blindness wears off immediately with no lasting effects, according to Ro, the findings are telling. “There are likely to be a lot of alternative visual pathways that go into the brain from our eyes that process information at unconscious levels,” he says.

Ro believes understanding and mapping those alternative pathways might be the key to new rehabilitative therapies. “We have a lot of soldiers returning home who have a lot of brain damage to visual areas of the brain. We might be able to rehabilitate these patients,” he says. And that’s something worth looking into.

Bald Barbie Campaign Convinces Mattel To Produce New Doll

A social networking campaign to create a bald version of the popular Barbie doll as a tribute to cancer patients and children suffering from other disorders that cause hair loss has succeeded: the toy’s manufacturer, Mattel, has announced a special new product that will be distributed exclusively through children’s hospitals in North America.

In a statement posted to its corporate Facebook page on March 27, Mattel said, “Play is vital for children, especially during difficult times. We are pleased to share with our community that next year we will be producing a fashion doll that will be a friend of Barbie, which will include wigs, hats, scarves and other fashion accessories to provide girls with a traditional fashion play experience. For those girls who choose, the wigs and head coverings can be interchanged or completely removed.”

“We will work with our longstanding partner, the Children’s Hospital Association, to donate and distribute the dolls exclusively to children’s hospitals, directly reaching girls who are most affected by hair loss,” they added. “A limited number of dolls and monetary donations will also be made to CureSearch for Children’s Cancer and the National Alopecia Areata Foundation… we made the decision not to sell these dolls at retail stores, but rather get the dolls directly into the hands of children who can most benefit from the unique play experience.”

According to Houston Chronicle blogger Francisca Ortega, the Facebook group responsible for bringing attention to the issue, “Beautiful and Bald Barbie! Let’s see if we can get it made”, had more than 150,000 fans as of March 29.

Likewise, a Change.org petition calling for the doll’s creation had gathered nearly 35,000 signatures, and a similar effort also led to the creation of the “True Hope” line of bald dolls with fashion accessories by MGA, the manufacturer of the Bratz and Moxie Girlz brands. Those toys are due out in June, according to Ortega’s report.

“MGA’s mission is to provide joy and happiness to kids around the world. We believe children are our legacy and want them to be healthy, have confidence in their imagination and build their dreams into reality,” MGA Entertainment CEO Isaac Larian said in a February statement.

“We have a responsibility to children and we take that responsibility very seriously,” he added. “The Bratz and Moxie Girlz ‘True Hope’ dolls are designed to support and comfort young girls and boys who so bravely endure cancer treatments. MGA also wants to be an active supporter in the fight to develop lifesaving treatments for children.”

The “Bald and Beautiful Barbie” campaign was launched in January of this year by Jane Bingham and Rebecca Sypin, each of whom lost a daughter to cancer, according to Time.com’s Erin Skarda.

Their campaign was further inspired by the story of a four-year-old Long Island girl named Genesis Reyes, according to a March 29 Daily Mail report. As the UK newspaper reports, Reyes “announced that she did not feel like a princess without her hair,” leading the parents of another youngster being treated at the same medical facility to ask Mattel’s CEO, a close friend, to create a unique, bald “Genesis” doll for the girl. After hearing of Reyes’ story, the Daily Mail says, Bingham and Sypin launched their Facebook page.

When the campaign began drawing attention throughout the U.S., Mattel invited the two women to its headquarters to discuss the possibility of developing a bald Barbie-franchise doll. During that visit, Skarda says, Bingham and Sypin were informed of the company’s plans to produce such a product in the near future.

As previously reported on RedOrbit, the two friends, neither of whom was an experienced activist, said their sole goal was to convince Mattel to introduce a line of bald Barbies in an effort to show solidarity and support for children suffering from cancer, alopecia, or trichotillomania.

From the start, the women wanted to make it clear that they were not trying to strong-arm or shame the toy manufacturer into producing the bald doll. As Sypin said in January, “We’re not demanding that the company do anything… We’re just hoping somebody sees this and can help us make it happen.”

Does Eating Fast Food Lead To Depression?

Those who are regular consumers of fast food products are over 50% more likely to become clinically depressed than those who abstain from burgers, fries, pizza and other related foods, researchers from the University of Las Palmas de Gran Canaria (ULPGC) and the University of Granada have discovered.

Furthermore, according to lead author Almudena Sánchez-Villegas, the study, which has been published in the journal Public Health Nutrition, also demonstrated a dose-response relationship, which essentially means that the more fast food or commercial baked foods (doughnuts, croissants, etc.) a person eats, the higher the risk that they will become depressed as a result.

The research also discovered that subjects who ate the highest quantities of these types of foods have poor overall dietary habits (i.e. eating fewer servings of fruits and vegetables, fish, and nuts) and poor exercise habits, the Spanish Foundation for Science and Technology (FECYT) said in a press release on Friday.

They are also more likely to be single, the researchers discovered, according to FECYT.

As part of their study, Sánchez-Villegas and colleagues followed a sample of nearly 9,000 individuals affiliated with the SUN Project (University of Navarra Diet and Lifestyle Tracking Program).

None of them had ever been diagnosed with or treated for depression before the start of the study, and after an average of six months’ worth of assessment, nearly 500 of them had either received such a diagnosis or had started taking antidepressants.

“This new data supports the results of the SUN project in 2011, which were published in the PLoS One journal,” the FECYT media advisory said. “The project recorded 657 new cases of depression out of the 12,059 people analyzed over more than six months. A 42% increase in the risk associated with fast food was found, which is lower than that found in the current study.”

While Sánchez-Villegas admits that “more studies are necessary,” the researcher adds that “the intake of this type of food should be controlled because of its implications on both health (obesity, cardiovascular diseases) and mental well-being.”

However, in an interview with Dr. Alethea Turner of ABC News, Dr. David Katz, director of Yale University’s Prevention Research Center, suggested that the study may have the cause-and-effect relationship reversed.

“Higher intake of fast food may very well increase risks of depression by causing poor health in general. But depression may also increase fast food intake,” he said. “We use the term ‘comfort food’ for a reason. It can help alleviate stress, anxiety, and depression. So it may be that people with depression are turning to [fast food] for relief.”

Study Suggests Bacon Could Help Treat Arthritis

This story was originally published on April 1st, 2012 as part of an April Fool’s Day prank and promotion. It should in no way be considered as “real” news.

Researchers from a prominent American university believe that they have discovered an unusual (and very tasty) way to treat the symptoms of rheumatoid arthritis (RA) — bacon.

In studies conducted at Minnesota State University (MSU) in Minneapolis, lead author and biology professor Marty Lunde and colleagues found that those who consumed at least two strips’ worth of the popular cured, salted meat reported experiencing less joint pain and stiffness than RA sufferers who dined on a vegetarian breakfast.

According to an MSU press release, Lunde combined interviews with study participants as well as statistics obtained from the National Health Interview Survey (NHIS) to analyze both culinary choices for the first meal of each given day and each individual’s self-rating for pain/discomfort level in relation to his or her RA symptoms.

Lunde admits that he and his colleagues went into the study expecting to find that breakfasts light on cured meat and heavy on healthier items such as yogurt or fruit would be better for those suffering from arthritis. They were stunned to discover that the opposite was actually true: individuals who dined on heavier breakfasts, particularly those that included bacon, reported less discomfort on average.

“It definitely is a curious discovery, and one that none of us expected,” the MSU professor said in a statement. “It has long been accepted by the scientific community that cured meats like bacon and sausages were extremely poor choices when it comes to a healthy diet, but now we see that there may yet be some value in these types of foods — in moderation, of course.”

While the findings may be surprising, experts say they are not without medical basis.

“While medical experts have not been able to discern exactly what causes rheumatoid arthritis, we do know that it is an autoimmune disease which causes the body’s own natural defenses to attack healthy tissue in the joints,” Dean Simon, a general care physician and a member of the Lancaster International Endocrinology Society (LIES), told reporters on Friday.

Simon explained that the fat in bacon contains linoleic acid, which can help control inflammation in afflicted joints, and that the sodium content of the meat plays a role both in the digestion of protein and in muscle contraction. However, he advised that despite the findings of the MSU study, bacon may not be the best source of either fat or salt: its fat content is approximately 40% saturated, and it can contain upwards of 200mg of sodium per slice.

Previous studies have not been so kind to the popular breakfast meat. A 2007 Columbia University study discovered that those who ate cured meats, like bacon, 14 times per month or more had a higher risk of developing chronic obstructive pulmonary disease (COPD), according to BBC News reports.

Likewise, in a May 2010 study, researchers from the Harvard School of Public Health (HSPH) discovered that individuals who regularly consumed bacon or other processed meats like sausage had a 42% higher risk of contracting heart disease and a 19% higher risk of developing type 2 diabetes. Conversely, they discovered no increased risk of developing either condition among those who regularly ate unprocessed beef, pork, or lamb.

“When we looked at average nutrients in unprocessed red and processed meats eaten in the United States, we found that they contained similar average amounts of saturated fat and cholesterol. In contrast, processed meats contained, on average, 4 times more sodium and 50% more nitrate preservatives,” lead author Renata Micha, a research fellow in the department of epidemiology at HSPH, said in a statement.

“This suggests that differences in salt and preservatives, rather than fats, might explain the higher risk of heart disease and diabetes seen with processed meats, but not with unprocessed red meats,” Micha added.

Image Credit: Philip Stridh / Shutterstock

FDA Opts Not To Ban BPA In Food Packaging

The U.S. Food and Drug Administration (FDA) announced on Friday that it would not ban the use of a controversial chemical used in food packaging, various media outlets have reported.

According to Bloomberg‘s Jack Kaskey, the FDA rejected a request from environmental advocates asking the agency to prohibit the use of bisphenol A, also known as BPA, in cans and other forms of packaging. In its ruling, the FDA determined that opponents of BPA, which has been used in epoxy linings for the past 50 years to keep canned foods and beverages fresh longer, “didn’t provide enough data to support a rule change,” he added.

“The information provided in your petition was not sufficient to persuade FDA, at this time, to initiate rulemaking to prohibit the use of BPA in human food and food packaging,” Acting Associate FDA Commissioner David H. Horsey wrote in a letter to the Natural Resources Defense Council (NRDC), the New York-based advocacy group that filed the petition in 2008, according to Kaskey.

At least trace amounts of BPA, which Bloomberg said is created by combining phenol and acetone, are present in the systems of a reported 90% of all US citizens. The National Institutes of Health (NIH) report that the substance could have a negative impact on the brains and prostates of fetuses and young children, and Kaskey added that some scientists believe that BPA can cause adverse effects on the reproductive and nervous systems, especially in infants and small children.

“Scientists are still working to determine what effects BPA, which mimics estrogen in the body, has on human health once ingested,” added Bettina Boxall and Eryn Brown of the Los Angeles Times. “They know that… it has been shown to have negative effects in mice, including developmental and reproductive abnormalities, precancerous changes in the prostate and breasts, and other health problems. In epidemiological studies, researchers have reported correlations between BPA levels in people and higher risk of ailments including cardiovascular disease, diabetes and liver problems.”

Despite rejecting the NRDC’s petition, Boxall and Brown report that the agency has not completely ruled out regulating the substance in the near future. FDA spokesman Douglas Karas told the Times that the verdict was “not a final safety determination on BPA… There is a commitment to doing a thorough evaluation of the risk of BPA.”

The Associated Press (AP) said that the findings regarding BPA in animals cannot necessarily be applied to humans, and that the studies provided by the NRDC were “too small to be conclusive.”

“The FDA is out-of-step with scientific and medical research,” Dr. Sarah Janssen, the NRDC's senior scientist for public health, told the AP. “This illustrates the need for a major overhaul of how the government protects us against dangerous chemicals.”

Study Finds Conservative Trust In Science Declining

Conservative voters say that they have less confidence in the institution of science now than they did during the mid-1970s, claims a study published last Thursday in the American Sociological Review.

According to USNews.com reporter Jason Koebler, the study, which was conducted by University of North Carolina at Chapel Hill sociologist Gordon Gauchat, found that just 35% of conservatives said they had a “great deal of trust in science” in 2010. In comparison, 48% of conservatives said they trusted the discipline in 1974.

“That represents a dramatic shift for conservatives, who in 1974 were more likely than liberals or moderates (all categories based on self-identification) to express confidence in science,” Scott Jaschik of Inside Higher Ed wrote on March 29. “While the confidence levels of other groups in science have been relatively stable, the conservative drop now means that group is the least likely to have confidence in science.”

Jaschik said that Gauchat’s findings are “significant” for both scientists and educational institutions that are attempting to garner support for their research projects amongst the public. The lack of confidence in science among conservatives also spills over into the political realm, especially in terms of the right’s attitude towards issues like climate change policy decisions in the context of the upcoming Presidential election.

“Science has always been politicized,” Gauchat wrote, according to John Timmer of Ars Technica. “What remains unclear is how political orientations shape public trust in science.”

The UNC researcher used information from the General Social Survey to gauge the public’s opinions regarding science, and his research uncovered other interesting trends as well, Timmer said. For most of the period dating back to the 1970s, Gauchat discovered that moderates, not conservatives, actually had the lowest confidence level in the discipline, while liberals tended to be the most trusting of science over the 30-plus year time span.

“The levels of trust for both these groups were fairly steady across the 34 years of data,” the Ars Technica reporter said. “Conservatives were the odd one out. At the very start of the survey in 1974, they actually had the highest confidence in scientific institutions. By the 1980s, however, they had dropped so that they had significantly less trust than liberals did; in recent years, they’ve become the least trusting of science of any political affiliation.”

The reason for this, Gauchat told Koebler, is twofold: a rebellion by those with a conservative ideology against science, along with the media and higher education, and a shift in the priorities of science from space exploration and defense to regulatory matters such as climate change and environmental issues.

“The perception among conservatives is that they’re at a disadvantage, a minority. It’s not surprising that the conservative subculture would challenge what’s viewed as the dominant knowledge production groups in society — science and the media,” he told US News on Thursday.

Gauchat added that since the middle of the 20th century, “science has become autonomous from the government — it develops knowledge that helps regulate policy, and in the case of the EPA, it develops policy. Science is charged with what religion used to be charged with — answering questions about who we are and what we came from, what the world is about. We’re using it in American society to weigh in on political debates, and people are coming down on a specific side.”

Google’s April Fools Prank Features 8-bit Version Of Maps

Google got April Fool’s Day festivities off to an early start on Saturday, unveiling the new and improved 8-bit version of Google Maps as well as a promotional video announcing that the service would be coming to what Product Management Director Ken Tokusei calls “one of the most popular computer systems ever sold” — the Nintendo Entertainment System (NES).

To try out this new-and-improved old-school style version of the web application, look for the “Quest” feature on the right side of the screen. Once a user clicks it, he or she will see the traditional map ordinarily displayed magically transform into something out of a 1980s video game. It even comes complete with a little adventurer icon on the zoom-in/zoom-out bar and a classic gaming “d-pad” used to pan up, down, left or right.

Furthermore, Mashable’s Kate Freeman reported Saturday afternoon that Reddit users found some interesting little features hidden in the system. Among the Easter Eggs spotted by those users are an 8-bit rendition of the Parthenon in Greece and Area 51, complete with a UFO abducting a cow. Similar searches conducted by RedOrbit reporters returned a pixelated White House with two children in beanie hats nearby, a suitably tilted Leaning Tower of Pisa, the Kremlin, and other global landmarks.

Be careful before starting up this new version of Google Maps, though. According to David Murphy of PCMag.com, the company issued a firmly tongue-in-cheek warning that “Your system may not meet the requirements for 8-bit computations.”

Converting the online app is one thing, but the company took it one step further, “announcing” plans to release a Google Maps cartridge for the long-obsolete NES home video game console system.

In a March 31 blog entry Google Maps Software Engineer Tatsuo Nomura wrote, “In our pursuit of new digital frontiers, we realized that we may have left behind a large number of users who couldn’t access Google Maps on their classic hardware. Surprisingly, the Nintendo Entertainment System (NES) was unsupported, despite its tremendous popularity with over 60 million units sold worldwide.”

“Our engineering team in Japan understood the importance of maps on retro game systems. With the power of Google's immense data centers, and support from Nintendo and Square Enix, we were able to overcome the technical and design hurdles of developing 8-bit maps,” he added. “Today, we're excited to announce the result: a version of Google Maps for NES, with beautiful low-res graphics, simple and intuitive controls, and a timeless soundtrack.”

The NES (or as it is known in Japan and throughout Asia, the Family Computer or Famicom) was launched in Japan in 1983, North America in 1985, Europe in 1986 and Australia in 1987. More than 60 million units were sold worldwide during the system’s heyday, and the NES launched such hit franchises as Super Mario Bros., The Legend of Zelda, and Metroid, as well as Square Enix’s own Final Fantasy and Dragon Quest series. However, no mass-market software has been released for the system in years, and unless everyone is overestimating Google’s sense of humor, odds are that Google Maps won’t actually be changing that.

Just in case this should, by some wild and crazy twist, turn out to be real, CNET‘s Chris Matyszczyk reports that Google is also said to be planning a mobile version for the Game Boy.

Applesauce: All Things Apple

Evil-doers try to gather your bank information via iDevice, Tim Cook visits China, and a woman literally falls over for a chance to buy an iPad. All this and more in this week's edition of Applesauce!

Shake It Off
Apple stores are clean, brilliantly beautiful beacons of retail design. Steve Jobs made absolutely certain of this, choosing the stone for every store from a single quarry in Italy and using glass at every turn.
As it turns out, however, not everyone is a fan of clean lines and transparent glass. Birds, for instance, have a longstanding disagreement with glass. As they swoop through the sky, held aloft by feather and hollow bones, they make those earth-bound creatures green with envy over their flight.
And then they run into a clear pane of glass.
I can only imagine Evelyn Paswall will tell a similar story when she takes Apple to court, suing them for one million dollars.
As Paswall tried to enter the Long Island Apple store, she didn't realize the entire front facade of the store was glass and walked straight into the building, breaking her nose.
The 83-year-old Paswall is seeking $75,000 in medical damages, plus a $25,000 slap on the wrist, just for having a building made of glass.
No self-respecting person wants to make fun of an injured elderly woman. There are a few things I wonder about, however…
Did she expect the entire front of the building to be wide open? As in, no doors, no walls, no protection from the elements or ne'er-do-wells?
Were there no promotional materials in the front of the store advertising the iPhone 4S, MacBook Air, or iPad? (Every store I've ever been to in Texas has these sorts of displays up front…)
Was this her very first visit to the Apple store? And why was she flying solo, without friend or family to share the moment with?
In response to this matter, Apple has installed white “warning strips” on the front glass walls to make it obvious there are giant panes of glass at the front of the store.
We sincerely wish Ms. Paswall all the best, health, and happiness. The question now is: will she be buying a new iPad with her newfound fortune?
Someone Else's Song
Speaking of ne'er-do-wells: is no Apple device safe from wrongdoing miscreants and their internet hijinks?
English security firm Major Security discovered a bug in the mobile version of Safari, Apple's web browser, that makes the URL bar look as if you are visiting an actual, legitimate webpage when, in actuality, you are visiting a malicious site, potentially giving away your bank information, health records, and a detailed list of every Google search you've ever committed.
Tricks like this aren't entirely new… would-be internet wrong-doers can send you a link via email, Facebook, et al. The link looks like an actual website, but when you arrive at the page, the URL isn't “FreeBrownies.com” like you thought it would be. Instead, it may say something like “IKnowWhatYouGoogledLastSummer.com”.
The spoofing flaw found by Major Security goes one step further. The link you click will take you to “FreeBrownies.com,” and the page will even look like “FreeBrownies.com,” but really you will be visiting a version of IKWYGLS.com. There, the thieves can steal your information, leaving you alone and brownie-less.
Major Security alerted Apple to this flaw the same day it alerted the public, so be on the lookout for an iOS update in the near future to address this issue.
Speaking of issues, here's one concerning network connectivity…
At Least That´s What You Said
It's only 4G if you have 4G.
Apple's new iPad has been released to millions and millions of people all over the world. These proud new customers are able to shoot high-definition pictures and videos, edit said pictures on their crisp and clear high-resolution Retina display, and download videos wirelessly at lightning-fast 4G speeds.
Well, users in North America, at least.
Sad news for those down under, as it turns out: their 4G provider, Telstra, doesn't operate its 4G network on an iPad-compatible frequency. American telecommunications companies Verizon and AT&T operate their 4G networks in the upper and lower ends of the 700 MHz spectrum, respectively. Telstra, however, operates at 1,800 MHz. The iPad, it seems, only recognizes the 700 MHz and 2,100 MHz bands, leaving Australia down under.
(That one hurt me as much as it hurt you)
As hordes of new iPad-owning Aussies tried to connect to Telstra's 4G network, they were let down. Now the Australian Competition and Consumer Commission (ACCC) wants to take Apple to court over the matter, saying the company misled customers with false advertising. Not only is the ACCC asking for refunds for those customers left out of the 4G fun, it is also seeking to have Apple remove the word “4G” from its promotional material and branding, and to have informative stickers placed on iPad boxes explaining why 4G doesn't mean 4G in Australia.
Counsel for Apple said the company would give out refunds, but did not expect many customers to request one.
Apple's counsel has also said the company is not prepared to remove any promotional “4G” branding or place stickers on its boxes.
Apple has since added a few qualifications to their website, saying, “4G LTE is supported only on AT&T and Verizon networks in the US and on Bell, Rogers, and Telus networks in Canada. See your carrier for details.”
Less Than You Think
Apple devices are always slim, relatively speaking. Often these devices are more than slim; they are deceptively slim. Think about what takes up room inside your iPhone, iPad, or better yet, MacBook Air. There is a lot that needs to be stuffed inside that sexy, sleek aluminum case. The battery needs to be large enough to last anywhere from 8 hours to an entire day, and there needs to be enough space for the memory and chipsets, network antennas, Bluetooth hardware, cameras and the whole lot. Somehow, as if by some sort of English magic, Jony Ive and his team have managed to make these components fit with a zen-like existence. Never content with themselves, however, Apple is always looking for ways to rid themselves of cruft, extra holes, extra weight, or extra buttons.
Enter, then, the patent war Apple is currently waging with the Finns at Nokia.
Apple has proposed a new standard for the next generation of SIM card to be used universally. Dubbed the “Nano-SIM,” Apple's proposed prototype doesn't need to be embedded in a piece of plastic like the SIM cards of today. Nokia's not happy about this, arguing its own proposed standard is actually smaller than Apple's.
(That's right: Two companies are essentially arguing about whose is smaller.)
As both companies fight to convince the European Telecommunications Standards Institute (or ETSI, for those in the know), Apple has said it will offer the technology to carriers for free, not asking for a cent in licensing fees. This displeased Nokia, which said that if this were to happen, it would withhold its applicable patents from other companies.
But why would Apple propose a chip that doesn’t need to be embedded on a piece of plastic?
As it turns out, Apple's proposed Nano-SIM can be embedded directly into the guts of the phone, without ever having to be removed.
Just let that one sink in.
A running tally of every Apple device within 10 feet of me reveals a pattern… each device is closed up nice and tight, without (easy) access to the innards.
Could Apple be preparing to finally circumvent the carriers, offering the iPhone without the need for carrier intervention? Perhaps they want to sell the iPhone directly through their own retail chain, without the need for third-party vendors.
Apple likes clean lines and closed boxes, so it doesn't seem too far out of the question…
Company in My Back
Your favorite CEO took time out from counting his money to take a brief trip to China.
That line alone was enough to send the media into a frenzy.
Apple didn't say exactly why Cook was in China, saying only that he was visiting with “government officials.”
While he was there, he took a tour of the now-infamous Foxconn factories, where many Apple products are made. You remember Foxconn: hotbed of controversy, made famous by headlines of suicidal workers and one loud-mouthed monologuist. Apple has been under some scrutiny lately where working conditions are concerned.
So let us don our guessing caps once more.
Was Mr. Cook there to ensure Apple's policies on underage workers and reasonable hours were being enforced? Was he there simply for a photo-op or two and some good press? Or perhaps (in adherence with the rule of threes) was he there to have his pick of iPads straight off the line, as if they were proverbial doughnuts from Krispy Kreme®?
We do know that while Cook was in China, he also stopped by one of the Apple stores in the area to take a look around and be available for another photo-op or two.
All joking aside, this is a great PR move by Apple. Indicative of a new era at the company, they aren't letting this scrutiny go unaddressed. Tim Cook's presence shows they care about their reputation and, most importantly, the people hand-crafting their products.
Will Apple buy back all its stock and go entirely private? Will they announce that every iPhone, iPad, and Mac will be made in the USofA going forward? Will they steal Apollo 11's F-1 engines from Jeff Bezos in a cruel and corporate game of keep-away?
If any of these things happen, you can be sure to read about it here on RedOrbit.com!

Study: Birthing Mothers Spending More Time In Labor

Researchers with the U.S. National Institutes of Health (NIH) have discovered that women spend more time in labor today than they did five decades ago, various media outlets reported on Friday.

According to Amy Norton of Reuters, the NIH study found that American females spend approximately two to three hours longer, on average, in labor compared to 1960. Most of that extra time is spent in the first stage of labor, or the time during which the cervix opens until it is wide enough to allow the mother to begin pushing.

The results, which will be published in a future edition of the American Journal of Obstetrics and Gynecology (AJOG), were consistent regardless of weight, age or ethnicity, though there was a tendency for modern mothers to be older and heavier than those who gave birth 50 years ago, said MyHealthNewsDaily Staff Writer Rachael Rettner.

“Older maternal age and increased BMI (body-mass index, a ratio of weight to height) accounted for a part of the increase. We believe that some aspects of delivery-room practice are also responsible for this increase,” lead author Dr. Katherine Laughon, an epidemiologist at the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), told reporters, including HealthDay’s Steven Reinberg, during a conference call on Friday afternoon.

According to both Reinberg and Rettner, Laughon and her colleagues analyzed the birth data of nearly 40,000 women who had children between 1959 and 1966, as well as more than 98,000 women who delivered from 2002 through 2008. They discovered that 21st century women spent 2.6 hours longer in labor during their first birth and 2.0 hours longer during each subsequent birth, and that they were more likely to take epidurals and four times more likely to have a Cesarean delivery than their predecessors from the 1950s and 1960s.

“Women are in labor longer [today] because they are admitted [to the hospital] earlier,” Dr. Michael Cabbad, the chairman of obstetrics/gynecology as well as the chief of maternal/fetal medicine at The Brooklyn Hospital Center (TBH) in New York, told HealthDay on Friday. “There is a tendency for women to come to the hospital in an earlier phase of labor because of fear of arriving too late.”

The researchers report that further study is required to pinpoint all the factors responsible for this phenomenon. As Laughon said, “We weren’t able to fully address the potential reasons with this study.”


NASA Claims Ownership Of Sunken Apollo Engines

On Friday, NASA officials claimed that they were the rightful owners of the Apollo 11 rocket engines found on the floor of the Atlantic Ocean earlier this week by a privately-funded expedition headed up by Amazon.com founder Jeff Bezos, CNN.com is reporting.

As previously reported here on RedOrbit, Bezos announced that his team, using deep-sea sonar equipment, had discovered the spacecraft’s F-1 engines, which helped carry Neil Armstrong and Buzz Aldrin to the moon. The engines, which powered the Saturn V rocket that carried the Apollo 11 astronauts to the lunar surface, were located approximately 14,000 feet below the surface.

Bezos said that he and his crew were “making plans to attempt to raise one or more of them from the ocean floor,” to which NASA Administrator Charles Bolden responded in a statement, congratulating the billionaire and his associates on their “historic” find and wishing them “all the luck in the world.”

However, according to CNN, Bolden also added, “NASA does retain ownership of any artifacts recovered and would likely offer one of the Saturn V F-1 engines to the Smithsonian Institution’s National Air and Space Museum  in Washington under longstanding arrangements with the institution.”

Bezos has reportedly requested that NASA allow one of the engines or a similar space artifact to be put on display in the Seattle Museum of Flight. The CNN wire reports say that it is possible an engine could wind up at that location if the Smithsonian “declines” to take one of the F-1 engines or if a second one is recovered.

The Apollo 11 mission, arguably the most famous of all the Apollo missions, took place in July 1969. The Saturn V rocket that propelled the vehicle skyward was powered by a quintet of F-1 engines, generating a combined 7.5 million pounds of thrust before peeling away from the rocket after a mere two minutes of use.

In a blog post detailing the efforts of retrieving the five engines, Bezos explained his fascination with the Apollo mission, saying, “Millions of people were inspired by the Apollo Program. I was five years old when I watched Apollo 11 unfold on television, and without any doubt it was a big contributor to my passions for science, engineering, and exploration. A year or so ago, I started to wonder, with the right team of undersea pros, could we find and potentially recover the F-1 engines that started mankind's mission to the moon?”

He admitted that he and his crew “don’t know yet” what kind of condition the nine-ton, 32-million horsepower engines could be in — “they hit the ocean at high velocity and have been in salt water for more than 40 years. On the other hand, they're made of tough stuff, so we'll see.”

Robert Pearlman of CollectSpace.com, an online publication and community for space history enthusiasts, told CNN.com that raising the engines could pose a challenge, particularly if “all five of [them] are still clumped together.” In that situation, he says, it would be “like trying to bring up the big part of the Titanic.”

Prior to launching the popular online retailer Amazon.com, Bezos worked on Wall Street, though he has long been a space enthusiast. After finding financial success with Amazon, Bezos started a company called Blue Origin, one of several companies racing to bring private enterprise to space travel, “so that many people can afford to go and so that we humans can better continue exploring the solar system.”


Risk Of Depression Heightened By Sleep Disorder

A new study by researchers at the U.S. Centers for Disease Control and Prevention (CDC) found that people suffering from a common sleep disorder may also face an increased risk of depression, reports Health.com.

In the new research, men diagnosed with sleep apnea were found to be more than twice as likely as other men to show signs of clinical depression. And researchers saw an even bigger risk when looking at the link in women with sleep apnea: a fivefold increase in depression risk.

And this is only scratching the surface. CDC researchers believe that more than 80 percent of the people suffering from some form of sleep apnea go undiagnosed, passing off symptoms, which tend to include snorting or gasping for air during sleep, as normal sleeping habits. This group of people had a threefold risk of depression over those who had no trouble sleeping at night.

Carl Boethel, MD, a sleep specialist at Texas A&M Health Science Center College of Medicine, who was not involved in the research, said sleep apnea is way underdiagnosed. “Physicians in the sleep community and in the psychiatric community need to do a better job of screening and getting effective treatment,” he remarked.

Sleep apnea is a dangerous sleep disorder that, if left untreated, can lead to other serious health problems such as diabetes, stroke, high blood pressure and congestive heart failure, as well as depression and anxiety. The causes of sleep apnea are attributed to oversized tonsils, airway structure, and excess fat around the windpipe, often due to overall obesity.

Because sleep apnea is often associated with obesity, the CDC scientists took into account body mass index in their analysis.

The study, appearing in the April issue of the journal SLEEP, is the first of its kind to look at a representative cross-section of the U.S. population. Data was taken from the National Health and Nutrition Examination Survey (NHANES), an annual survey conducted by the CDC.

Researchers found that 6 percent of men and 3 percent of women had received a sleep apnea diagnosis. They further found that 7 percent of men and 4 percent of women who had not received a diagnosis for sleep apnea had also reported breathing problems on at least 5 nights per week.

The team assessed depression using a standard questionnaire that asked participants how often in the past two weeks they had felt “little interest or pleasure in doing things” or had feelings of depression or hopelessness. The researchers found that five percent of men and eight percent of women had scores indicating “probable” depression.

Michael Weissberg, MD, co-director of the insomnia and sleep disorders clinic at the University of Colorado School of Medicine, in Denver, said one of the most complicating factors of the effects of sleep apnea and depression is the fact that it can be difficult to distinguish between them.

“There probably is an important connection between depression and sleep apnea, but it’s hard to sort out who has what,” Weissberg told Health.com. “Sleep disruption, particularly insomnia, can be a risk factor for developing depression, and a lot of symptoms of people who have sleep apnea — they feel lousy, they can’t think straight — are similar to symptoms people have in depression.”

The CDC said their study only offered an association and not cause and effect. They said they could not rule out the possibility that some other unknown factor may contribute to both sleep apnea and depression. However, they said it is plausible to think that sleep apnea could directly cause depression.

Lead author on the work, Anne Wheaton, PhD, an epidemiologist at the CDC, said previous research has shown a link between sleep apnea and mood swings. And the momentary drop in oxygen levels a person gets during an apnea could lead to changes in the brain by triggering stress or inflammation.

“Interrupted sleep may be associated with problems as far as what’s going on in the brain,” Wheaton says. “You need that steady sleep.”

One factor the CDC researchers ruled out affecting depression risk was snoring not attributed to sleep apnea. More than a third of men and 1 in 5 women who reported snoring at least five nights per week were no more likely to be depressed than those who never snored.


‘Walmart Of Weed’ Setting Up Shop In D.C.

A company known as the “Walmart of weed” is setting up shop in Washington, D.C., just a few miles from the White House and federal buildings.

The weGrow store will open in Washington on Friday, coinciding with the first concrete step in implementing the city law that allows D.C. residents to purchase marijuana for medical reasons.

WeGrow said it provides the necessary tools to pioneers of a “green rush,” a movement it considers similar to the gold rush, in which nearly $9 billion could be made in the medical marijuana business over the next five years.

Sixteen states have decided to legalize marijuana for medical use to treat a variety of health problems, from anxiety and back pain to HIV/AIDS and cancer.

About 7 percent of Americans, or 17.4 million people, said they used marijuana in 2010, which is up from 5.8 percent in 2007, according to the Substance Abuse and Mental Health Services Administration.

A Gallup poll last year found that 50 percent of Americans say that marijuana should be made legal, and 70 percent support medical uses for pot.

WeGrow does not sell pot or seeds to grow it, but it instead provides products and services to help cultivators grow their own plants for personal use or for sale at dispensaries.

The first weGrow store opened in Sacramento last year under founder Dhar Mann. He said he started the store after he was kicked out of a mom-and-pop hydroponics store in California just for mentioning marijuana.

Since then, the company has opened locations in Phoenix, San Jose, and Flagstaff. It has also expanded into New Jersey, Delaware, and Pennsylvania.

WeGrow said it has plans to expand its business into Oregon, Washington State and Michigan as well.

On Friday, D.C. officials will announce those who are eligible to apply for permits to grow and sell medical marijuana to dispensaries under the district’s 2010 law. Applicants must sign a statement saying they understand a license does not authorize them to break federal law.

“They do so at their own peril because I can’t imagine that the federal government is going to allow marijuana selling for any purpose right in their backyard,” Kevin Sabet, a former senior adviser to the president’s drug czar and an assistant professor in the College of Medicine at the University of Florida, told the Houston Chronicle.

“Whether it’s D.C. or all the way out in California, the government’s been pretty clear that medical marijuana doesn’t pass the giggle test.”


When Dark Energy Turned On

The Sloan Digital Sky Survey (SDSS-III) announced the most accurate measurements yet of the distances to galaxies in the faraway universe, giving an unprecedented look at the time when the universe first began to expand at an ever-increasing rate. Scientists from the University of Portsmouth and the Max-Planck Institute for Extraterrestrial Physics will present the new results in a press conference at 10:00 BST on Friday, March 30th at the National Astronomy Meeting in Manchester.

The results are available in six related papers posted to the arXiv preprint server and are the culmination of more than two years of work by the team of scientists and engineers behind the Baryon Oscillation Spectroscopic Survey (BOSS), one of the SDSS-III’s four component surveys.

“There’s been a lot of talk about using galaxy maps to find out what’s causing accelerating expansion,” says David Schlegel of the U.S. Department of Energy’s Lawrence Berkeley National Laboratory, the principal investigator of BOSS. “We’ve been making a map, and now we’re using it – starting to push our knowledge out to the distances when dark energy turned on.”

“The result is phenomenal,” says Will Percival, a professor at the University of Portsmouth in the United Kingdom, and one of the leaders of the analysis team. “We have only one-third of the data that BOSS will deliver, and that has already allowed us to measure how fast the Universe was expanding six billion years ago – to an accuracy of two percent.”

One of the most amazing discoveries of the last two decades in astronomy, recognized with the 2011 Nobel Prize in Physics, was that not only is our universe expanding, but that this expansion is accelerating – galaxies are not just moving farther apart from each other, they are moving apart faster and faster.

What could be the cause of this accelerating expansion? The leading contender is a strange property of space dubbed “dark energy.” Another explanation, considered possible but less likely, is that at very large distances the force of gravity deviates from Einstein’s General Theory of Relativity and becomes repulsive.

Whether the answer to the puzzle of the accelerating universe is dark energy or modified gravity, the first step to finding that answer is to measure accurate distances to as many galaxies as possible. From those measurements, astronomers can trace out the history of the universe’s expansion.

BOSS is producing the most detailed map of the universe ever made, using a new custom-designed spectrograph on the SDSS 2.5-meter telescope at Apache Point Observatory in New Mexico. With this telescope and its new spectrograph, BOSS will measure spectra of more than a million galaxies over six years. The maps analyzed in today’s papers are based on data from the first year and a half of observations, and contain more than 250,000 galaxies. Some of these galaxies are so distant that their light has travelled more than six billion years to reach Earth – nearly half the age of the universe.

Maps of the universe like BOSS’s show that galaxies and clusters of galaxies are clumped together into walls and filaments, with giant voids in between them. These structures grew out of subtle variations in density in the early universe, which bore the imprint of “baryon acoustic oscillations” – pressure-driven (acoustic) waves that passed through the early universe.

Billions of years later, the record of these waves can still be read in our universe. “Because of the regularity of those ancient waves, there’s a slightly increased probability that any two galaxies today will be separated by about 500 million light-years, rather than 400 million or 600 million,” says Daniel Eisenstein of the Harvard-Smithsonian Center for Astrophysics, director of SDSS-III and a pioneer in baryon oscillation surveys for nearly a decade. In a graph of the number of galaxy pairs by separation distance, that magic number of 500 million light years shows up as a peak, so astronomers often speak of the “peak separation” between galaxies.  The distance that corresponds to this peak depends on the amount of dark energy in the Universe. But measuring the peak separation between galaxies depends critically on having the right distances to the galaxies in the first place.
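The pair-counting logic Eisenstein describes – a histogram of galaxy pairs by separation, with the acoustic scale showing up as a peak – can be sketched in a few lines. This is a toy one-dimensional illustration only, not the BOSS analysis; every number in it is invented:

```python
import itertools
import random

random.seed(42)

# Toy 1-D "universe": some galaxies come in correlated pairs with a
# preferred separation of ~500 units (standing in for ~500 million
# light-years), on top of a uniform background. All numbers invented.
PREFERRED = 500
galaxies = []
for _ in range(200):
    x = random.uniform(0, 100_000)
    galaxies.append(x)
    galaxies.append(x + random.gauss(PREFERRED, 15))  # correlated partner
galaxies += [random.uniform(0, 100_000) for _ in range(400)]  # background

# Histogram all pair separations below 1000 into 100-unit bins.
counts = {}
for a, b in itertools.combinations(galaxies, 2):
    sep = abs(a - b)
    if sep < 1000:
        bin_index = int(sep // 100)
        counts[bin_index] = counts.get(bin_index, 0) + 1

# The bin with the most pairs sits at the built-in preferred separation.
peak = max(counts, key=counts.get)
print(f"excess of pairs at separations {peak * 100}-{peak * 100 + 100} units")
```

The background contributes roughly equal counts to every bin; the correlated pairs add a bump near the preferred separation, which is the 1-D analogue of the "peak separation" BOSS measures.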

That’s where BOSS comes in. “We’ve detected the peak separation more clearly than ever before,” says Nikhil Padmanabhan of Yale University, who along with Percival co-chairs the BOSS team’s galaxy clustering group. “These measurements allow us to determine the contents of the Universe with unprecedented accuracy.”

In addition to providing highly accurate distance measurements, the BOSS data also enable a stringent new test of General Relativity, explains Beth Reid, a NASA Hubble Fellow at Lawrence Berkeley National Laboratory. “Since gravity attracts, galaxies at the edges of galaxy clusters fall in toward the centres of the clusters,” Reid says. “General Relativity predicts just how fast they should be falling. If our understanding of General Relativity is incomplete, we should be able to tell from the shapes we see in BOSS’s maps near known galaxy clusters.”

Reid led the analysis of these “redshift space distortions” in BOSS. After accounting for the effects of dark energy, Reid’s team found that the rate at which galaxies fall into clusters is consistent with Einstein’s predictions. “We already knew that the predictions of General Relativity are extremely accurate for distances within the solar system,” says Reid, “and now we can say that they are accurate for distances of 100 million light-years. We’re looking a billion times further away than Einstein looked when he tested his theory, but it still seems to work.”

What’s amazing about these results – six papers covering the measurements of cosmic distance and the role of gravity in galaxy clustering – is that they all come together to tell the same story. “All the different lines of evidence point to the same explanation,” says Ariel Sanchez, a research scientist at the Max Planck Institute for Extraterrestrial Physics in Garching, Germany, and lead author on one of the papers. “Ordinary matter is only a few percent of the universe. The largest component of the universe is dark energy – an irreducible energy associated with space itself that is causing the expansion of the universe to accelerate.”

But this is just the beginning, says BOSS principal investigator Schlegel. “For the past 13 years, we’ve had a simple model of how dark energy works. But the truth is, we only have a little bit of data, and we’re just beginning to explore the times when dark energy turned on. If there are surprises lurking out there, we expect to find them.”

Image Caption: The record of baryon acoustic oscillations (white rings) in galaxy maps helps astronomers retrace the history of the expanding universe. (Credit: E. M. Huff, the SDSS-III team, and the South Pole Telescope team. Graphic by Zosia Rostomian.)


DNA Traces Cattle Back To Ancient Times

A new genetic study confirms that modern domesticated cattle are descended from a herd of around 80 wild oxen that were domesticated in the Near East over 10,500 years ago.

Scientists from CNRS, the National Museum of Natural History in France, the University of Mainz in Germany, and University College London (UCL) in the UK performed the study by extracting DNA from the bones of domestic cattle excavated at archaeological sites in Iran. These sites are believed to date back to the invention of farming, in the area where cattle were first domesticated.

The international team of scientists found only small differences in the DNA from the Iranian excavation and modern day domestic cattle. What little difference there is, say the scientists, could come from different population histories. By analyzing the DNA with computer simulations, they believe the differences in DNA could only have arisen if a small number of animals were domesticated from the wild ox. They believe as few as 80 wild oxen are responsible for what we know now as modern, domestic cattle.

Results of this study are published in the current issue of the journal Molecular Biology and Evolution.

In a press release announcing the results, Dr Ruth Bollongino of CNRS, France, and the University of Mainz, Germany; lead author of the study, said, “Getting reliable DNA sequences from remains found in cold environments is routine. That is why mammoths were one of the first extinct species to have their DNA read. But getting reliable DNA from bones found in hot regions is much more difficult because temperature is so critical for DNA survival. This meant we had to be extremely careful that we did not end up reading contaminating DNA sequences from living, or only recently dead cattle.”

According to the scientists, the number of animals originally captured and bred is an important figure for the archaeological study of domestication.

Prof Mark Thomas, geneticist and an author of the study based at the UCL Research Department of Genetics, Evolution and Environment, said: “This is a surprisingly small number of cattle. We know from archaeological remains that the wild ancestors of modern-day cattle, known as aurochs, were common throughout Asia and Europe, so there would have been plenty of opportunities to capture and domesticate them.”

Based at the University of Mainz, Germany, Professor Joachim Burger had this to say about the wild ox: “Wild aurochs are very different beasts from modern domestic cattle. They were much bigger than modern cattle, and wouldn’t have had the domestic traits we see today, such as docility. So capturing these animals in the first place would not have been easy, and even if some people did manage to snare them alive, their continued management and breeding would still have presented considerable challenges until they had been bred for smaller size and more docile behavior.”

Archaeological studies have shown that cattle were not the only animals domesticated at the dawn of farming: goats, sheep, and pigs were also domesticated in the Near East around that time. It is much harder, however, to determine how many individuals of these species were domesticated. While traditional archaeological techniques do not provide a complete picture of how many animals were involved, genetic research helps to fill in these gaps.

Dr Marjan Mashkour, a CNRS archaeologist working in the Middle East, added: “This study highlights how important it can be to consider archaeological remains from less well-studied regions, such as Iran. Without our Iranian data it would have been very difficult to draw our conclusions, even though they concern cattle at a global scale.”

Man Recovering Well After Full-face Transplant

Brett Smith for Redorbit.com

In what can be described as a scenario straight out of a Hollywood movie, hospital officials report that Richard Lee Norris, of Hillsville, Va., is recovering well after a 36-hour full-face transplant surgery that not only gave him a new appearance, but also functioning teeth, tongue, and a new set of jaws.

Eduardo Rodriguez, the lead surgeon for the operation, said he hoped the transplant would give Norris his life back after living for years as a recluse. Norris was shot in the face in 1997 and has had multiple life-saving and reconstructive surgeries since then. While these surgeries allowed him to continue living, his appearance prevented him from truly returning to society or holding a job. Before the surgery, Norris wore a prosthetic nose and a mask whenever he left his parents’ house, where he currently resides.

The transplanted face came from an anonymous donor, whose family had been consulted specifically about donating the face along with several other life-saving organs. Rodriguez said Norris’s new face did not resemble the donor’s and is “a combination of two individuals, a true blend.”

Over 150 doctors, nurses, and medical staffers were involved in the groundbreaking operation. The procedure, which used innovative surgical and computerized techniques, was designed to give Norris a more socially acceptable face with complete functionality. Before the surgery, Norris had no nose and only part of his tongue. He was able to taste food but could not smell it.

“He could not smell for the past 15 years, and that was the most dramatic thing: immediately, on day three, he could finally smell,” Rodriguez said according to The Guardian.

There have been several concerns with the procedure in the past both ethical and medical. These concerns primarily revolve around the possibility of the transplant failing. Ethicists critical of the procedure say improperly functioning facial muscles could result in a worse situation for the patient. Experts, like Iain Hutchison, of Barts and the London Hospital, who criticize the procedure based on medical grounds, say blood vessels in the donated tissue could clot, immunosuppressants could fail, or the procedure might increase the patient’s risk of cancer.

Researchers at the University of Maryland found that transplants involving a large amount of bone marrow with its own blood supply saw fewer or no rejections of transplanted tissue, Norris’s surgeon Rolf Barth said. Norris will have to take immunosuppressant drugs for the rest of his life, but the jawbone transplant could mean a lower risk of rejection, according to Barth.

The surgery was also enabled by research funded by the Department of Defense’s Office of Naval Research. In an attempt to improve the quality of reconstructive surgery for injured veterans, the department has invested in both hand and facial surgery research. The government estimates that 200 troops might be eligible for face transplants.

This was the 23rd face transplant since doctors began performing the procedure seven years ago. The first partial face transplant was performed in France in 2005 on a woman who had been mauled by her Labrador. During that operation, teams of doctors grafted a nose, lips, and a chin onto the face of the 38-year-old mother of two.

Engineers Working Towards Laser Deflection Of Asteroids

Protecting the Earth from an extinction-threatening asteroid has in the past been left up to Hollywood, which has deployed a barrage of special-effects box office blockbusters that have kept deadly space rocks from smashing our planet into oblivion. But now, Scottish engineers have come up with a plausible way of dealing with asteroids that doesn’t involve Hollywood special effects.

Engineers at the University of Strathclyde in Glasgow are developing an innovative laser-based technique that could change asteroid deflection technology as we know it. The scientists envision a swarm of formation-flying satellite drones equipped with powerful lasers that would work together to blow asteroids out of this world.

Until now, researchers have mainly focused on huge satellites carrying massive weapons that could potentially save planet Earth from an asteroid impact.

But the Strathclyde researchers believe the answer lies in smaller ‘fighter’ satellites working together to destroy a huge predator from space.

The researchers, led by Dr. Massimiliano Vasile of Strathclyde’s Department of Mechanical and Aerospace Engineering, hope to prove that their technology is both safe and effective. They said the use of high-power lasers in space for civil and commercial applications has so far been limited, and that there are challenges in trying to achieve high power, high efficiency, and high beam quality all at the same time.

“The additional problem with asteroid deflection is that when the laser begins to break down the surface of the object, the plume of gas and debris impinges on the spacecraft and contaminates the laser,” added Vasile. “However, our laboratory tests have proven that the level of contamination is less than expected and the laser could continue to function for longer than anticipated.”

Asteroid impacts aren’t a new threat to Earth. Impacts have occurred rather frequently in the planet’s past, and occasionally large, life-threatening asteroids have caused worldwide devastation, such as the one that wiped out the dinosaurs 65 million years ago.

A more recent event occurred just over a century ago. Although nowhere near as devastating or life-threatening, the Tunguska meteorite explosion over Siberia, which devastated a 1,250-mile area, showed just how much damage can be caused. The Tunguska meteorite was believed to be around 100 to 150 feet in diameter.

“The Tunguska class of events are expected to occur within a period of a few centuries. Smaller asteroids collide with Earth more frequently and generally burn in the atmosphere although some of them reach the ground or explode at low altitude potentially causing damage to buildings and people,” said Vasile.

Vasile said their technology is much more feasible than larger satellites carrying massive weapons.

The system is scalable, so more satellites could be added if a threat is larger, and it would continue to carry out its task even if one satellite were to fail.

Vasile said the same technique could be used for clearing space debris from orbit, reducing congestion, as the amount of debris littering the planet’s orbit is ever-increasing.

“The amount of debris in orbit is such that we might experience a so-called Kessler syndrome — this is when the density becomes so high that collisions between objects could cause an exponentially increasing cascade of other collisions,” added Vasile.

“While there is significant monitoring in place to keep track of these objects, there is no specific system in place to remove them and our research could be a possible solution,” he said. “A major advantage of using our technique is that the laser does not have to be fired from the ground.”

“Obviously there are severe restrictions with that process, as it has to travel through the atmosphere, has a constrained range of action and can hit the debris only for short arcs,” said Vasile.

The research was carried out in collaboration with the University of Strathclyde’s Institute of Photonics and was presented to the Planetary Society at the end of February.

Image Courtesy Gl0ck / Shutterstock

Starbucks Uses Bugs To Color Frappuccino

A statement released by Starbucks to the Daily Mail yesterday revealed a shocking ingredient in its iconic Frappuccino beverages: ground-up bugs.

Cochineal extract is a popular food additive, created by crushing up insects, that is most often used to lend foods a red color.

Starbucks admitted to using cochineal extract in its famous Strawberries and Cream Frappuccino; the extract is found in the strawberry puree used to make the beverage. Starbucks says the extract helps bring out the bright pink color of the drink while adhering to its goal of using only all-natural ingredients.

As disgusting as it may sound, bugs count as “all-natural.”

In June 2009, Starbucks announced a new initiative to “improve the taste of its food and focus on real, wholesome ingredients.” Part of this initiative included removing artificial flavors, dyes, and high fructose corn syrup. The strawberry syrup in question was changed when this new all-natural initiative went into place. Previously made with high fructose corn syrup and other dyes and sweeteners, the old strawberry syrup had a darker red color. The new syrup, now called a “puree,” contains no artificial sweeteners or dyes; as such, its color changed to a light pink as opposed to a dark red. Starbucks recently admitted to using cochineal to bring out the desired color in its Frappuccino. In a press release from 2009 announcing the changes, Starbucks registered dietitian Katie Thomson said, “At Starbucks, our salads and smoothies were designed to deliver real nutrition and great flavors at a sensible calorie level.”

Though cochineal is derived from bug parts, it actually contains no bugs. The additive is the chemical extract of carminic acid from the bodies of crushed female “scale” insects, which use the carminic acid to repel ants and other predators. As the insects are crushed to extract the carminic acid, the bug parts are strained out, leaving behind a bright red extract.

Starbucks isn’t the first to use the buggy extract to color food products. If you begin to examine the labels on your favorite foods, chances are you’ll find it with some regularity. Cochineal is also known as carmine, carminic acid, natural red 4, or E120. It is used in all sorts of foods, such as sausage, yogurts, juices, and artificial crab, and even in cosmetics like lipstick and blush. Cochineal is often found in anything claiming to be “strawberry” flavored and is even quite popular in February as Valentine’s Day rolls around. In addition to making all sorts of foods bright, rosy red, it is also what gives devil’s food cake its devilish red color.

Those worried about whether cochineal or carminic acid is safe have little to worry about. While it’s never a bad idea to check the labels, only a handful of people have reported an allergy to the bug extract. The FDA had this to say about the effects of cochineal and labeling: “we identified three adverse events over an approximately 10-year period that involved products containing carmine or cochineal extract in which those color additives did not or probably did not appear on the ingredient list… We applied a reporting rate of 1 percent to this figure to obtain our estimate of 31 adverse events per year.”

As to whether the Frappuccino is vegan, Starbucks told GlobalPost, “we have the goal to minimize artificial ingredients in our products. While the strawberry base isn’t a vegan product, it helps us move away from artificial dyes.”
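The FDA estimate quoted above is simple arithmetic: roughly three reported events over about ten years, scaled up by an assumed reporting rate of 1 percent (i.e., only one in a hundred actual reactions gets reported). The exact period the FDA used yields 31 rather than a round 30:

```python
# Reproducing the FDA's back-of-the-envelope estimate from the quote:
# ~3 adverse events reported over ~10 years, scaled by an assumed
# reporting rate of 1%.
reported_events = 3
period_years = 10
reporting_rate = 0.01  # fraction of actual reactions that get reported

estimated_events_per_year = reported_events / period_years / reporting_rate
print(estimated_events_per_year)
```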

Image Caption: A cluster of female scale insects. Credit: Frank Vincentz/Wikipedia (CC BY-SA 3.0)

Researchers Discover A New Path For Light Through Metal

Novel plasmonic material may merge photonic and electronic technologies

Helping bridge the gap between photonics and electronics, researchers from Purdue University have coaxed a thin film of titanium nitride into transporting plasmons, tiny electron excitations coupled to light that can direct and manipulate optical signals on the nanoscale. Titanium nitride’s addition to the short list of surface-plasmon-supporting materials, formerly comprised only of metals, could point the way to a new class of optoelectronic devices with unprecedented speed and efficiency.

“We have found that titanium nitride is a promising candidate for an entirely new class of technologies based on plasmonics and metamaterials,” said Alexandra Boltasseva, a researcher at Purdue and an author on a paper published today in the Optical Society’s (OSA (http://www.osa.org)) open-access journal Optical Materials Express (http://www.opticsinfobase.org/ome). “This is particularly compelling because surface plasmons resolve a basic mismatch between wavelength-scale optical devices and the much smaller components of integrated electronic circuits.”

Value of Plasmons

Metals carry electricity with ease, but normally do nothing to transmit light waves. Surface plasmons, unusual light-coupled oscillations that form on the surface of metallic materials, are the exception to that rule. When excited on the surface of metals by light waves of specific frequencies, plasmons are able to retain that same frequency, but with wavelengths that are orders-of-magnitude smaller, cramming visible and near-infrared light into the realm of the nanoscale.

In the world of electronics and optics, that 100-fold contraction is a boon. Circuits that direct the paths of electrons operate on a much smaller scale than optical light waves, so engineers must either rely on small but relatively sluggish electrons for information processing or bulk up to accommodate the zippy photons. Plasmons represent the best of both worlds and are already at the heart of a number of optoelectronic devices. They have not had widespread use, however, due to the dearth of materials that readily generate them and the fact that metals, in most cases, cannot be integrated with semiconductor devices.

Plasmonic Materials

Until now, the best candidates for plasmonic materials were gold and silver. These noble metals, however, are not compatible with standard silicon manufacturing technologies, limiting their use in commercial products. Silver is the metal with the best optical and surface plasmon properties, but it forms grainy, or semi-continuous, thin films. Silver also easily degrades in air, which causes loss of optical signal, making it a less-attractive material in plasmon technologies.

In an effort to overcome these drawbacks, Boltasseva and her team chose to study titanium nitride–a ceramic material that is commonly used as a barrier metal in microelectronics and to coat metal surfaces such as medical implants or machine tooling parts–because they could manipulate its properties in the manufacturing process. It also could be easily integrated into silicon products, and grown crystal-by-crystal, forming highly uniform, ultrathin films–properties that metals do not share.

To test its plasmonic capabilities, the researchers deposited a very thin, very even film of titanium nitride on a sapphire surface. They were able to confirm that titanium nitride supported the propagation of surface plasmons almost as efficiently as gold. Silver, under perfect conditions, was still more efficient for plasmonic applications, but its acknowledged signal loss limited its practical applications.

To further improve the performance of titanium nitride, the researchers are now looking into a manufacturing method known as molecular beam epitaxy, which would enable them to grow the films and layered structures known as superlattices crystal-by-crystal.

Technologies and Potential Applications

In addition to plasmonics, the researchers also speculate that titanium nitride may have applications in metamaterials, which are engineered materials that can be tailored for almost any application because of their extraordinary response to electromagnetic, acoustic, and thermal waves. Recently proposed applications of metamaterials include invisibility cloaks, optical black holes, nanoscale optics, data storage, and quantum information processing.

The search for alternatives to noble metals with improved optical properties, easier fabrication and integration capabilities could ultimately lead to real-life applications for plasmonics and metamaterials.

“Plasmonics is an important technology for nanoscale optical circuits, sensing, and data storage because it can focus light down to nanoscale,” notes Boltasseva. “Titanium nitride is a promising candidate in the near-infrared and visible wavelength ranges. Unlike gold and silver, titanium nitride is compatible with standard semiconductor manufacturing technology and provides many advantages in its nanofabrication and integration.”

According to the researchers, titanium nitride-based devices could provide nearly the same performance as gold-based ones for some plasmonic applications. While noble metals like silver would still be the best choice for specific applications like negative-index metamaterials, titanium nitride could outperform noble metals in other metamaterial and transformation-optics devices, such as those based on hyperbolic metamaterials.


NIH Study Shows Survival Advantage For Bypass Surgery Compared With Non-Surgical Procedure

Project analyzed data from over 189,000 older adults

A new comparative effectiveness study found older adults with stable coronary heart disease (CHD) who underwent bypass surgery had better long-term survival rates than those who underwent a non-surgical procedure to improve blood flow to the heart muscle, also called revascularization.

The National Institutes of Health-supported study compared a type of surgery known as coronary artery bypass graft (CABG) with a non-surgical procedure known as percutaneous coronary intervention (PCI). While there were no survival differences between the two groups after one year, after four years the CABG group had a 21 percent lower mortality.

Principal investigator William Weintraub, M.D., of Christiana Care Health System in Newark, Del., and colleagues will present these findings on Tuesday, March 27, at 9 a.m. EDT, at the American College of Cardiology’s annual meeting in Chicago. The findings will appear online today in the New England Journal of Medicine and in the April 19 print issue. Two companion papers that describe the statistical prediction models used to forecast long-term survival rates will also appear in today’s print issue of Circulation.

“In the United States, cardiologists perform over a million revascularization procedures a year to open blocked arteries. This study provides comprehensive, large-scale, national data to help doctors and patients decide between these two treatments,” said Susan B. Shurin, M.D., acting director of the NIH’s National Heart, Lung, and Blood Institute (NHLBI), which funded the study.

Comparative effectiveness research results provide information to help patients and health care providers decide which practices are most likely to offer the best approach for a particular patient, what the timing of interventions should be, and the best setting for providing care.

In CHD, also called coronary artery disease, plaque builds up inside the coronary arteries that supply blood to the heart muscle. Over time, blocked or reduced blood flow to the heart muscle can result in chest pain, heart attack, heart failure, or an erratic heartbeat. Each year, more than half a million Americans die from CHD.

In CABG, or bypass surgery, the most common type of heart surgery in the United States, blood flow to the heart muscle is improved by using (“grafting”) a healthy artery or vein from another part of the body to bypass the blocked coronary artery.

PCI is a less invasive, non-surgical procedure in which blocked arteries are opened with a balloon (also called angioplasty). A stent, or small mesh tube, is then usually placed in the opened arteries to allow blood to continue to flow into the heart muscle.

With NHLBI support, the American College of Cardiology Foundation (ACCF) and the Society of Thoracic Surgeons (STS) came together to compare short- and long-term survival outcomes after CABG versus PCI. The investigators linked medical data available in their ACCF and STS databases with follow-up information in the Medicare Provider Analysis and Review database of the Centers for Medicare and Medicaid Services.

Linking these three datasets from 644 U.S. hospitals allowed researchers to analyze information from the STS database on 86,244 older adults (average age 74) with stable CHD who underwent CABG between 2004 and 2007 and 103,549 older adults (average age 74) with stable CHD from the ACCF database who underwent PCI between 2004 and 2007. Follow-up ranged from one to five years, with an average of 2.72 years.

At one year there was no difference in deaths between the groups (6.55 percent for PCI versus 6.24 percent for CABG). However, at four years there was a lower mortality with CABG than with PCI (16.41 percent versus 20.80 percent). This long-term survival advantage after CABG was consistent across multiple subgroups based on gender, age, race, diabetes, body mass index, prior heart attack history, number of blocked coronary vessels, and other characteristics. For example, the insulin-dependent diabetes subgroup that received CABG had a 28 percent increased chance of survival after four years compared with the PCI group.
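The "21 percent lower mortality" headline figure is a relative, not absolute, reduction, and it can be checked directly from the four-year rates quoted above:

```python
# Four-year mortality rates reported in the study (percent).
pci_mortality = 20.80   # PCI group
cabg_mortality = 16.41  # CABG group

# Absolute reduction is in percentage points; the headline figure is
# the reduction relative to the PCI group's mortality.
absolute_reduction = pci_mortality - cabg_mortality
relative_reduction = 100 * absolute_reduction / pci_mortality
print(round(relative_reduction, 1))  # ~21, matching the reported figure
```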

“This landmark data-sharing collaboration between the American College of Cardiology Foundation, the Society of Thoracic Surgeons, and the Duke Clinical Research Institute allowed researchers to conduct the most comprehensive real-world observational comparative effectiveness study on this topic to date,” said Michael Lauer, M.D., director of the NHLBI Division of Cardiovascular Sciences.


Wide Variation In Emergency Service Response To Elderly Falls Patients

Elderly falls: A national survey of UK ambulance services

The ambulance service response to emergency calls for elderly falls patients varies widely across the UK, reveals research published online in Emergency Medicine Journal.

Falls are the principal cause of injury among those aged over 65, with around one in three in this age group sustaining a fall every year, say the authors.

And in London alone, one in 12 emergency calls to the ambulance service is made for an older person who has fallen.

The authors surveyed all 13 UK ambulance trusts about their response to all categories of emergency calls received for people suspected of having had a fall.

Eleven (85%) of the 13 trusts responded, and the responses showed that ambulance services have dedicated considerable resources to handling these types of calls.

But the responses also show that the provision of care varies widely across the trusts, and it is unclear what works best and represents the best value for money.

All of the ambulance trusts had set up systems to transfer emergency calls involving elderly falls patients to phone based clinical advisors.

One service additionally deployed a triage system to categorize the urgency of the call, with those considered to be less urgent referred to a dedicated “falls team” to be dealt with later. Two other services said they had plans to implement similar schemes.

Seven services had local response mechanisms in place for calls placed from personal alarm services; one service had plans in place to adopt a similar scheme.

All the services deployed specially trained healthcare workers, such as emergency care practitioners, to respond to calls for elderly falls patients.

But seven services dispatched vehicles that were not crewed by emergency technicians or paramedics, while all 11 services said they sent vehicles crewed by just one member of staff to older patients who had fallen.

One service was testing out the deployment of non-clinical staff while another had a specialist falls response ambulance, crewed by a paramedic and a social worker. The proportion of patients left at home ranged from just 7% to 65% for nine of the services, with only two services achieving a proportion below 42%.

Referrals to other services were made by various different categories of staff, while the method of making the referrals also varied, with some made at the scene, others from base stations, and others from the communications room.

Several trusts said there were restrictions on the type of referral they could make and to whom/where; not all staff had been given additional training in this area.

The authors point out that their findings show that UK ambulance services have gone to some lengths to ensure that elderly falls patients do not have to endure delays in response, which are an inevitable consequence of rising demand on these services.

“However, although service innovation for falls is widespread, clinically effective and cost effective service models are yet to be developed,” they write.

Emphasizing the wide variations in provision of care, the authors conclude: “These findings highlight the urgent need for research to inform policy, service and practice development for the large and frail population of older people who have fallen and for whom a 999 call has been made.”


Special American Chemical Society Symposium On Communicating Science To The Public

With an understanding of science and technology growing ever more important for full participation in a democratic society, the world’s largest scientific society today is holding a special symposium on how scientists can better communicate their work to the public.

The American Chemical Society (ACS), which has more than 164,000 members, will host the event, “Communicating Chemistry to the Public,” as part of its 243rd National Meeting & Exposition, being held in San Diego. It begins at 1 p.m. in Room 4 on the Upper Level of the San Diego Convention Center.

ACS President Bassam Z. Shakhashiri, Ph.D., originated the symposium, which will feature a panel of noted science journalists, book authors and communicators, and will deliver the opening and closing remarks. Cheryl Frech, Ph.D., chair of the ACS Committee on Public Relations and Communications and professor at the University of Central Oklahoma in Edmond, will moderate the event.

Communicating science is a major part of Shakhashiri’s presidential theme for the year. The William T. Evjue Distinguished Chair for the Wisconsin Idea at the University of Wisconsin-Madison, Shakhashiri is noted internationally for pioneering the use of demonstrations in the teaching of chemistry in classrooms, as well as to the public in museums, convention centers, shopping malls and retirement homes – and at his Science is Fun website. The Encyclopedia Britannica termed Shakhashiri the “dean of lecture demonstrators in America.” He received the prestigious National Science Board’s Public Service Award in 2007 for pioneering new ways to encourage public understanding of science.

“Communicating science and its values and role in society to the public is one of the American Chemical Society’s core functions,” Shakhashiri noted. “We must engage the general public and show that chemistry and related sciences are a major part of the engines that drive our economy and contribute to prosperity, fairness and justice. I am delighted that this panel of outstanding communicators can appear at the ACS National Meeting to share their experiences and insights.”

Shakhashiri convened a similar symposium at the ACS’ 242nd National Meeting & Exposition in Denver last year. Speakers at today’s event include:

    Paul Raeburn, distinguished science writer, Knight Foundation media critic and winner of the ACS 2012 James T. Grady-James H. Stack Award for Interpreting Chemistry for the Public, will explain how the advent of science blogs has affected coverage of scientific research — including politically sensitive topics — and will try to predict where journalism is headed.
    Sam Kean, author of the New York Times bestseller The Disappearing Spoon, will discuss true tales from the periodic table, like why Gandhi hated iodine and how tellurium led to the most bizarre gold rush in history, as he traces the roles of the elements in history, finance, mythology, alchemy, war, art and the lives of the scientists who discovered them.
    K.C. Cole, long-time science writer for the Los Angeles Times, The New York Times, and other publications, and professor at the University of Southern California’s Annenberg School for Journalism, will discuss “Lost in Translation: Why Lies, Metaphors and Mixed Messages are Essential to Communicating Science.”
    Carmen Drahl, Ph.D., associate editor for Science, Technology and Education at Chemical & Engineering News, will explain how ACS’ weekly newsmagazine helps bring chemistry news to media outlets with a broader reach, and how it is leveraging new media in its role as middleman between chemistry research and the media.
    Mariette DiChristina, editor in chief of Scientific American, will talk about the swiftly changing world of media in the digital age and how the longest continuously published magazine in the U.S. adapts and thrives in 2012. She will ask what these changes mean for scientists and the public and look at what’s ahead for magazines.
    Joann Rodgers, long-time executive director of media relations and public affairs at Johns Hopkins Medicine, and now senior communications adviser at the Johns Hopkins Berman Institute of Bioethics, will discuss what research scientists can do to get their research noticed in an era of fragmented media, distracted audiences with poor scientific literacy and heightened focus on accountability and conflicts of interest.
    Pam Sturner, executive director of the Leopold Leadership Program at the Woods Institute for the Environment at Stanford University, will describe a new type of scientific leader, defined by innovation, vision and the ability to bridge multiple audiences and stakeholder groups, who will utilize the skills of science communication to succeed.


First Evidence Of Space Weathering Found On Comet Wild2

The traditional picture of comets as cold, icy, unchanging bodies throughout their history is being reappraised in the light of analyses of dust grains from Comet Wild2. A team led by the University of Leicester has detected the presence of iron in a dust grain, evidence of space weathering that could explain the rusty reddish colour of Wild2’s outer surface. The results will be presented by Dr John Bridges at the National Astronomy Meeting in Manchester on Tuesday 27th March.

The Wild2 grains were collected by the NASA Stardust mission and returned to Earth in 2006. The fast-moving dust grains were collected in arrays of aerogel, a silicon-based foam that is 99 per cent empty space, which slowed the particles from velocities of 6 kilometres a second to a halt over just a few millimetres. Since then, an international team of scientists has been analysing the samples and the carrot-shaped tracks that they left in the aerogel. Microscopic samples dissected from the grains have been analysed at facilities around the UK, and in particular this work was performed at the Diamond Light Source synchrotron in Oxfordshire and Leicester University. Through a range of analytical techniques, scientists in the UK have been able to fully analyse the mineralogy and isotopes of the samples.
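The stopping performance quoted above can be sanity-checked with constant-deceleration kinematics. The 6 km/s entry speed comes from the article; the 3 mm stopping distance below is an assumed value within the “few millimetres” mentioned.

```python
# Back-of-the-envelope estimate of the average deceleration a Stardust
# dust grain experienced while stopping in aerogel.
v = 6_000.0  # entry speed in m/s (6 km/s, from the article)
d = 0.003    # assumed stopping distance in m (3 mm)
g = 9.81     # standard gravity, m/s^2

# Constant-deceleration kinematics: v^2 = 2 * a * d  =>  a = v^2 / (2 * d)
a = v ** 2 / (2 * d)
print(f"average deceleration ~ {a:.1e} m/s^2, or ~{a / g:.1e} g")
```

Even under these gentle assumptions the grains endured decelerations of hundreds of millions of g, which is why the gradual, carrot-shaped tracks in the low-density aerogel were essential to capturing them intact.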

“The total mass of Comet Wild2 grains returned is less than a milligram, so these samples are incredibly precious and a considerable analytical challenge,” said Bridges, of the University of Leicester.

The analysis from the Microfocus Spectroscopy beamline at the Diamond synchrotron shows that the surface of Comet Wild2 has been bombarded by particles in the solar wind and micrometeorites throughout its 4.5 billion year history. This space weathering has deposited nanometre-size grains of iron metal and reddened the surface of the comet.

“This is the first mineralogical evidence for space weathering identified in the Wild2 samples, something that had been hinted at by other spectroscopic observations of the comet,” said Bridges. “It adds another piece of the puzzle to our understanding of the life history of comets.”

Image Caption: This is an artist’s concept depicting a view of comet Wild 2 as seen from NASA’s Stardust spacecraft during its flyby of the comet on Jan. 2, 2004.


Cassini Makes Simultaneous Measurements Of Saturn’s Nightside Aurora And Electric Current System

Since the NASA / ESA Cassini-Huygens spacecraft arrived at Saturn in 2004, astronomers and space scientists have been able to study the ringed planet and its moons in great detail. Now, for the first time, a team of planetary scientists have made simultaneous measurements of Saturn’s nightside aurora, magnetic field, and associated charged particles. Together, the field and particle data provide information on the electric currents that produce the auroral emissions. Team leader Dr Emma Bunce of the University of Leicester will present the new work at the National Astronomy Meeting in Manchester on 27 March 2012.

Generally, images of the aurora (equivalent to the terrestrial ‘northern lights’) provide valuable information about the electromagnetic connection between the solar wind, the planet’s magnetic field (magnetosphere) and its upper atmosphere. Variations in the aurora then provide information on changes in the associated magnetosphere. But viewing the aurora (best done at a large distance) at the same time as measuring the magnetic field and charged particles at high latitudes (where the aurora is found, best done close to the planet) is hard.

In 2009, Cassini made a crossing of the magnetic field tubes that connect to the aurora on the night side of Saturn. Because of the position of the spacecraft, Dr Bunce and her team were able to obtain ultraviolet images of the aurora (which manifests itself as a complete oval around each pole of the planet) at the same time.

This is the first time it has been possible to make a direct comparison between Cassini images of the nightside aurora and the magnetic field and particle measurements made by the spacecraft. And because of the geometry of Cassini’s orbit, the spacecraft took about 11 hours to pass through the high-latitude region, about the same time it takes Saturn to make one rotation.

This meant that the team were able to watch the auroral oval move as the planet turned. As Saturn and its magnetosphere rotated, the auroral oval was tilted back and forth across the spacecraft with a speed that is consistent with a planetary rotation effect.

Dr Bunce comments: “With these observations we can see the simultaneous motion of the electric current systems connecting the magnetosphere to the atmosphere, producing the aurora. Ultimately these observations bring us a step closer to understanding the complexities of Saturn’s magnetosphere and its ever-elusive rotation period.”

Image Caption: Two images of Saturn’s northern auroral oval, made with the Ultraviolet Imaging Spectrometer (UVIS) instrument. The second image, made two hours after the first, shows the motion of the oval as the planet rotates. Credit: NASA / ESA and the Cassini UVIS team


Penn Study Reveals Safety Of CT Scans For Rapid Rule Out Of Heart Attacks In ER Chest Pain Patients

A highly detailed CT scan of the heart can safely and quickly rule out the possibility of a heart attack among many patients who come to hospital emergency rooms with chest pain, according to the results of a study that will be presented by researchers from the Perelman School of Medicine at the University of Pennsylvania today at the American College of Cardiology’s 61st Annual Scientific Session and published concurrently in the New England Journal of Medicine. The multicenter randomized trial comparing coronary CT angiography (CCTA) and traditional cardiac testing methods revealed that chest pain patients with negative CT scans can be discharged safely from the hospital within hours. The findings may offer a new strategy for relieving the emergency room crowding that plagues many of America’s hospitals, and could help to trim millions of dollars off the costs of care for one of the leading causes of ER visits.

Chest pain is the second most common reason people go to the emergency room in the United States, accounting for as many as 8 million visits each year at a cost of several billion dollars. Just 5 to 15 percent of those patients are ultimately found to be suffering from heart attacks or other serious cardiac diseases, since issues from pneumonia to indigestion to anxiety can cause the same types of symptoms. But more than half of chest pain patients are admitted to the hospital for observation or traditional evaluation such as cardiac catheterization or a stress test.

“Until now, our methods for diagnosing patients with acute coronary syndromes in the emergency room setting have been both time-consuming and costly. We have spent the past 30 years trying to find a simple test that will tell patients, right now, ‘It’s not your heart.’ This trial is the first time we’ve been able to accomplish that,” said senior author Judd Hollander, MD, clinical research director in the department of Emergency Medicine.

The authors studied 1,370 patients at five medical centers who were classified as low-to-intermediate risk for heart attack, meaning they had no previously identified heart disease and did not have cardiac risk factors such as diabetes or high blood pressure. They randomized patients to one of two arms: those who received a CCTA scan and those who received conventional care strategies to rule out serious blockages of the arteries supplying the heart. Of 640 patients whose CCTA was negative, revealing no clinically important coronary artery blockages, none died or suffered a heart attack within 30 days. The investigators also found that patients in the CCTA arm were more than twice as likely to be discharged directly from the emergency department to their homes (50 percent) as those who underwent traditional care (23 percent). Patients in the CCTA arm also spent significantly less time in the hospital (a median of 18 hours), compared to those in the traditional care group (25 hours). Those with negative tests had an even greater difference in the length of their hospital stay (12 vs. 25 hours). Additionally, CCTA proved more effective at identifying patients with coronary artery disease than stress testing (9 percent of patients had a positive test versus 3.5 percent).

“CT scanning has long been used in emergency departments to learn the cause of other symptoms like abdominal pain and shortness of breath. It’s available in many hospitals around the clock, so now we can answer important questions about chest pain right away and send patients home much more quickly,” said lead author Harold Litt, MD, PhD, chief of Cardiovascular Imaging in the department of Radiology. “This test allows us to get a very good look at the coronary arteries in a noninvasive way, and for the large majority of people who are shown to not have a narrowing of the arteries, it’s an excellent alternative to cardiac catheterization.”

CCTA generates three-dimensional images of the heart and the blood vessels surrounding it. The tests, which are conducted like a standard CT scan, cost about $1,500 and allow patients who have a negative scan to be discharged from the hospital within hours, while costs for those admitted to the hospital for stress testing and monitoring typically total more than $4,000 for each patient.


Sony Patents Technology To Put Camera And Sensors Behind Smartphone Display

Jedidiah Becker for RedOrbit.com

Last week, the U.S. Patent and Trademark Office granted Sony the rights to a patent that will allow them to place front-facing cameras on smartphones behind rather than on top of the display screen. Not only will this allow smartphone makers to make display screens a little taller, it also opens the door for new technologies such as touch-screen fingerprint scans for heightening security on the increasingly versatile devices.

The patent request describes a “sensor-equipped display apparatus” made up of three different elements: several types of sensors placed behind a “light-transmissive display screen” and an unnamed material that will hide the sensors from the viewer.

The hidden sensors include a proximity sensor, a fingerprint sensor, an illuminance sensor and the camera sensor.

Essentially, the material of the screen itself will allow light to pass through it in both directions. However, to ensure that users aren’t able to see the little sensors behind their screen, the interior of the display screen will be covered with a substance that blocks entering light from re-exiting — similar to those one-way mirrors used in interrogation rooms in the movies.

Although Sony has had plenty of time since filing for the patent in May 2011 to contemplate which devices will be the first to utilize the technology, the company has not yet announced when we can expect to see the high-tech screens hitting the market.

Sony’s technology will not be the first to showcase a fingerprint lock. Last year, Motorola Mobility released its Atrix smartphone, which comes equipped with a biometric scanner on the backside of the device.

In Sony’s version of fingerprint sensing, however, the technology is essentially streamlined and simplified.

According to the patent application, the technology would “allow even a user who is not familiar with the fingerprint authentication to readily execute an input manipulation for the fingerprint authentication.”

HEIGHTENED SECURITY AND BEYOND

What’s more, the design idea of placing cameras and sensors behind a one-way “light-transmissive” display screen promises to have ramifications for the whole industry and opens the door for an endless variety of new, related technologies.

For starters, the fingerprint-identification technology is likely to become a standard feature on all smartphones in the not-so-distant future, adding an extra level of security to devices that continue to play an ever larger role in our personal and professional lives.

Moreover, the eventual ubiquity of the technology is likely to serve as a significant deterrent to theft, which remains a big problem for owners of the pricey little devices. As the devices’ security mechanisms become virtually uncrackable to anyone but the device owner, there is little incentive for anyone but the savviest tech-hackers to steal them.

Smartphone security features like Android’s sophisticated “pattern” and “memory” locks have already made it extremely difficult for non-owners to get into the devices. Much to the embarrassed chagrin of FBI officials, a California branch of the federal agency had to enlist the help of Google earlier this month just to retrieve information from a criminal suspect’s smartphone. After numerous bungled attempts to get past the gadget’s pattern lock, investigators set off the virtually impregnable memory lock and had to get a warrant from a judge forcing Google technicians to unlock the ironclad device.

But in a world where every user’s smartphone is accessible only with his or her fingerprint, the incentive to steal the devices could conceivably sink to almost nil.

Sony’s patent also suggests that the technology will likely lead to an immediately improved video-calling experience.

Without the little camera at the top, smartphone makers will be able to make the screen about a third- to a half-inch taller. And perhaps even more important, users will no longer have that awkward feeling that comes from trying to keep eye contact by looking into the camera while simultaneously watching the video of the person on screen. Because the camera and sensors will be directly behind the screen, users will more or less be able to look at the person on the other end square in the eyes, providing a more natural video-calling experience.

Still, it remains anyone’s guess when consumers can expect to see the technology in new smartphones.

It’s noteworthy that Apple Inc. filed for a similar patent, covering a behind-the-screen camera for its laptops, much earlier. Yet four years after filing, the California-based company has, for reasons unknown, not yet introduced the technology in any of its products.

One conceivable explanation for the delay in bringing the technology to market could be the difficulty of creating a substance that allows for the one-way light-transmissive screen.

Sony has not identified what that substance is and likely has not yet produced it.

Tears During Coronary Angioplasty: Where Are They And How Do They Affect Patient Outcomes?

Researchers from Thomas Jefferson University Hospital discovered that blockages in the right coronary artery and those in bending areas of the coronary artery are the most common places for dissection, a tear in the artery that can occur during balloon angioplasty of the coronary arteries.

They will present their findings at the American College of Cardiology annual meeting in Chicago on Saturday, March 24 at 9 AM.

A ‘controlled tear’ is the mechanism by which angioplasty dilates the blocked vessels. A large tear, or spiral dissection, that continues almost entirely down the artery, however, is associated with serious complications. When such a dissection occurs, the interior wall of the artery is torn, causing it to fold into the path of blood flow and sometimes block flow of blood in the artery altogether.

“This used to cause patients to be rushed to the operating room during angioplasty to open their chest and fix the blockage,” says Rajesh Pradhan, MD, cardiology fellow at Jefferson and first author on the study. Modern technology now allows for stents to be used to open the blockage and repair the torn artery in most cases.

“We wanted to look at these large tears that can dramatically affect blood flow to understand where they happen most and how good we are at fixing them for our patients,” says Pradhan.

The team retrospectively reviewed 24 cases of spiral dissection and matched them against a control group of patients without dissection.

Their analysis showed that the right coronary artery (RCA) was seven times more likely to be complicated by propagating dissection compared to other coronary arteries. Also, lesions (blockages) on a bend of 45 degrees or greater were 12 times more likely to develop a dissection compared to lesions that were not on a bend.

Stenting was successful in treating the dissection in 75 percent of patients. Major in-hospital adverse coronary events (stroke, heart attack, need for emergent bypass surgery, stent thrombosis or death) occurred in 54 percent of patients in the large dissection group and none in the control group. Adverse events in the dissection group included 11 heart attacks, need for emergent bypass surgery in four patients, and one stent thrombosis.

“Armed with this knowledge, we can more readily anticipate this complication and be better prepared when treating patients with lesions in these areas,” says Pradhan.


Gene Can Transform Mild Flu Into A Life-threatening Disease

An international team of researchers has discovered a human genetic flaw that could explain why influenza makes some people more sick than others.

Reporting in the journal Nature, British and American researchers, led by the Wellcome Trust Sanger Institute (WTSI) in the UK, said the variant of the IFITM3 gene was much more common in people hospitalized for the flu than in those who were able to fight the disease at home.

The researchers said this could explain why during the 2009/10 H1N1 “swine flu” pandemic most people had mild symptoms, while others got seriously ill and died. They said they scoured genetic databases covering thousands of people and found evidence that around one in 400 people may have the flawed genetic variant.

IFITM3 is an important protein that protects cells against infection and is believed to play a crucial role in the immune system’s response against viruses such as H1N1. When the protein is abundant in the body, the spread of the virus in the lungs is hindered, but if the protein is defective or absent, the virus can spread rapidly, causing greater illness and perhaps death.

Aaron Everitt, lead researcher from WTSI, said that although the IFITM3 protein is known to play an important role in “limiting the spread of viruses in cells, little is known about how it works in lungs.” The new research, he added, helps to explain “how both the gene and protein are linked to viral susceptibility.”

The role of IFITM3 in humans was first suggested by studies using a genetic screen, which showed the protein blocked the growth of the influenza and dengue viruses in cells. This finding led the team to further investigate how the gene works in both humans and mice.

For the study, the researchers removed the gene from mice and found that when the mice developed flu, their symptoms were much worse than those of mice that still had the gene. They found that the loss of this single gene in mice can turn a mild case of influenza into a potentially fatal infection.

Armed with this knowledge, the researchers sequenced the IFITM3 genes of 53 patients hospitalized with the flu and found that three (one in 18) carried a mutation of the gene that is rare in the general population.

“Since IFITM3 appears to be a first line defender against infection, our efforts suggest that individuals and populations with less IFITM3 activity may be at increased risk during a pandemic and that IFITM3 could be vital for defending human populations against other viruses such as avian influenza virus and dengue virus,” said Dr. Abraham Brass, an Assistant Professor at the Ragon Institute and Gastrointestinal Unit of Massachusetts General Hospital, and co-lead author of the study.

The team said these findings need to be replicated in bigger studies before they can positively rule that the IFITM3 gene mutation is the key factor for causing serious illness.

Study co-author Professor Paul Kellam, from WTSI, said: “At the moment, if someone is in a more vulnerable group because of co-morbidity [another health problem], they would be offered the flu vaccine.” But having this variant, he said, would not make any difference to how people were treated.

“Our research is important for people who have this variant as we predict their immune defenses could be weakened to some virus infections,” said Kellam. “Ultimately as we learn more about the genetics of susceptibility to viruses, then people can take informed precautions, such as vaccination to prevent infection.”

“This new discovery is the first clue from our detailed study of the devastating effects of flu in hospitalized patients,” Professor Peter Openshaw, director of the Center for Respiratory Infection at Imperial College London, told BBC News. “It vindicates our conviction that there is something unusual about these patients.”

Sir Mark Walport, director of the Wellcome Trust, said: “During the recent swine flu pandemic, many people found it remarkable that the same virus could provoke only mild symptoms in most people, while, more rarely, threatening the lives of others.”

“This discovery points to a piece of the explanation: genetic variations affect the way in which different people respond to infection,” Walport said. “This important research adds to a growing scientific understanding that genetic factors affect the course of disease in more than one way. Genetic variations in a virus can increase its virulence, but genetic variations in that virus’s host – us – matter greatly as well.”

In the future, this genetic discovery could help doctors screen patients to identify if they will be at an increased risk of being brought down by flu, allowing them to be selected for priority vaccination or preventative treatment during outbreaks, the researchers wrote, adding it could also help to develop new vaccines or medicines that can stave off potentially more dangerous viruses such as the bird flu.

“Our efforts suggest that individuals and populations with less IFITM3 activity may be at increased risk during a pandemic, and that IFITM3 could be vital for defending human populations against other viruses such as avian influenza,” Brass told Reuters.

Samples for this study were obtained from the MOSAIC consortium in England and Scotland, coordinated from the Center for Respiratory Infection at Imperial College London, and the GenISIS consortium in Scotland at the Roslin Institute of the University of Edinburgh. These were critical for the human genetics component of the work.

Sleeping After Learning Reportedly Enhances Recall

New research from experts at the University of Notre Dame has discovered that going to sleep soon after learning new information is the best way for an individual to recall what he or she had just learned.

According to Asian News International (ANI) reports, Notre Dame psychologist Jessica Payne and her colleagues analyzed a total of 207 students who regularly slept for a minimum of six hours each night.

Each participant was randomly assigned to study pairs of words, either semantically related or unrelated, at either 9:00am or 9:00pm, and was then tested on what they had learned 30 minutes, 12 hours, or 24 hours later.

“At the 12-hour retest, memory overall was superior following a night of sleep compared to a day of wakefulness. However, this performance difference was a result of a pronounced deterioration in memory for unrelated word pairs; there was no sleep-wake difference for related word pairs,” the university said in a Thursday press release.

“At the 24-hour retest, with all subjects having received both a full night of sleep and a full day of wakefulness, subjects’ memories were superior when sleep occurred shortly after learning, rather than following a full day of wakefulness,” they added, noting that declarative memory refers to the ability to consciously remember facts and events, and is broken down into semantic and episodic memory for each, respectively.

The results of Payne’s study were published in the March 22 edition of the journal PLoS One.

“Our study confirms that sleeping directly after learning something new is beneficial for memory. What’s novel about this study is that we tried to shine light on sleep’s influence on both types of declarative memory by studying semantically unrelated and related word pairs,” Payne said in a statement.

“Since we found that sleeping soon after learning benefited both types of memory, this means that it would be a good thing to rehearse any information you need to remember just prior to going to bed,” she added. “In some sense, you may be ‘telling’ the sleeping brain what to consolidate.”

Study Discovers How Massive Black Holes Grow So Rapidly

Black holes that grow to masses billions of times greater than that of our sun most likely reached those sizes by consuming objects from multiple locations at the same time, researchers from the UK and Australia claim in a new study set for publication in the journal Monthly Notices of the Royal Astronomical Society.

In a Friday press release, officials with the University of Leicester, who conducted the study along with colleagues from Australia’s Monash University, compared the process to that of eating a meal, only in this case the black holes “have no ‘table manners’, and tip their ‘food’ directly into their mouths, eating more than one course simultaneously.”

Speaking in more technical terms, these black holes consume matter located as much as a few light years away, gas that is ultimately funneled down past the point of no return known as the Schwarzschild radius, Ted Thornhill of the Daily Mail wrote last week.

While it was previously believed that such a black hole is surrounded by a single ring of gas and dust, known as an accretion disc and formed from nearby planets and stars pulled in by the black hole’s gravity, the researchers behind the new study created a model suggesting that there may, in fact, be more than one accretion disc.

“The researchers did computer simulations of two gas discs orbiting a black hole at different angles and found after a short time the discs spread and collide, and large amounts of gas fall into the hole,” UPI reporters wrote on March 23, adding that lead researcher Andrew King of the University of Leicester and his associates reported that their calculations showed that black holes can grow up to a thousand times faster when this phenomenon occurs.

King’s research, which was funded by the UK Science and Technology Facilities Council (STFC), looked at some of the largest black holes in the universe, including ones about “a thousand times heavier” than the one at the center of the Milky Way, which itself is approximately four million times heavier than the sun.
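Some back-of-the-envelope arithmetic (standard physical constants; the masses are those quoted in this article) puts these scales in context. The Schwarzschild radius, r_s = 2GM/c², of even a black hole a thousand times heavier than the Milky Way's central one is only about a thousandth of a light year, far smaller than the light-year-scale region the gas falls in from:

```python
# Schwarzschild radius r_s = 2GM/c^2 for the black holes quoted above.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
M_SUN = 1.989e30       # solar mass, kg
LIGHT_YEAR = 9.461e15  # metres per light year

def schwarzschild_radius(mass_kg):
    """Event-horizon radius in metres."""
    return 2 * G * mass_kg / c**2

# ~4 million solar masses: the Milky Way's central black hole.
sgr_a = schwarzschild_radius(4e6 * M_SUN)
# ~1000 times heavier: the giant holes in King's study.
giant = schwarzschild_radius(4e9 * M_SUN)

print(f"{sgr_a / LIGHT_YEAR:.2e} ly")  # ~1.25e-06 ly
print(f"{giant / LIGHT_YEAR:.2e} ly")  # ~1.25e-03 ly
```

Even the larger radius is a tiny fraction of a light year, so gas falling in from a few light years out travels a long way before crossing the horizon.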

“We know they grew very quickly after the Big Bang,” King, a professor from the university’s Department of Physics and Astronomy, said in a statement. “These hugely massive black holes were already full-grown when the universe was very young, less than a tenth of its present age.”

“We needed a faster mechanism,” added Chris Nixon, also of the University of Leicester, “so we wondered what would happen if gas came in from different directions.”

Thus, King, Nixon, and Australian associate Daniel Price created their computer simulation, learning that if two gas discs orbit a black hole at different angles, they eventually spread and collide, dumping massive amounts of gas into the hole. King said it was comparable to the loss of centrifugal force that occurs when two motorcycle riders collide while riding on a “Wall of Death.”

“We don’t know exactly how gas flows inside galaxies in the early universe, but I think it is very promising that if the flows are chaotic it is very easy for the black hole to feed,” he added.

Image Caption: This artist’s impression depicts the newly discovered stellar-mass black hole in the spiral galaxy NGC 300. The black hole has a mass of about twenty times the mass of the Sun and is associated with a Wolf-Rayet star; a star that will become a black hole itself. Thanks to the observations performed with the FORS2 instrument mounted on ESO’s Very Large Telescope, astronomers have confirmed an earlier hunch that the black hole and the Wolf-Rayet star dance around each other in a diabolic waltz, with a period of about 32 hours. The astronomers also found that the black hole is stripping matter away from the star as they orbit each other. How such a tightly bound system has survived the tumultuous phases that preceded the formation of the black hole is still a mystery. Credits: ESO/L. Calçada

Early Exposure to Germs Could Help Build Immunity

Childhood exposure to bacteria and other germs may help build immunity to various microbes later on in life, researchers from Brigham and Women’s Hospital (BWH) claim in a new study.
According to Carrie Gann of ABC News, this belief is known as the “hygiene hypothesis,” and suggests — in contrast to the common belief that people should strive to remain germ free regardless of circumstances — that bacteria and other germs may be “a necessary part of a healthy immune system, helping our body’s defenses beef up and fight future illnesses. When a person’s exposure to germs is decreased, problems may arise.”
In a press release detailing their findings, the BWH experts say that the hygiene hypothesis helps to explain the increase of allergic reactions and auto-immune diseases in cities throughout the world, and that medical professionals have claimed that various sociological and environmental changes, such as the use of antibiotics among younger patients, have contributed to this phenomenon.
However, no scientific study had ever discovered a biological basis for this belief. They say that their study, which was published Thursday in Science Express, changes that.
“The researchers show that in mice, exposure to microbes in early life can reduce the body’s inventory of invariant natural killer T (iNKT) cells, which help to fight infection but can also turn on the body, causing a range of disorders such as asthma or inflammatory bowel disease,” Nature‘s Helen Thompson reported on March 22.
The BWH researchers report that, after studying the immune systems of both “germ-free mice” and those who have received normal exposure to bacteria and other microbes, they discovered that the germ-free mice “had exaggerated inflammation of the lungs and colon resembling asthma and colitis, respectively.”
“Most importantly, the researchers discovered that exposing the germ-free mice to microbes during their first weeks of life, but not when exposed later in adult life, led to a normalized immune system and prevention of diseases,” they added. “Moreover, the protection provided by early-life exposure to microbes was long-lasting, as predicted by the hygiene hypothesis.”
The researchers warn that additional research is required to see whether or not the hypothesis holds true for humans as well, but according to Gann, experts claim that the biological mechanism analyzed in the mice during this study is similar in people.
Likewise, Erika von Mutius, head of the Munich University Children’s Hospital Asthma and Allergy Department, told Nature that the findings “complement what we see in epidemiology… It supports the idea that the microbiome is very important and the age of exposure is decisive.”

The King Of Wasps And Scorpio Rising

Brett Smith for RedOrbit.com

The discovery of a new giant wasp species last year has led to the discovery of an even larger species that had been sitting in a collection for 80 years.

Megalara garuda hails from the Indonesian island of Sulawesi, the same island where a team led by professor Lynn Kimsey (UC Davis) discovered the wasp’s slightly smaller relative Dalara garuda.

Both species belong to the digger wasp family, a diverse group of wasps that sting and paralyze prey insects. These paralyzed insects are then placed in a protected nest where they often remain alive until eaten by hatched digger wasp larvae.

What separates the two newly discovered wasps from the rest of the digger wasp family is their unusual body shape and extremely large jaws. The jaws are sickle-shaped and longer than the wasp’s front legs.

The large jaws probably play a role not only in hunting prey and defense but also reproduction, according to Kimsey.

“In another species in the genus the males hang out in the nest entrance,” Kimsey said. “This serves to protect the nest from parasites and nest robbing, and for this he exacts payment from the female by mating with her every time she returns to the nest. So it’s a way of guaranteeing paternity. Additionally, the jaws are big enough to wrap around the female’s thorax and hold her during mating.”

In what appears to be something of a taxonomic cold case, Kimsey’s expedition to Sulawesi last year, along with a research team from UC Davis, led to the discovery of the initial giant wasp, Dalara garuda. The island is renowned for its biodiversity, its rainforest, and its proximity to the equator. Kimsey is part of a $4 million grant program awarded to UC Davis scientists in 2008 to study the biodiversity of Sulawesi, which is being threatened by logging and development.

A new species of scorpion, meanwhile, was recently discovered in California’s Death Valley National Park. Using ultraviolet light, which causes scorpions to glow a fluorescent green, a group led by PhD candidate Matthew Graham of the University of Nevada, Las Vegas (UNLV) located the new species. They named it Wernerius inyoensis, after the Inyo Mountains where it was found.

The new species, which measures about 16mm, is identified by the presence of a conspicuous spine at the base of the stinger. Previously known species of this type of scorpion are rarely observed in the wild. Researchers speculate that these scorpions either occur at very low densities or have limited surface activity, hinting at the possibility that they are largely subterranean. In that case, the scorpions would spend the majority of their time deep in rock crevices or among piles of rock.

Scorpions are eight-legged arachnids, and although scorpion diversity is greatest in subtropical climates, they can be found in almost any habitat. There are approximately 1,400 known species of scorpions, of which about 25 are lethal to humans.

The new species joins the genus Wernerius, which is found primarily in the southwestern region of North America. Scorpions account for a large share of the biodiversity in this arid, desert section of the United States.

References:

(1) Kimsey LS, Ohl M (2012) Megalara garuda, a new genus and species of larrine wasps from Indonesia (Larrinae, Crabronidae, Hymenoptera). ZooKeys 177: 49-57. doi: 10.3897/zookeys.177.2475

(2) Webber MM, Graham MR, Jaeger JR (2012) Wernerius inyoensis, an elusive new scorpion from the Inyo Mountains of California (Scorpiones, Vaejovidae). ZooKeys 177: 1-13. doi: 10.3897/zookeys.177.2562

Image 1: This is a close view of the enormous jaws of the male wasps. Credit: Dr. Lynn Kimsey, Dr. Michael Ohl

Image 2: This is a Megalara garuda, male. It is also known as the King of Wasps. Credit: Dr. Lynn Kimsey, Dr. Michael Ohl

Image 3: This is a scorpion glowing under ultraviolet light. This specimen is a Northern Scorpion, a broadly distributed species that was also found in the Inyo Mountains. Credit: Michael Webber

Stroke Progress Review Group Sets Priorities For NIH Research And Shapes Future Programs And Policies

Kessler Foundation stroke expert co-chairs review of rehabilitation and recovery; working group sets priorities for future research

In 2011, the National Institute of Neurological Disorders and Stroke (NINDS) convened the Stroke Progress Review Group (SPRG) to conduct a final 10-year review of the state of stroke research. The goal is to set priorities and shape future NINDS programs and policies. While SPRG found much available data for maximizing stroke rehabilitation effects, translation to clinical practice is inadequate. To realize the enormous potential for improving rehabilitation and recovery, more resources should be applied to implementing and directly supporting SPRG’s recommendations. The Final Report of the Stroke PRG is on the NINDS SPRG website: http://www.ninds.nih.gov/find_people/groups/stroke_prg/01-2012-stroke-prg-report.htm

The working group for rehabilitation and recovery was co-chaired by Anna Barrett, MD, director of Stroke Rehabilitation Research at Kessler Foundation, and Pamela Duncan, PT, PhD, Duke Center for Clinical Health Policy Research, with Steven C. Cramer, MD (NINDS liaison co-chair). “The strategic plan and vision set out in the 2002 SPRG was intended for ten-year implementation,” said Dr. Barrett. “To assess progress in rehabilitation and recovery, we recruited eleven working group members (John Chae, Leonardo Cohen, Bruce Crosson, Leigh Hochberg, Rebecca Ichord, Albert Lo, Randy Nudo, Randall Robey, R. Jarrett Rushmore, Sean Savitz, and Robert Teasell, with assistance from Norine Foley).”

The working group found significant advances at ten-year followup. “Not only have we addressed the original SPRG priorities (e.g., improving stroke deficits rather than advising compensatory management),” noted Dr. Barrett, “we have pushed the science of rehabilitation much further forward. For example, the report cites NIH-funded work done at Kessler Foundation using optical prism training to rehabilitate hidden disabilities of functional vision after right brain stroke. This concept of targeting any treatment to a specific brain system had not yet been funded by the NIH ten years ago. Now we need to apply these strategies over large patient groups, since the number of US stroke survivors continues to rise.”

Three priorities were identified:

1. The need for studies identifying valid, reliable, affordable, and accessible measurements of neuroplasticity. We need to understand how these measures of brain plasticity can be used to guide and individualize rehabilitation/restorative therapies to achieve optimal outcomes among all persons affected by stroke.

2. Substantial data suggest that brain plasticity after stroke is shaped by experience. We need to determine which experiences are most important, what dose of experience is needed to maximize outcomes, and how to measure these experiences. An improved understanding of biomarkers of recovery and restorative therapies will support achieving these goals.

3. Advances in the basic science of brain repair indicate a major opportunity for translating new restorative therapies to address post-stroke disability. Delivery of appropriate treatment requires a team effort, from bench to bedside to health policy reform. Implementation of Specialized Programs of Translational Stroke Research in Recovery (SPOTS-R2) is a priority.

“This report and the top 3 priorities will form a crucial component of the second phase of our stroke planning process where we will identify the highest priority research goals in each of the major areas of stroke prevention, treatment and recovery,” commented NINDS director Story Landis, PhD.

Can Our Genes Be Making Us Fat?

While high-fat foods are thought to be of universal appeal, there is actually a lot of variation in the extent to which people like and consume fat. A new study in the March issue of the Journal of Food Science, published by the Institute of Food Technologists, reported that two specific genes (TAS2R38, a bitter taste receptor, and CD36, a possible fat receptor) may play a role in some people’s ability to taste and enjoy dietary fat. By understanding the role of these two genes, food scientists may be able to help people who have trouble controlling how much fat they eat.
Most food scientists acknowledge that the texture of fat plays a big role in how fat is perceived in the mouth. For example, ice cream is typically “rich, smooth and creamy.” And certain fats, scientists have determined, can be detected by smell. Only recently have food scientists begun exploring the idea that fats have a taste, too. Researchers are now investigating the gene (CD36) that is responsible for detecting the taste of fats (fatty acids) in the mouth.
In the recent Journal of Food Science study, investigators focused on one ethnic group to limit genetic variation that could reduce the ability to detect associations with the gene of interest. They determined the fat preferences and CD36 status of more than 300 African-American adults. The investigators from the New York Obesity Research Center identified a genetic variant, present in 21 percent of the participants, that was associated with higher preferences for added fats and oils (e.g., salad dressings, cooking oils, etc.). They also found that study participants with this genetic variant ranked Italian salad dressings as creamier than did those with other genotypes.
The other gene explored by these investigators, TAS2R38, is the receptor for bitter taste compounds. About 70 percent of U.S. adults and children are “tasters” of these compounds, while the remaining 30 percent are “nontasters.” Results indicate that nontasters of these compounds tend to be poor at discriminating fat in foods; therefore individuals who can’t detect fat’s presence may consume higher fat foods to compensate. This is in part due to the fact that nontasters have fewer taste buds than tasters. While researchers recognize that the cause of obesity is multifaceted, they continue to examine the role of these genotypes in weight management.
Genetic testing within the food industry may not be too far off. Once studies like these are more fully developed, there may be a role for genotyping study participants when it comes to testing a new product. For example, a company wanting to test out a dressing may include people with different genes relating to fat perception in order to get a more accurate opinion. In addition, the food industry will be able to create different kinds of foods for certain populations.

Living Alone Can Lead To Depression

Over the past two decades, the number of people living on their own in the US and UK has doubled. Based on research published in BioMed Central’s journal BMC Public Health, people who live alone are almost 80% more likely to be depressed compared to those who live with others.
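Purely as an illustration of what "almost 80% more likely" means, here is a sketch with invented counts; these numbers are not the study's data, only an example of the relative-risk arithmetic:

```python
# Invented counts, for illustration only; the study reported the
# relative figure (~80%), not these raw numbers.
alone_cases, alone_total = 90, 500        # living alone
cohab_cases, cohab_total = 300, 3000      # living with others

risk_alone = alone_cases / alone_total    # 0.18
risk_cohab = cohab_cases / cohab_total    # 0.10
relative_risk = risk_alone / risk_cohab

print(f"relative risk: {relative_risk:.2f}")  # 1.80, i.e. 80% more likely
```

A relative risk of 1.8 says that whatever the baseline rate among people living with others, the rate among those living alone is 1.8 times as high.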
The researchers used records of antidepressant use as a proxy for depression in reaching this conclusion.
The factors behind this risk differed between men and women. For women, a third of the risk associated with living alone was attributed to sociodemographic factors, such as lack of education and low income. For men, the key contributors were factors such as a poor job climate and heavy drinking.
While previous research suggests the elderly and single parents face mental health risks when living alone, little research has been done on the effects of living alone on younger, working-age individuals.
A research team from Finland set out to study this working-age group of people and determine the effects of living alone.
To conduct their research, the team followed 3,500 working-age men and women for seven years. The team compared their living arrangements with several risk factors, such as psychosocial, sociodemographic, and health risks, like smoking and heavy drinking. The team then analyzed this data against records of antidepressant use, which came from the National Prescription Register.
“Our study shows that people living alone have an increased risk of developing depression,” said Dr. Laura Pulkki-Råback, who conducted the research at the Finnish Institute of Occupational Health. “Overall there was no difference in the increased risk of depression by living alone for either men or women. Poor housing conditions (especially for women) and a lack of social support (particularly for men) were the main contributory factors to this increased risk.”
Researchers suggest one of the main issues at play for these younger people living on their own is “social capital”. While social capital has no formal, agreed-upon definition, infed.org suggests that “the central idea is that ‘social networks are a valuable asset’. A sense of belonging and the concrete experience of social networks (and the relationships of trust and tolerance that can be involved) can, it is argued, bring great benefits to people.”
Without this social capital, the younger, working generation is more likely to strike out on their own, and living without this kind of social network and structure can have adverse effects on mental health.
While the researchers were able to show that living alone was a common factor among those who began taking antidepressants, they were not able to confirm whether those who were already taking antidepressants were more likely to move out on their own. Furthermore, because people tend to seek professional care only when their depression symptoms are severe enough, those experiencing mild depression who had not sought professional care could not be analyzed. Even so, the findings suggest that living with others may offer some protection against depression.

Noise Pollution Has Effect On Plants, Study Finds

A new study published in the Proceedings of the Royal Society B has found that human noise like traffic can have ripple effects on plants.

Lead author Clinton Francis of the National Evolutionary Synthesis Center (NESCent) in Durham, North Carolina, said the consequences of noise could last for decades, even after the source of the noise goes away.

Previous studies found that some animals increase in numbers near noisy sites, while others decline, but the new study found that these changes ripple outward to affect plants as well.

Many plants rely on birds and other animals to deliver pollen from one flower to the next, or disperse their seeds.

The team conducted a series of experiments from 2007 to 2010 in the Bureau of Land Management’s Rattlesnake Canyon Wildlife Area in northwestern New Mexico to look into this ripple effect further.

They first did an experiment using patches of artificial plants designed to mimic scarlet gilia, a common red wildflower in the area. They dusted the flowers of one plant per patch with artificial pollen, using a different color for each patch.

The researchers found that the black-chinned hummingbird made five times more visits to the artificial flowers at noisy sites than at quiet ones.

“Black-chinned hummingbirds may prefer noisy sites because another bird species that preys on their nestlings, the western scrub jay, tends to avoid those areas,” Francis said in a press release.

They determined that “hummingbird-pollinated plants such as scarlet gilia may indirectly benefit from noise,” Francis said.

In a second series of experiments, they set out to find out what noise might mean for tree seeds and seedlings, specifically the piñon pine.

To find out if noise affected the number of piñon pine seeds that animals ate, the researchers scattered seeds underneath 120 of the trees in noisy and quiet sites.

Two animals in particular differed between quiet and noisy sites: mice, which preferred noisy sites, and western scrub jays, which avoided noisy areas.

Francis said mice are not good candidates to spread piñon pine seeds because the seeds do not survive the passage through an animal’s gut. So a boost in mouse populations could be bad for pine seedlings.

However, western scrub jays may take hundreds to thousands of seeds, hiding them in soil to eat later in the year.  The seeds they fail to relocate eventually germinate, leading to more piñon pines in quieter areas.

“Fewer seedlings in noisy areas might eventually mean fewer mature trees, but because piñon pines are so slow-growing the shift could have gone undetected for years,” Francis said in the press release.

“Fewer piñon pine trees would mean less critical habitat for the hundreds of species that depend on them for survival.”

Image Caption: Human noise affects plants such as piñon pine, whose seed-dispersers avoid the clamor. Credit: Clinton Francis

Monarch Butterfly Population Continues To Decline

A Texas A&M researcher has found evidence that the population of Monarch butterflies continues to shrink.

Craig Wilson is a senior research associate in the Center for Mathematics and Science Education and a longtime butterfly enthusiast. Citing reports from the World Wildlife Fund, Mexico’s Michoacan State, and a host of private donors, he says the number of Monarch butterflies crossing the state of Texas will be dramatically reduced this year, by as much as 30%.

These numbers are part of a decades-long decline in Monarch butterfly populations. The long, downward trend has Wilson concerned; he says it would be best “that we take the long view rather than yearly cycles.”

In a press release, Wilson stated: “The latest information shows that Monarchs will be down from 25 to 30 percent this year, and that has been part of a disturbing trend the last few years.”

“Last year’s severe drought and fires in the region no doubt played a part, resulting in less nectar for the Monarchs as they migrated south. But estimates show that each year, millions of acres of land are being lost that would support Monarchs, either by farmers converting dormant land for crop use — mainly to herbicide tolerant corn and soybeans — or the overuse of herbicides and mowing. Milkweed is the key plant because it’s the only plant where the female will lay her eggs.”

Researchers have been keeping a close eye on the extreme land conditions in Texas and the effects these conditions have on the Monarch butterfly population.

“Chip Taylor, who is the director of Monarch Watch at the University of Kansas, estimates that 100 million acres of land have already been lost that previously supported Monarchs,” Wilson notes.

The majority of the Monarch reserves are located in Michoacan, Mexico, according to Wilson. These butterflies will spend the winter months in this area and mate before making their grand migration north. When the spring arrives, the Monarchs leave Mexico and fly through Texas, laying their eggs in the milkweed plants and feeding on nectar. Adult Monarchs will take various routes through Texas as they make their way north. The offspring of these butterflies will travel much further north, often as far as Canada.

According to Wilson’s research and data from Texas Monarch Watch, there will be fewer butterflies to make their journey across the Lone Star State. Last year, Monarch butterfly breeding grounds covered 9.9 acres of forest in the Michoacan State of Mexico. This year, that number was down to 7.14 acres, reinforcing a downward trend that has been occurring since official population surveys began in 1994.
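The acreage figures above let us check the size of the drop directly, and it lines up with the 25 to 30 percent estimate quoted earlier:

```python
# Overwintering acreage in Michoacan, from the surveys cited above.
last_year_acres = 9.90
this_year_acres = 7.14

decline = (last_year_acres - this_year_acres) / last_year_acres
print(f"{decline:.1%}")  # 27.9%, within the quoted 25-30% range
```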

These disturbing numbers have Wilson calling for a national effort to save the Monarchs before their numbers dwindle even further.

“We need a national priority of planting milkweed to assure there will be Monarchs in the future,” he says. “If we could get several states to collaborate, we might be able to promote a program where the north-south interstates were planted with milkweed, such as Lady Bird Johnson’s program to plant native seeds along Texas highways 35-40 years ago. This would provide a ‘feeding’ corridor right up to Canada for the Monarchs.”

It’s Official: Warm Weather Causes Flowers To Bloom

Scientists have completed research that attempts to explain what we see in our parks and gardens every year: with the onset of warmer weather come blooming flowers and trees.

Funded by the Biotechnology and Biological Sciences Research Council (BBSRC), scientists from the John Innes Centre have investigated the effects of warm weather and global climate change on flowering plants and trees. Their findings will soon be published in the journal Nature.

The research identified a control gene, called PIF4, whose job it is to activate a flowering pathway, causing the flowers to bloom. When the temperature gets below a certain point, the gene is unable to act.

“What is striking is that temperature alone is able to exert such specific and precise control on the activity of PIF4,” said Dr. Phil Wigge, the lead scientist on the study.

This is not the first time scientists have looked at PIF4 in plants. Previously, the gene had been shown to control other plant responses to warmth, particularly growth. With this new research, the scientists have found this same gene is also responsible for activating flowering when the temperatures turn warm.

When plants flower, a special molecule called Florigen is activated. Florigen can be activated by many signals, like the longer days that accompany spring. While some plants rely more heavily on temperature to spark their flowering and leaf emergence, others depend on longer days to start their new life cycles.

The effect of daylight on Florigen has already been documented. This study is the first of its kind to show how temperature, rather than daylight, acts as a Florigen activator.

Plants can still flower when temperatures are cooler, albeit through other pathways. At lower temperatures, PIF4 does not bind to the flowering molecule Florigen and therefore does not accelerate flowering. When temperatures rise, PIF4 binds to Florigen and plants flower more quickly via the PIF4 pathway.

“Our findings explain at the molecular level what we observe in our gardens as the warmer temperatures of spring arrive,” said Wigge.

“It also explains why plants are flowering earlier as a result of climate change.”

By unlocking these mysteries and understanding why plants flower and bloom when they do, Wigge and his colleagues hope to develop crops that will be resistant to climate changes and fluctuations. When crops react strongly to warmer temperatures, their yield is reduced. By understanding at the molecular level how these plants react to temperature and when they begin to bloom and mature, the team hopes they can breed hardier and more resilient crops.

According to NOAA’s 2008 State of the Climate report, Wigge’s research may be coming at just the right time. The report shows the Earth is warming at a rate of 0.29 degrees Fahrenheit per decade, and that the average surface temperature has warmed about 1 degree Fahrenheit since 1970. In fact, the eight warmest years on record have occurred in the last 11 years, with the title of warmest year going to 2005.
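Using only the figures cited above, a quick check shows the report's per-decade rate and its cumulative warming figure are roughly consistent with each other:

```python
# Figures from NOAA's 2008 State of the Climate report, as cited above.
rate_per_decade = 0.29            # degrees F per decade
start_year, report_year = 1970, 2008

decades = (report_year - start_year) / 10    # 3.8 decades
implied_warming = decades * rate_per_decade
print(f"{implied_warming:.1f} F")  # 1.1 F, close to the ~1 F quoted
```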

“Knowing the key players in the temperature response pathways will be a valuable tool for safeguarding food security in an era of climate change,” said Wigge.