New York City Raises Legal Smoking Age To 21

Michael Harper for redOrbit.com – Your Universe Online

The New York City Council has raised the legal smoking age from 18 to 21. Adults must now be of legal drinking age before they can buy cigarettes, electronic cigarettes or any tobacco products. New York City is the largest city to increase the smoking age from 18 to 21. In an additional measure, the city also increased the minimum cost of a pack of cigarettes to $10.50 and banned the discounting of tobacco products.

This is the latest of several attempts by Mayor Michael Bloomberg to put tight restrictions on smoking in the city. Previously, he worked to kick cigarettes out of bars and restaurants and to raise taxes on packs of cigarettes and other smokeables sold in the city. In an interesting twist, Bloomberg’s measure did not cover the sale of bongs or pipes used to smoke marijuana.

“By increasing the smoking age to 21 we will help prevent another generation from the ill health and shorter life expectancy that comes with smoking,” said Bloomberg. “It’s critical that we stop young people from smoking before they ever start.”

Bloomberg now has 30 days to sign the bill into law; once signed, it will take effect 180 days later.

Though Bloomberg’s office praises the prior attempts to squash smoking in the city, many people – smokers and non-smokers alike – view these efforts as yet another step taken towards a Nanny State. Bloomberg’s previous attempt to ban large soda containers was quite controversial and even brought non-New Yorkers into the fray to oppose the measure.

According to City Hall, the number of smokers in New York fell from 21.5 percent to 14.8 percent between 2002 and 2011. Bloomberg’s office is expecting similar results from this measure, saying they expect the number of 18-20 year-old smokers to drop by more than half. Their previous measures have already worked to reduce the number of high school smokers by 8.5 percent between 2001 and 2011.

Those who disagree with the new measure – most notably cigarette manufacturers – say it will only drive young adults to find their smokes on the black market and create even more problems for the city.

“New York City already has the highest cigarette tax rate and the highest cigarette smuggling rate in the country,” said Bryan D. Hatchell, a spokesman for R.J. Reynolds Tobacco Company, in a statement to ABC News. “Those go hand in hand and this new law will only make the problem worse.”

As noted by the New York Post, there’s some discrepancy between the legal tobacco smoking age and the age at which young adults can buy marijuana paraphernalia. Though it’s illegal to sell pipes, water bongs and other smoking accessories that can be used with marijuana, shops are able to sell these goods if they claim they’re for tobacco use only. This means an 18-year-old New Yorker is now able to legally buy a pipe to smoke marijuana but cannot buy a cigar.

Bloomberg’s proposed ban on sugary sodas was also met with some criticism and was ultimately struck down by New York’s Supreme Court. Justice Milton Tingling said the measure was “fraught with arbitrary and capricious consequences.”

Brain Wiring Lets Babies Learn Through Imitation

Brett Smith for redOrbit.com – Your Universe Online

Since we weren’t able to talk during the first couple years of our lives, we had to learn how to make simple movements by watching our parents and imitating them. New research from Temple University and the University of Washington published in the journal PLOS ONE has revealed some of the neural mechanisms behind how babies learn through imitation.

“Babies are exquisitely careful people-watchers, and they’re primed to learn from others,” said study author Andrew Meltzoff, co-director of the UW Institute for Learning & Brain Sciences. “And now we see that when babies watch someone else, it activates their own brains. This study is a first step in understanding the neuroscience of how babies learn through imitation.”

In the study, the researchers found that babies’ brains showed specific activity patterns when an adult touched a toy with different parts of her body. When the 14-month-old babies watched a researcher’s hand touch a toy, the hand region of the child’s brain became active. When another group of infants saw a researcher touch the same toy with her foot, the foot area of the child’s brain lit up.

Because each body part has an identifiable section of neural ‘real estate’ in the brain, the scientists were able to clearly correlate brain activity to specific limbs. Previous research has found that a corresponding part of the adult brain activates while watching someone else use a specific body part, and the study team wondered if the same would be true for babies.

For the study, 70 infants were outfitted with electroencephalogram (EEG) caps, which have embedded sensors that detect brain activity in the various motor regions of the brain. While seated on a parent’s lap, each child observed an experimenter touching a toy on a low table. When the researcher pressed the toy’s clear plastic dome with her hand or foot, music sounded and confetti in the dome spun around. The researcher repeated the actions until the baby lost interest.

“Our findings show that when babies see others produce actions with a particular body part, their brains are activated in a corresponding way,” said Joni Saby, a psychology graduate student at Temple University in Philadelphia. “This mapping may facilitate imitation and could play a role in the baby’s ability to then produce the same actions themselves.”

To copy what they see adults do, babies must first know which body part is needed to replicate an observed behavior. The new study indicated that babies’ brains are organized in a way that helps crack that code.

“The reason this is exciting is that it gives insight into a crucial aspect of imitation,” said study author Peter Marshall, an associate psychology professor at Temple University. “To imitate the action of another person, babies first need to register what body part the other person used. Our findings suggest that babies do this in a particular way by mapping the actions of the other person onto their own body.”

“The neural system of babies directly connects them to other people, which jump-starts imitation and social-emotional connectedness and bonding,” Meltzoff added. “Babies look at you and see themselves.”

Extremely Rare Hybrid Solar Eclipse To Occur On Sunday

[ Watch the Video: Sunday Solar Eclipse Will Be A Rare Hybrid ]
redOrbit Staff & Wire Reports – Your Universe Online
Parts of eastern North America, northern South America, southern Europe, the Middle East and several other parts of the world will be able to experience a unique type of solar eclipse this Sunday, November 3.
The event is known as a hybrid solar eclipse, and according to Deborah Byrd and Bruce McClure of EarthSky, this type of event “appears fleetingly as an annular – or ring eclipse – at its start and becomes a brief total eclipse later on.” However, many parts of the world will see a partial eclipse sometime between sunrise and sunset.
Byrd and McClure report that the eclipse will be visible to those living in far-eastern North America, the Caribbean, northern South America, southern Greenland, the Atlantic Ocean, southern Europe, Africa, Madagascar and the Middle East. Proper protection will be necessary when observing the event to avoid potential injury or blindness.
Provided skies are clear enough, a partial solar eclipse will be visible in eastern North America beginning at sunrise on Sunday. From that location, as well as the Caribbean and the northwestern tip of South America, the eclipse will appear as an extremely shallow and shrinking partial solar eclipse, the EarthSky writers said.
“Seen from further west in the US and Canada, the sun will rise with the eclipse nearly over. Assuming you have a flat horizon and good sky conditions, the western limit of the event’s visibility runs through southern Ontario, Ohio, Kentucky, Tennessee, Alabama, and the Florida Panhandle,” they added.
Those living west of those areas will not be able to catch a glimpse of the event, Byrd and McClure said. The partial eclipse will be visible until 7:12am EST Sunday morning in Montreal; 7:11am EST in New York; 7:08am EST in Raleigh, North Carolina; 7:02am EST in Miami; 7:00am EST in Havana, Cuba; 6:52am local time in Cartagena, Colombia. It will be visible during the afternoon hours in Europe, Africa and the Middle East.
“Equatorial Africa views short-lived total solar eclipse in afternoon hours November 3,” the reporters said. “Well over 99.9 percent of the eclipse viewing area will see varying degrees of a partial solar eclipse. On land, a total solar eclipse will be visible along a very narrow track in equatorial Africa (Gabon, Congo, Democratic Republic of the Congo, Uganda, Kenya, Ethiopia and Somalia) sometime during the afternoon hours on Sunday, November 3. At best, the total eclipse will last somewhat more than one minute (in western Gabon).”
This will be the fifth eclipse and the second solar eclipse of 2013, according to Sean Breslin of The Weather Channel. So how rare an event is this? Of the approximately 12,000 solar eclipses recorded since 1999 BC, fewer than five percent have been hybrid eclipses, he added.

In Pursuit Of The Future Internet

Enid Burns for redOrbit.com – Your Universe Online

The Internet continues to evolve, and engineers are working on new infrastructure that could facilitate better service. A new European Union-funded project called “Pursuit” is working on new protocols that will allow content to be accessed on a peer-to-peer basis, rather than requiring access to a central server.

The “Pursuit” project is being developed as a proof-of-concept model for overhauling the existing structure of the Internet Protocol (IP) layer. Currently the structure requires isolated networks to be connected, which the researchers call “internetworked.”

Pursuit Internet, as engineers at the University of Cambridge are calling it, will enable more socially minded and intelligent system interactions. The new system will allow users to obtain information without needing direct access to the servers where the content is initially stored. By sharing information directly from one user’s computer to another, the researchers argue, users gain more control over their information. One example, provided in an animated video, is of a patient wearing a heart rate monitor that must be connected to the Internet for the doctor to access data. Using Pursuit Internet, the patient can allow access to just the doctor and emergency care workers. Such fine-grained control is not possible today, the researchers explain, because the patient’s information and privacy settings are stored on a server rather than on the patient’s own computer.

The new system allows full or partial access to content. “Individual computers would be able to copy and republish content on receipt, providing other users with the option to access data, or fragments of data, from a wide range of locations rather than the source itself. Essentially, the model would enable all online content to be shared in a manner emulating the ‘peer-to-peer’ approach taken by some file-sharing sites, but on an unprecedented internet-wide scale,” a document on the research said.

The engineers behind Pursuit Internet believe such an infrastructure would make the Internet faster, more efficient, and more capable of withstanding rapidly escalating levels of global user demand. Information delivery would also not be prone to server crashes.

While a peer-to-peer model might allow faster access to information published on the Internet, there are limitations. If data is stored on an individual’s computer – whether it is medical information, a website or other content – access to it depends on that computer being on and operational.

Engineers working on the prototype argue that Pursuit Internet focuses on information rather than web addresses or URLs to access content, which makes digital content more secure. “They envisage that by making individual bits of data recognizable, that data could be ‘fingerprinted’ to show that it comes from an authorized source,” the report said.

Dr. Dirk Trossen, a senior researcher at the University of Cambridge Computer Lab, serves as technical manager for Pursuit. “The current internet architecture is based on the idea that one computer calls another, with packets of information moving between them, from end to end. As users, however, we aren’t interested in the storage location or connecting the endpoints. What we want is the stuff that lives there,” he said.

“Our system focuses on the way in which society itself uses the Internet to get hold of that content. It puts information first. One colleague asked me how, using this architecture, you would get to the server. The answer is: you don’t. The only reason we care about web addresses and servers now is because the people who designed the network tell us that we need to. What we are really after is content and information,” Dr. Trossen continued.

The Pursuit team won the Future Internet Assembly (FIA) award in May of this year after demonstrating applications that will benefit from Pursuit Internet, such as searching for and retrieving information online.

Pursuit Internet removes the need for a website URL, or Uniform Resource Locator. Instead, it replaces the URL with a URI, or Uniform Resource Identifier. These URIs are highly specific identifiers that enable the system to work out what the information or content is. If enacted, Pursuit Internet will change the way in which information is routed and forwarded online.
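The article does not describe Pursuit’s actual protocol stack, but the core idea of naming content rather than server locations can be sketched in a few lines of Python. Everything below (the pursuit:// prefix, the Node class and the tiny in-memory network) is invented for illustration and is not Pursuit’s real design; it only shows how a content fingerprint lets any node that holds a copy answer a request.

import hashlib

# Illustrative sketch only: content is named by a fingerprint of its bytes,
# so a request can be satisfied by any node holding a copy rather than by
# one fixed server. Pursuit's real identifiers and routing are more involved.

def make_uri(content: bytes) -> str:
    """Derive a location-independent identifier from the content itself."""
    return "pursuit://" + hashlib.sha256(content).hexdigest()

class Node:
    def __init__(self, name: str):
        self.name = name
        self.store = {}  # URI -> content held by this node

    def publish(self, content: bytes) -> str:
        uri = make_uri(content)
        self.store[uri] = content
        return uri

def fetch(uri: str, nodes: list) -> bytes:
    """Ask each reachable node for the URI; the first copy found is returned."""
    for node in nodes:
        if uri in node.store:
            print("served by " + node.name)
            return node.store[uri]
    raise KeyError("no reachable node currently holds this content")

# A neighbour that has already received the content can republish and serve it.
origin, neighbour = Node("origin-server"), Node("neighbour-laptop")
uri = origin.publish(b"popular video bytes")
neighbour.store[uri] = origin.store[uri]      # republish on receipt
print(fetch(uri, [neighbour, origin]))        # served by neighbour-laptop

In this toy model the neighbour satisfies the request, much as Trossen describes below, without the requester ever naming the original server.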

“Under our system, if someone near you had already watched that video or show, then in the course of getting it their computer or platform would republish the content,” Trossen said. “That would enable you to get the content from their network, as well as from the original server.”

“Widely used content that millions of people want would end up being widely diffused across the network. Everyone who has republished the content could give you some, or all of it. So essentially we are taking dedicated servers out of the equation,” Trossen continued.

The pursuit of Pursuit Internet might turn the Internet into a large peer-to-peer network.

“With a system like the one we are proposing, the whole system becomes sustainable,” Trossen said. “The need to do something like this is only going to become more pressing as we record and upload more information.”

Eating At A Dinner Table Linked To Lower BMI

Brett Smith for redOrbit.com – Your Universe Online

Every night, millions of American families sit down to eat dinner – where they do it could be an indicator of their physical health. According to a new study published in the journal Obesity, adults and children who sit down together to eat supper as a family tend to have a lower body mass index (BMI) than those who don’t.

In the study, two researchers from Cornell University looked at the relationship between daily family dinner habits and the BMI of 190 parents and nearly 150 children. Other studies have shown that lifestyle factors like physical activity and income are associated with BMI.

Adult participants in the study completed a survey on their entire family’s mealtime habits. They answered a wide range of questions about mealtime activities, such as talking about someone’s work or school day, during a normal week. After the surveys were completed, researchers calculated participants’ BMI, which is based on height and weight.
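For reference, BMI follows the standard formula (not spelled out in the article): weight in kilograms divided by the square of height in meters,

\mathrm{BMI} = \frac{\text{weight (kg)}}{\text{height (m)}^{2}}, \qquad \text{e.g. } \frac{70}{1.75^{2}} \approx 22.9,

so a 70 kg adult who is 1.75 m tall falls within the normal range.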

The Cornell researchers then correlated the BMIs with mealtime habits to see if any patterns emerged. They found that the higher the BMI of parents, the more likely they were to eat with the TV on. Eating at either a dining room or kitchen table in the home was associated with lower BMIs for both children and adults.

The researchers also found that girls who assisted their parents with dinner prep tended to have a higher BMI, but no such relationship was seen for boys. Boys who were reported as having a more social dinner experience were more likely to have lower BMI, especially in families where everyone sat at the communal table until everyone finished eating. This relationship was found in parents as well.

“The ritual of where one eats and how long one eats seems to be the largest driver,” said Brian Wansink, director of the Cornell Food and Brand Lab. “In fact, eating anywhere other than the kitchen or dining room was related to higher BMIs in both parents and in children.”

The Cornell team emphasized that the links they found between BMI and mealtime habits do not necessarily signify any cause-and-effect relationship. They added that their results emphasize the importance of the social sharing of a meal as a family. These familial interactions may have some kind of positive psychological effect on overeating habits, the researchers speculated.

In a press release, the Cornell food lab said family meals and their rituals might be an overlooked battleground for fighting obesity.

“If you want to strengthen your family ties and, at the same time keep a slimmer figure, consider engaging in a more interactive dinner experience,” the lab suggested. “A good place to start would be to eat together with the television off and then asking the kids to list their highlights of the day. After all, the dinner table does not just have to be a place where food gets eaten!”

Some previous studies have indicated that children begin to develop their social skills around a dinner table, providing one more reason to pick up a communal dinner habit.

Adobe Breach Affected 38 Million, Not 3 Million As Previously Thought

Michael Harper for redOrbit.com – Your Universe Online
Earlier this month Adobe acknowledged a security breach that left millions of customers’ names, passwords and encrypted credit and debit card information vulnerable. At the time, it was believed three million people were affected by the attack, but further digging by security researcher Brian Krebs has revealed the number is much higher. After finding a large file posted by the hacking group Anonymous this weekend, Krebs now says the number of people who had their information stolen from Adobe is closer to 38 million.
Adobe has now confirmed this number and says it only accounts for active users, and that millions of other IDs may have also been compromised. The attackers took more than customer information in their September attack — they also walked away with the source code to Adobe’s Acrobat and ColdFusion products. Now, Krebs says the attackers were also able to obtain the source code to Photoshop, one of the most popular pieces of software used by millions every day.
“So far, our investigation has confirmed that the attackers obtained access to Adobe IDs and (what were at the time valid), encrypted passwords for approximately 38 million active users,” said Adobe spokesperson Heather Edell in a statement to KrebsOnSecurity, Brian Krebs’ security blog.
“We have completed email notification of these users. We also have reset the passwords for all Adobe IDs with valid, encrypted passwords that we believe were involved in the incident—regardless of whether those users are active or not.”
Originally it was believed the Photoshop source code was safe from this attack, but a file posted online this weekend appears to contain that source code. Now Edell says the code was indeed stolen and that the company has been asking the sites hosting it to take it down. AnonNews, the news website run by Anonymous, has been hosting the code.
“Our investigation to date indicates that a portion of Photoshop source code was accessed by the attackers as part of the incident Adobe publicly disclosed on Oct. 3,” said Edell.
In the wrong hands, the source code to any piece of software could become extremely dangerous. With this code, cybercriminals can write a specially designed virus meant to take advantage of the software or even work in tandem with it. This means anyone using Photoshop is potentially at risk of having their computer infected with a malicious virus.
When Adobe announced the breach at the beginning of the month, they offered those customers who had their credit card information leaked free credit monitoring services. Ironically, the credit monitoring service is offered through Experian, itself a victim of a recent attack. A website called Superget.info once sold the private information of innocent people, including birthdays, Social Security numbers, driver’s licenses, and more. A few weeks ago it was discovered the owners of this website managed to purchase this information directly from Experian.
The number of affected users has already grown dramatically, and it could climb higher still.
“Our investigation is still ongoing,” said Edell in the statement. “We anticipate the full investigation will take some time to complete.”

Massive Underground Dark Matter Detector Comes Up Empty Handed

Brett Smith for redOrbit.com – Your Universe Online

Just three months into its operation, the Large Underground Xenon (LUX) experiment is already the most sensitive dark matter detector in the world, according to scientists from Brown University.

“LUX is blazing the path to illuminating the nature of dark matter,” said Rick Gaitskell, professor of physics at Brown University and project spokesperson.

The detector sits more than a mile underground at the Sanford Underground Research Facility in South Dakota. Deep in the Earth, the device is better able to detect the uncommon, weak interactions between dark matter particles and ordinary matter, according to project researchers. The results of the project’s initial 90-day run were announced during a seminar on Wednesday at the Sanford Lab in Lead, SD.

“What we’ve done in these first three months of operation is look at how well the detector is performing, and we’re extremely pleased with what we’re seeing,” Gaitskell said. “This first run demonstrates a sensitivity that is better than any previous experiment looking to detect dark matter particles directly.”

The leading theoretical candidates for dark matter particles are called weakly interacting massive particles (WIMPs). The LUX scientists said WIMPs could be either in a high-mass or low-mass form.

The results from the initial run suggest LUX has twice the sensitivity of any other dark matter detection experiment to detect high-mass WIMPs. The new results indicate potential detections of low-mass WIMPS by other experiments were probably the result of background radiation, not dark matter, the researchers said.

“There have been a number of dark matter experiments over the last few years that have strongly supported the idea that they’re seeing events in the lowest energy bins of their detectors that could be consistent with the discovery of dark matter,” Gaitskell explained. “With the LUX, we have worked very hard to calibrate the performance of the detector in these lowest energy bins, and we’re not seeing any evidence of dark matter particles there.”

The LUX researchers are set to begin a 300-day run that could either definitively find dark matter or rule out a vast area of space where it could have existed.

“Every day that we run a detector like this we are probing new models of dark matter,” Gaitskell said. “That is extremely important because we don’t yet understand the universe well enough to know which of the models is actually the correct one. LUX is helping to pin that down.”

To capture the incredibly rare interactions that would signify the presence of dark matter, the LUX workers have engineered a highly sensitive detector. The most important part of the LUX is a third of a ton of supercooled xenon in a tank outfitted with light sensors. These sensors are capable of detecting a single photon at a time. When a particle interacts with the xenon, it creates a minuscule flash of light and an ion charge, both of which can be seen by the sensors.

In order to minimize background interactions, the detector had to be protected from background radiation and cosmic rays, which is why the LUX sits nearly 4,900 feet underground, in 71,600 gallons of pure de-ionized water.

“LUX is a huge step forward. Within the first few minutes of switching it on, we surpassed the sensitivity of the first dark matter detectors I was involved with 25 years ago,” Gaitskell said. “Within a few days, it surpassed the sensitivity of the sum total of all previous dark matter direct search experiments I have ever worked on. This first LUX run is more sensitive than any previous search conducted and now sets us up perfectly for the 300-day run to follow.”

“We are very excited that our thesis work has culminated in this world-leading result,” said Jeremy Chapman, a graduate student and LUX researcher.

Monitor Stress Levels With New Smart Wristband From Airo

Enid Burns for redOrbit.com – Your Universe Online
Airo, a new addition to the smart wearables market, was introduced earlier this week and takes activity monitoring a step further. The newly developed wristband measures exercise, sleep, nutrition and, setting it apart from other recent smart wearables on the market, stress.
While a number of comparable fitness-monitoring bands measure activity, sleep and even nutrition to some degree, the Airo wristband is able to determine stress levels based on heart rate. The band also looks at nutrition based on wavelengths of light in the bloodstream, which is more advanced than other monitors on the market — some of which require the user to alert the monitor to food intake manually.
“We built Airo to help people become more proactive about their health. By aggregating data around the four pillars of health, Airo notices patterns in your behavior and tells you what you can do, each day, to live a healthier life,” said Abhilash Jayakumar, co-founder and CEO of Airo Health, in a corporate statement. “We’re excited about giving people the ability to take control of their health in a way that has never before been possible.”
The four pillars identified by Airo Health’s Jayakumar are nutrition, stress, exercise and sleep. The Airo wristband approaches the four pillars differently than other wristbands in the wearables market.
As for nutrition monitoring, Airo Health said the wristband is able to detect metabolites as they are released during and after eating. By looking at the bloodstream, Airo is able to measure caloric intake and is also able to gather data on the quality of the meal consumed. Using the data, Airo is also able to provide recommendations on ways to improve nutrition.
Airo has an advantage in the nutrition category over comparable devices already on the market. The Fitbit Force, which was recently announced by Fitbit and will sell for $129, requires users to manually enter meals as well as certain exercises, reports Business Insider.
Stress measurement is another feature that distinguishes the Airo wristband from other devices on the market. The Airo uses heart rate variability (HRV) to monitor micro-fluctuations in stress throughout the day. Airo notifies users when stress levels cross a personal threshold. The monitor also offers recommendations — most likely via a website — to help the user take steps to recover from stress.
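Airo Health has not published how it turns heart rate variability into a stress score, so the Python sketch below is only a generic illustration. RMSSD (the root mean square of successive differences between heartbeats) is one widely used HRV statistic, with lower values often read as higher physiological stress; the threshold value here is invented.

import math

# Generic HRV illustration, not Airo's algorithm. RMSSD is computed over
# RR intervals (the time between consecutive heartbeats, in milliseconds).

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between RR intervals."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def stress_alert(rr_intervals_ms, personal_threshold_ms=20.0):
    """Flag when variability drops below a per-user threshold (hypothetical)."""
    return rmssd(rr_intervals_ms) < personal_threshold_ms

# Fairly steady beats -> low variability -> flagged, in this toy model, as stress.
beats = [812, 805, 799, 801, 795, 790]
print(round(rmssd(beats), 1))   # 5.5
print(stress_alert(beats))      # True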
Exercise is where the wearable device market already excels, but the Airo approaches this type of measurement differently. Instead of tracking steps, the device monitors daily exertion through heart rate and caloric burn. Airo uses these data to gauge how intense the activity has been and will also check to see how the body has recovered from activity over the course of a few days.
Sleep tracking is a relatively new feature on several wearable devices, and the Airo wristband follows suit, working as a sleep monitor. It looks at the autonomic nervous system to identify sleep cycles. It recognizes when the body is in deep sleep, light sleep and REM sleep, and then calculates how much of a user’s night was restorative.
The newly announced Airo wristband is produced by Ontario, Canada-based Airo Health. The band is expected to become available in the fall of 2014 and the company is currently taking pre-orders for $149, increasing to $199 when the device is launched.
While its entry isn’t expected until fall, the Airo wristband is part of an emerging market of wearable technology that is experiencing rapid growth. A recent report from Juniper Research forecasts the wearable device market to reach $19 billion by 2018. The Airo wristband, which won’t be released for another year, will contribute to the industry, helping to reach those sales numbers.

Results Of The SAFE-PCI Trial Presented At TCT 2013

Novel study finds using radial versus femoral access during cardiac catheterization may have benefits in women

A clinical trial conducted exclusively in women suggests that an initial strategy of using the radial artery in the arm as the entry point for cardiac catheterization or percutaneous coronary intervention (PCI) in women has potential for reducing bleeding complications. SAFE-PCI for Women is the first registry-based randomized trial in the United States and the first multicenter trial comparing radial with femoral access in the U.S., and its primary findings were presented today at the 25th annual Transcatheter Cardiovascular Therapeutics (TCT) scientific symposium. Sponsored by the Cardiovascular Research Foundation (CRF), TCT is the world’s premier educational meeting specializing in interventional cardiovascular medicine.

Women are at particular risk for bleeding and vascular complications after PCI. While a transradial approach can potentially reduce these complications, this technique has never been prospectively studied in women. Women also have smaller radial arteries than men, making radial PCI potentially more challenging.

The SAFE-PCI for Women Trial randomized 1,787 women undergoing elective PCI, urgent PCI or diagnostic catheterization with possible PCI to either a radial or femoral approach. In a novel approach designed to minimize trial costs, the trial used prospectively gathered data collection instruments based upon the existing National Cardiovascular Research Infrastructure (NCRI), a clinical trial infrastructure created through collaboration between the National Heart, Lung, and Blood Institute (NHLBI), the American College of Cardiology, and the Duke Clinical Research Institute. It was also built on the NCDR CathPCI Registry®, the largest ongoing PCI registry in the world.

The primary efficacy endpoint was bleeding (BARC Types 2, 3 or 5) or vascular complications requiring intervention within 72 hours post-procedure or at hospital discharge, whichever came first. The primary feasibility endpoint was procedural failure defined as the inability to complete the PCI from the assigned access site.

After 1,120 patients had been randomized, 446 of whom had undergone PCI, review of data by the Data and Safety Monitoring Board (DSMB) showed that the primary efficacy event rate was markedly lower than expected. The DSMB recommended termination of the trial because the trial was unlikely to show a difference at the planned sample size. No harm was noted in either arm; therefore, the Steering Committee voted to continue the study until enrollment in a quality-of-life sub study was complete.

A total of 1,787 patients were randomized (893 to radial access, 894 to femoral access), of whom 691 underwent PCI (345 radial access and 346 femoral access). In the PCI group, bleeding and complication rates were 1.2 percent in the radial group compared to 2.9 percent in the femoral group (p=0.12). When assessing the overall cohort of randomized patients (those receiving both diagnostic procedures alone as well as those receiving PCI), bleeding and complication rates were 0.6 percent vs. 1.7 percent (p=0.03). The overall procedural failure rate was 6.7 percent in the radial group and 1.9 percent in the femoral group (p<0.001). Within the radial group, conversion to femoral access was often due to radial artery spasm (42.9 percent among the patients who converted).

“The SAFE-PCI for Women trial represents several ‘firsts.’ It is the first randomized trial of interventional strategies in women, the first multicenter randomized trial comparing radial with femoral access in the United States, and the first registry-based randomized trial in the U.S. The treatment benefit of radial access over femoral access was larger than expected (~60 percent) in both the PCI group and total randomized cohorts,” said lead investigator Sunil V. Rao, MD. Dr. Rao is an Associate Professor of Medicine at Duke University Medical Center.

“Findings suggest that an initial strategy of radial access is reasonable and may be preferred in women, with the recognition that a proportion of patients will require bailout to femoral access,” Dr. Rao added.

“As the first registry-based randomized trial in the United States, the SAFE-PCI for Women trial demonstrates a new paradigm shift for conducting efficient practical clinical trials using the National Cardiovascular Research Infrastructure. This trial construct is a promising approach for future clinical investigations.”

How A Metamaterial Might Improve A Depression Treatment

A brain stimulation technique that is used to treat tough cases of depression could be considerably improved with a new headpiece designed by University of Michigan engineers.

Computer simulations showed that the headpiece—a square array of 64 circular metallic coils—could one day help researchers and doctors hit finer targets in the brain that are twice as deep as they can reach today, and without causing pain.

In transcranial magnetic stimulation, special coils create a fluctuating magnetic field that then generates a weak electrical field that can travel through the scalp and skull noninvasively. The electrical signal activates neurons in targeted parts of the brain—a complex electrical network itself.
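The article does not spell out the physics, but the coupling it describes is ordinary electromagnetic induction: a rapidly changing magnetic field induces an electric field in the tissue according to Faraday’s law,

\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t},

which is why pulsing current through coils held against the head can drive currents in neurons centimeters below the scalp without any electrode touching the skin.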

Exactly how the technique alleviates depression isn’t well understood, but it tends to reduce symptoms in roughly half of patients who don’t respond to antidepressants. It has been an FDA-approved depression treatment since 2008, but researchers say the technology is still relatively crude.

It can send signals only 2 centimeters into the brain before it causes uncomfortable muscle contractions in a patient’s scalp. It’s not the depth that causes the contractions. It’s the relatively large focal spot required to go in that far.

“I started working on transcranial magnetic stimulation a while ago and realized the technology was very limited,” said Luis Hernandez-Garcia, a research associate professor of biomedical engineering and co-author of a paper on the work published in the October edition of IEEE Transactions on Biomedical Engineering.

“It was an uphill battle to get it where I wanted. If you wanted to reach deep in the brain, you also had to stimulate a lot of other brain regions that you really didn’t want to stimulate.”

To treat depression more effectively, it’s been hypothesized that the signal should reach beyond 2 centimeters. In simulations, at 2.4 centimeters, the new system excited 2.6 times less unwanted brain volume than today’s systems. It can go deeper as well.

“That improvement isn’t marginal,” said co-author Eric Michielssen, a professor of electrical engineering and computer science. “This should open up a lot of opportunities to treat depression and other mental illness, as well as probe the brain.”

This type of neural stimulation is also a tool in the cognitive neuroscience field where researchers study how brain biology leads to thoughts and actions. For example, a neuroscientist can use it to activate regions of the brain thought to be responsible for hand movement and then watch for a hand response to confirm the hypothesis. To advance this field of study, scientists need the ability to zero in on smaller parts of the brain.

“Our coils will enable neuroscientists to further understand brain function by achieving a finer resolution and stimulating regions deeper in the brain,” said Luis Gomez, a doctoral student in electrical engineering and computer science and first author of the paper. “It could help us figure out which way information flows inside neural networks, and ultimately understand how the brain works.”

The headpiece design is a big departure from today’s figure 8-shaped devices made of just two coils. The researchers knew that in order to send a sharper signal, they’d have to change the shape of the fluctuating magnetic field the coils produce. To design a device that could do that, they worked with Anthony Grbic, an associate professor of electrical engineering and computer science who specializes in metamaterials—a class of substances tailored to exhibit specific electromagnetic properties that can, for example, focus light to a point smaller than its wavelength. Grbic suggested that a surface of loops could do the job.

“These coil arrays are sub-wavelength structures—textured devices designed to manipulate the magnetic near-field in ways that people have never imagined before,” said Grbic, the Ernest and Betty Kuh Distinguished Faculty Scholar.

The prototype needs only one power source, as opposed to 64. Other so-called multichannel arrays require a power source for each coil. Having just one would make it easier to use on patients and more affordable.

California Redwoods Hold Hidden History Of Coastal Climate

[ Watch the Video: Redwoods Tell The Tale Of Past Climate Change ]

redOrbit Staff & Wire Reports – Your Universe Online

Giant redwood trees can provide a glimpse into historic coastal climate conditions using a novel method for analyzing the wood’s oxygen and carbon content to detect fog and rainfall in previous seasons, according to a study published in the Journal of Geophysical Research: Biogeosciences.

Study leader Jim Johnstone at the University of Washington developed the new method, and used the cores from Northern California coastal redwoods to trace the climate back 50 years.

“This is really the first time that climate reconstruction has ever been done with redwoods,” Johnstone said. Weather records corroborate Johnstone’s findings, indicating that the method is accurate and suggesting that it could be used to track conditions throughout the thousand or more years of the redwoods’ lifetime.

Coastal redwoods are not the longest-lived trees on the West Coast, but they do contain unique information about their foggy surroundings.

“Redwoods are restricted to a very narrow strip along the coastline,” Johnstone explained. “They’re tied to the coastline, and they’re sensitive to marine conditions, so they actually may tell you more about what’s happening over the ocean than they do about what’s happening over land.”

Many people have used tree rings as a window into the past, but until now redwoods were seen as too erratic in their growth patterns to be used to reconstruct historic climate. Tree-ring research, or dendrochronology, typically involves a detailed look at a cross-section of a tree trunk. However, the rings of redwoods are uneven and sometimes fail to fully encircle the tree, making them poor candidates for anything except detecting historic fires.

By contrast, Johnstone’s painstaking approach is more akin to processing ice cores, because it uses the molecules captured in the wood to sample the atmosphere of the past. Most oxygen in Earth’s atmosphere has an atomic mass of 16, making it O-16, but a small percentage of oxygen is the heavier O-18 isotope. When seawater evaporates off the ocean to form clouds, some drops fall as rain over the ocean, and more of the heavier O-18 molecules rain out. As a result, the remaining drops that fall on land have a higher percentage of the lighter O-16 molecules. Fog, on the other hand, forms near the shore and blows onto land, where it drips down through the branches until the trees take it up like rainwater.

By analyzing the proportion of O-16 and O-18 in the wood from each season, Johnstone and his team were able to measure the contribution of fog and rain. The researchers examined the spring growth from April to June, as well as the fall growth from August to October, and also analyzed carbon atoms to measure the total amount of moisture in the air.
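The article does not give the exact measure used, but oxygen isotope ratios of this kind are conventionally reported in delta notation relative to a standard,

\delta^{18}\mathrm{O} = \left( \frac{(^{18}\mathrm{O}/^{16}\mathrm{O})_{\text{sample}}}{(^{18}\mathrm{O}/^{16}\mathrm{O})_{\text{standard}}} - 1 \right) \times 1000 \ \text{per mil},

and by the logic described above, wood built from fog water, which retains more of the heavier O-18, should show higher values than wood built from rainwater.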

“We actually have two indicators that we can use in combination to determine if a particular summer was foggy with a little rain, foggy with a lot of rain, and various combinations of the two,” Johnstone said.

Related research by Johnstone shows that the amount of West Coast fog is closely tied to the surface temperature of the ocean, meaning redwoods may reveal information about long-term patterns of ocean change, such as the Pacific Decadal Oscillation. Understanding these natural variability cycles could also help to better distinguish natural from human-caused climate change.

“It’s possible that the redwoods could give us direct indication of how that’s worked over longer periods,” Johnstone said. “This is just a piece that contributes to that understanding in a pretty unique place.”

Pancreatic Beta Cells May Be Potential New Target For Diabetes Treatment

redOrbit Staff & Wire Reports – Your Universe Online

A fat recycling system contained within pancreatic beta cells helps regulate the amount of insulin they secrete, making them a potential target for new and novel ways to treat diabetes, according to new research appearing online in the journal Diabetologia.

The pancreas is the organ that produces the precise amount of insulin needed by our bodies when we eat, but when a person develops diabetes, the production of the hormone begins to slow down, the study authors explain.

A small structure inside the pancreatic beta cells, known as the lysosome, helps break down unwanted fats and proteins so that they can be re-used. The study authors call it “an intracellular recycling unit,” and they found that when lysosomes were prevented from breaking down fat, the beta cells actually secreted more insulin.

According to study authors Gemma Pearson and Trevor Biden of the Garvan Institute of Medical Research in Australia, their work is in its earliest stages. However, thus far, the results appear promising, and could provide the medical community with a new approach for treating diabetes.

“There are many different ways fats can be used within the beta cell – so if you stop them being recycled, you force them to be used in a different way,” Pearson, a doctoral student, said in a statement Tuesday. “When you shift fats from the lysosome, you store them in other parts of the cell, and they become available to participate in various signaling pathways. One of these pathways clearly increases insulin secretion.”

“Fat molecules… can bind to proteins and activate them, causing a range of downstream events to occur,” she added. “The good thing about this particular pathway is that it is only stimulated by glucose. That limits the beta cell to producing excess insulin only to deal with food, rather than around the clock. Too much insulin circulating in the blood, or hyperinsulinaemia, can be very detrimental to health in many respects.”

Should a drug ultimately be developed in order to block fat degradation from occurring in the lysosome, Pearson explained that it would have to be fine-tuned so that it only had an effect on beta cells.

Earlier this month, researchers from the Tufts Medical Center announced that they had received a $40 million grant from the National Institutes of Health (NIH) in order to settle the ongoing debate as to whether or not vitamin D can be beneficial in treating diabetes.

According to Dr. Anastassios Pittas, who is leading the new study, there is currently little clinical evidence suggesting that the supplement could help prevent or delay type 2 diabetes. He hopes that his study, which will span several years’ time and feature 2,500 participants, will finally put an end to the debate.

“Past observational studies have suggested that higher levels of vitamin D may be beneficial in preventing type 2 diabetes, but until this large, randomized and controlled clinical trial is complete, we won’t know if taking vitamin D supplements lowers the risk of diabetes,” Dr. Pittas said in a statement.

Rhinovirus C Structure Has Inhibited Search For Common Cold Cure

[ Watch the Video: Human Rhinovirus C15 Model Suggests Novel Topography ]

redOrbit Staff & Wire Reports – Your Universe Online

Thanks to the genetic sequencing of the so-called “missing link” cold virus, researchers have created a three-dimensional model of the pathogen that sheds new light on why there is currently no cure for the common cold.

Writing in Monday’s edition of the journal Virology, University of Wisconsin-Madison biochemistry professor Ann Palmenberg and her colleagues created a topographical model of the protein shell (or capsid) of rhinovirus C, a type of cold virus that was unknown to the medical community until seven years ago.

According to the study authors, rhinovirus C is believed to be responsible for as many as half of all colds among children, and is also considered to be a serious complicating factor for asthma and other respiratory ailments. Rhinovirus C, along with the A and B versions of the same pathogen, is reportedly responsible for millions of illnesses each year, with an annual cost topping $40 billion in the US alone.

The research is described as important, as it creates an exceptionally detailed structural model of the cold virus and demonstrates that it has a different protein shell than other strains of cold viruses. Palmenberg said that the discovery “explains most of the previous failures of drug trials against rhinovirus.”

“The A and B families of cold virus, including their three-dimensional structures, have long been known to science as they can easily be grown and studied in the lab,” the university explained in a statement. “Rhinovirus C, on the other hand, resists culturing and escaped notice entirely until 2006 when ‘gene chips’ and advanced gene sequencing revealed the virus had long been lurking in human cells alongside the more observable A and B virus strains.”

The new cold virus model was constructed using a computer simulation, and took advantage of both advanced bioinformatics and the genetic sequences of 500 rhinovirus C genomes – the latter of which provided the three-dimensional coordinates of the viral capsid, the investigators said.

Previously, pharmaceutical firms attempting to design drugs to combat the common cold had little to work with due to the lack of a 3D rhinovirus C model. Palmenberg, who was a member of the team that first mapped the genomes of all known common cold viruses in 2009, called it a “very high-resolution model” that “fits the data.”

“With a structure in hand, the likelihood that drugs can be designed to effectively thwart colds may be in the offing,” the university said. “Drugs that work well against the A and B strains of cold virus have been developed and advanced to clinical trials. However, their efficacy was blunted because they were built to take advantage of the surface features of the better known strains, whose structures were resolved years ago through X-ray crystallography, a well-established technique for obtaining the structures of critical molecules.”

Because all three cold virus strains contribute to the illness, potential medicines developed to combat the common cold have failed: drug makers did not fully understand the surface features that allow rhinovirus C to dock with host cells and evade a person’s immune system.

Those features are different in rhinovirus C than they are in A and B, and none of the drugs tested by the research team were found to be effective. For that reason, the authors have concluded that pharmaceutical firms will need to develop a substance that specifically targets rhinovirus C – a feat which should now be possible, thanks to the 3D rhinovirus C model produced through their efforts.

Image caption: Two faces of the common cold. The protein coat of the “missing link” cold virus, Rhinovirus C (right), has significant differences from the more observable and better studied Rhinovirus A. Those differences explain why no effective drugs have yet been devised to thwart the common cold.

Using Genetic Algorithms To Discover New Nanostructured Materials

Researchers at Columbia Engineering, led by Chemical Engineering Professors Venkat Venkatasubramanian and Sanat Kumar, have developed a new approach to designing novel nanostructured materials through an inverse design framework using genetic algorithms. The study, published in the October 28 Early Online edition of Proceedings of the National Academy of Sciences (PNAS), is the first to demonstrate the application of this methodology to the design of self-assembled nanostructures, and shows the potential of machine learning and “big data” approaches embodied in the new Institute for Data Sciences and Engineering at Columbia.
“Our framework can help speed up the materials discovery process,” says Venkatasubramanian, Samuel Ruben-Peter G. Viele Professor of Engineering, and co-author of the paper. “In a sense, we are leveraging how nature discovers new materials—the Darwinian model of evolution—by suitably marrying it with computational methods. It’s Darwin on steroids!”
Using a genetic algorithm they developed, the researchers designed DNA-grafted particles that self-assembled into the crystalline structures they wanted. Theirs was an “inverse” way of doing research. In conventional research, colloidal particles grafted with single-stranded DNA are allowed to self-assemble, and then the resulting crystal structures are examined. “Although this Edisonian approach is useful for a posteriori understanding of the factors that govern assembly,” notes Kumar, Chemical Engineering Department Chair and the study’s co-author, “it doesn’t allow us to a priori design these materials into desired structures. Our study addresses this design issue and presents an evolutionary optimization approach that was not only able to reproduce the original phase diagram detailing regions of known crystals, but also to elucidate previously unobserved structures.”
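The paper’s actual design variables, fitness evaluation and algorithm settings are not given in this article, so the Python sketch below is only a generic genetic algorithm showing the inverse-design idea: candidate designs are scored against a target property, and the best ones are recombined and mutated over many generations. The three-component design vector and the distance-to-target fitness function are stand-ins, not the Columbia team’s model.

import random

# Generic genetic algorithm sketch (illustrative stand-in, not the published
# method). A "design" is a toy vector of normalized parameters; fitness rewards
# designs whose descriptor lands close to a chosen target.

TARGET = [0.8, 0.3, 0.5]   # hypothetical target structural descriptor

def fitness(design):
    """Higher is better: negative squared distance from the target."""
    return -sum((d - t) ** 2 for d, t in zip(design, TARGET))

def crossover(a, b):
    """Splice two parent designs at a random cut point."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(design, rate=0.2, scale=0.1):
    """Randomly nudge some parameters, keeping them in [0, 1]."""
    return [min(1.0, max(0.0, d + random.gauss(0, scale))) if random.random() < rate else d
            for d in design]

def evolve(pop_size=30, generations=60):
    population = [[random.random() for _ in range(len(TARGET))] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]             # selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)

best = evolve()
print([round(x, 2) for x in best])   # typically converges near TARGET

In an actual inverse-design run, the scoring step would be a physical model of DNA-grafted particle self-assembly rather than a toy distance measure.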
The researchers are using “big data” concepts and techniques to discover and design new nanomaterials—a priority area under the White House’s Materials Genome Initiative—using a methodology that will revolutionize materials design, impacting a broad range of products that affect our daily lives, from drugs and agricultural chemicals such as pesticides or herbicides to fuel additives, paints and varnishes, and even personal care products such as shampoo.
“This inverse design approach demonstrates the potential of machine learning and algorithm engineering approaches to challenging problems in materials science,” says Kathleen McKeown, director of the Institute for Data Sciences and Engineering and Henry and Gertrude Rothschild Professor of Computer Science. “At the Institute, we are focused on pioneering such advances in a number of problems of great practical importance in engineering.”
Venkatasubramanian adds, “Discovering and designing new advanced materials and formulations with desired properties is an important and challenging problem, encompassing a wide variety of products in industries addressing clean energy, national security, and human welfare.” He points out that the traditional Edisonian trial-and-error discovery approach is time-consuming and costly—it can cause major delays in time-to-market as well as miss potential solutions. And the ever-increasing amount of high-throughput experimentation data, while a major modeling and informatics challenge, has also created opportunities for material design and discovery.
The researchers built upon their earlier work to develop what they call an evolutionary framework for the automated discovery of new materials. Venkatasubramanian proposed the design framework and analyzed the results, and Kumar developed the framework in the context of self-assembled nanomaterials. Babji Srinivasan, a postdoc with Venkatasubramanian and Kumar and now an assistant professor at IIT Gandhinagar, and Thi Vo, a PhD candidate at Columbia Engineering, carried out the computational research. The team collaborated with Oleg Gang and Yugang Zhang of Brookhaven National Laboratory, who carried out the supporting experiments.
The team plans to continue exploring the design space of potential ssDNA-grafted colloidal nanostructures, improving its forward models, and bringing in more advanced machine learning techniques. “We need a new paradigm that increases the idea flow, broadens the search horizon, and archives the knowledge from today’s successes to accelerate those of tomorrow,” says Venkatasubramanian.

Blame Britain’s Bad Weather On Retreating Arctic Ice

Brett Smith for redOrbit.com – Your Universe Online
Britons never seem to get a break from bad weather, and now one scientist believes he may have found a culprit to point the finger at: retreating Arctic sea ice.
According to a new study from the University of Exeter’s James Screen that was published in the journal Environmental Research Letters, the run of wet summers that hit northwest Europe between 2007 and 2012 was partially due to the loss of Arctic sea ice that shifted the jet stream further south than normal.
Jet streams are currents of strong winds high in the atmosphere that steer weather systems and precipitation. In the past, the jet stream has flowed between Scotland and Iceland, sending bad weather north of Britain.
“The results of the computer model suggest that melting Arctic sea ice causes a change in the position of the jet stream and this could help to explain the recent wet summers we have seen,” Screen said. “The study suggests that loss of sea ice not only has an effect on the environment and wildlife of the Arctic region but has far reaching consequences for people living in Europe and beyond.”
Screen added that other possible factors could have helped to cause the recent run of wet summers, including the natural decade-long oscillations in the surface temperatures of the Atlantic Ocean. He emphasized that sea ice and the jet stream could explain up to a third of the trend towards wetter summers, according to his study.
“For the run of wet summers that we’ve had, the loss of sea ice and warm ocean temperatures were both acting in the same direction – pulling the jet stream south,” Screen said.
“We expect that sea surface temperatures will return to a cooler phase of their natural cycle – reducing the risk of wetter summers – but Arctic sea ice is expected to continue to melt and it is unclear how these competing effects will influence weather over the next decade or so,” he added.
Screen’s computer model considered weather patterns from when the summer sea-ice was relatively wide-ranging, as it was in the 1970s, as well as the sea ice of recent years. The model indicated that sea ice alone can affect the direction the jet stream takes as it flows from west to east.
“We have some confidence that the computer models are showing the real thing,” Screen told The Independent. “The pattern of summer rainfall we see in the computer models, when the only factor we change is the amount of sea ice in the Arctic, is very similar to what we’ve experienced over the past few years.”
The UK scientist emphasized that his model simply indicates a weather mechanism and should not be tied to specific weather events.
“I don’t think we’ll ever get to a position where we can link a specific event like the 2012 floods to climate change,” he said. “But we might be able to say that these changes are tipping the balance in favor of particular weather regimes, and this research is a step in that direction.”

Most Americans Want To See Labels On Their Nanofoods

redOrbit Staff & Wire Reports – Your Universe Online
Americans overwhelmingly want to know when they are eating food products that use nanotechnology, and are happy to pay the additional labeling costs, according to a new study published this month in the journal Review of Policy Research.
“Our study is the first research in the United States to take an in-depth, focus group approach to understanding the public perception of nanotechnology in foods,” said Dr. Jennifer Kuzma of North Carolina State University, the study’s senior author. “We wanted to know whether people want nanotechnology in food to be labeled, and the vast majority of the participants in our study do.”
The researchers convened six focus groups – three in Minnesota and three in North Carolina – and asked the study participants basic questions about nanotechnology and its use in food products. The participants were then asked a series of questions addressing whether food nanotechnology should be labeled. Finally, the participants were sent a follow-up survey within a week of their focus group meeting.
The researchers found that the participants were particularly supportive of new labeling for products in which nanotechnology had been added to the food itself, and also favored labeling products in which nanotechnology had only been incorporated into the food packaging.
However, the researchers note that the call for labeling does not indicate that people are necessarily opposed to the use of nanotechnology in food products. For example, many participants voiced support for the use of nanotechnology to make food more nutritious, or to give it a longer shelf life, although they still wanted those products to be labeled.
“People do have nuanced perspectives on this,” Kuzma said. “They want labeling, but they also want access to reliable, research-based information about the risks associated with labeled products – such as a Food and Drug Administration website offering additional information about labeled products.”
The results of the follow-up survey revealed that roughly 60 percent of the participants were willing to pay an additional five to 25 percent of the product price for either nanotechnology-free products or for nanotechnology labeling.
The research is important because emerging safety studies on food nanotechnology suggest that nanomaterials in food can be absorbed through a healthy gastrointestinal tract, which may produce systemic adverse effects and decrease nutrient absorption, the researchers said.
However, the true extent of any adverse effects of nanomaterials in food will remain uncertain for some time given the lack of human risk-relevant studies.
“Not all nanofood products will be hazardous to human health, though it could be argued that until more testing is done, nanofoods warrant additional labeling on the basis of safety, given the special penetration and reactivity properties of nanomaterials in biological systems,” reads the report.
The researchers say additional studies are needed to further explore the results of the current study.
“The impetus for this work is the view that consumers deserve a voice in decisions concerning food products that affect them and that they may ultimately decide the fate of nanofoods,” the researchers said.
“The results from this study lay groundwork on a wide range of topics to consider for nanofood labeling policy, such as public preferences related to labeling content, labeling characteristics, willingness to pay, willingness to avoid, and information concerning risk and safety,” they added.
There is currently no specific requirement for nanofoods to undergo a mandatory FDA premarket approval process. However, the regulatory agency issued draft guidelines more than a year ago on the use of nanotechnology in both food and cosmetics.

Mapping Human Impacts On Top Marine Predators

April Flowers for redOrbit.com – Your Universe Online
One of the richest ecosystems in the world, the California Current System, is driven by nutrient input from coastal upwelling and supports a great diversity of marine life. It is also heavily impacted by human activities, much like other coastal regions.
Researchers from the University of California, Santa Cruz (UCSC) have mapped the areas along the west coast where human impacts on marine predators such as whales, seals, seabirds and sea turtles are heaviest.
Many of the high impact areas are within the boundaries of National Marine Sanctuaries, according to the study published in Nature Communications.
Sara Maxwell, who led the study as a graduate student in ocean sciences at UCSC and is now a postdoctoral scholar at Stanford University’s Hopkins Marine Station, says this means there are good opportunities for improving management strategies.
“The sanctuaries are located close to the coast in areas where there are a lot of human activities and a lot of marine life, so it’s not surprising that we see a lot of impacts there,” Maxwell said. She notes that oil spills were a big concern when the sanctuaries were established, and many do not limit activities such as fishing, although they are actively engaged in managing industries such as shipping.
“With the sanctuaries already in place, we have an opportunity to increase protections. The results of this study allow us to be more specific in where we focus management efforts so that we can minimize the economic impact on people,” she said.
Along the west coast, five National Marine Sanctuaries cover nearly 15,000 square miles. If a proposed expansion of the Gulf of the Farallones and Cordell Bank National Marine Sanctuaries goes through, protection would be extended north to Point Arena, a key area identified by the study.
The health of marine ecosystems depends on marine mammals and other predators. The researchers analyzed tracking data for eight species of marine predators: blue whales, humpback whales, northern elephant seals, California sea lions, black-footed and Laysan albatrosses, sooty shearwaters, and leatherback sea turtles. These eight marine animals were drawn from the 23 species of marine predators whose movements have been tracked since 2000 as part of the Tagging of Pacific Predators (TOPP) program. Maxwell said the eight species were chosen because they are ecologically important but not commercially exploited.
The TOPP data reveals that many marine predator species travel thousands of miles every year. Even so, they often concentrate within small-scale “hotspots” to breed or feed on fish and other prey. The California Current System includes many such hotspots.
The TOPP data was combined with a database of human impacts in the California Current System that was developed by a group led by coauthor Benjamin Halpern at UC Santa Barbara. Twenty-four stressors — fishing, shipping, climate change, etc. — associated with human activities were analyzed for relative impact on each species. This analysis yielded maps showing the areas of greatest impact for each of the species.
“Areas where key habitats and human impacts overlap represent important areas for conservation efforts,” Maxwell said. “In other cases, areas of high human activities are not key habitats for predators. As a result, we can maximize both conservation of marine predators and human uses that our coastal communities depend on.”
The research team suggests that protecting key habitat without considering the impact of human activity results in missed opportunities for sustainable resource use. “Having this detailed spatial information will help us move toward a more sustainable management approach,” said Elliott Hazen, a research biologist at UCSC and the NOAA Southwest Fisheries Science Center.
One of the goals of the TOPP program is providing information to support management and policy decisions, using sophisticated tags with satellite- or light-based geolocation capabilities to track the movements of top predators throughout the Pacific Ocean.
“A major component of the TOPP program was to identify important conservation areas of the North Pacific Ocean. This paper is a significant step forward in increasing our awareness of the ‘blue Serengeti’ that lies just off the west coast of the U.S.,” said Dan Costa, one of the co-founders of TOPP.

Survey Suggests Grassroots Effort Could Save The Monarch Butterfly

redOrbit Staff & Wire Reports – Your Universe Online

US citizens love their monarch butterflies – so much so that they are apparently willing to contribute at least $4.78 billion to conservation organizations working to protect the declining species, according to research published Monday in the journal Conservation Letters.

As part of the study, experts from the US Geological Survey (USGS), Colorado State University, the University of Minnesota, and others conducted a survey of American households. They found that, through a combination of payments and donations, US households would be willing to contribute a one-time sum of up to $6.64 billion toward monarch conservation efforts – an amount close to what is contributed toward many endangered vertebrate species.

“Protecting migratory species is complex because they cross international borders and depend on multiple regions. Understanding how much, and where, humans place value on migratory species can facilitate market-based conservation approaches,” the study authors wrote. “We performed a contingent valuation study of monarchs to understand the potential for such approaches to fund monarch conservation.”

The survey asked US residents about the amount of money they would be willing to donate to animal protection groups working to help the butterfly, as well as the amount they would be willing to spend (or had already spent) growing plants that were monarch-friendly. If even a small percentage of those households made good on those promises, they said, it could generate considerable additional funding for monarch conservation efforts.

“The multigenerational migration of the monarch butterfly is considered one of the world’s most spectacular natural events,” lead author and USGS scientist Jay Diffendorfer said in a statement.

According to the USGS, monarch butterfly populations have been declining throughout much of the US, as well as Canada and Mexico, since 1999. A survey conducted last year of the wintering grounds of the butterfly species in Mexico showed the lowest colony size ever recorded, the researchers said. One factor believed to be playing a major role in the declining population numbers is the loss of milkweed, which monarch caterpillars feed upon.

“While many factors may be affecting monarch numbers, breeding, migrating, and overwintering habitat loss are probably the main culprits,” said University of Minnesota monarch biologist Karen Oberhauser, one of the study’s co-authors. “In the US, the growing use of genetically-modified, herbicide-tolerant crops, such as corn and soybeans, has resulted in severe milkweed declines and thus loss of breeding habitat.”

“This is the first nation-wide, published, economic valuation survey of the general public for an insect. The study indicates that economic values of monarch butterflies are potentially large enough to mobilize people for conservation planting and funding habitat conservation,” added John Loomis of Colorado State University, who served as the lead economist on the study.

He and his colleagues suggest that their findings could lead to the emergence of a market for plants that are friendly to the monarch, which is the official insect or butterfly of seven different US states. After all, according to the National Gardening Association’s annual survey, households that identified themselves as “do-it-yourselfers” spent more than $29 billion in related retail sales last year.

“The life cycle of monarchs creates opportunities for untapped market-based conservation approaches. Ordinary households, conservation organizations, and natural resource agencies can all plant milkweed and flowering plants to offset ongoing losses in the species’ breeding habitat,” Diffendorfer said.

“By reallocating some of those purchases to monarch-friendly plants, people would be able to contribute to the conservation of the species as well as maintain a flower garden,” he added. “Helping restore the monarch’s natural habitat, and potentially the species’ abundance, is something that people can do at home by planting milkweed and other nectar plants.”

Excessive Omega-3 Fatty Acid Intake Could Be Harmful To Your Health

redOrbit Staff & Wire Reports – Your Universe Online
While the consumption of omega-3 fatty acids has been previously associated with positive health effects, new research published in the journal Prostaglandins, Leukotrienes & Essential Fatty Acids suggests that excessive amounts of the substances could have negative consequences.
Previous research has linked omega-3s with a reduced risk of cardiovascular death and other cardiovascular disease outcomes, and WebMD reports that there is “strong evidence” that they can help lower triglycerides and blood pressure. Even so, the researchers believe that health organizations should establish dietary standards based on the best currently available evidence.
“What looked like a slam dunk a few years ago may not be as clear cut as we thought,” study co-author Norman Hord, an associate professor in Oregon State University’s College of Public Health and Human Sciences, said in a statement Monday.
“We are seeing the potential for negative effects at really high levels of omega-3 fatty acid consumption,” he added. “Because we lack valid biomarkers for exposure and knowledge of who might be at risk if consuming excessive amounts, it isn’t possible to determine an upper limit at this time.”
Three years ago, research led by Michigan State University assistant professor Jenifer Fenton found that feeding mice large amounts of dietary omega-3 fatty acids (also known as long chain polyunsaturated fatty acids or LCPUFAs) could increase those rodents’ risk of immune system alteration and colitis. The newly published paper is a follow-up to that research.
As part of the new study, Fenton, Hord and their colleagues reviewed literature pertaining to omega-3 fatty acids and discussed the potential adverse health outcomes that could result from excessive LCPUFA consumption. Fenton explained that their work was inspired by recently published studies demonstrating “increased risk of advanced prostate cancer and atrial fibrillation in those with high blood levels of LCPUFAs.”
Omega-3 fatty acids have been found to have anti-inflammatory properties, which is one of the reasons that they are believed to be effective against inflammation and beneficial to heart health, the researchers said. However, they also found that excess amounts of LCPUFAs can negatively impact immune function, potentially leading to a dysfunctional response to viral or bacterial pathogens.
“Generally, the researchers point out that the amounts of fish oil used in most studies are typically above what one could consume from foods or usual dosage of a dietary supplement,” the university said. “However, an increasing amount of products, such as eggs, bread, butters, oils and orange juice, are being ‘fortified’ with omega-3s. Hord said this fortified food, coupled with fish oil supplement use, increases the potential for consuming these high levels.”
“Overall, we support the dietary recommendations from the American Heart Association to eat fish, particularly fatty fish like salmon, mackerel, lake trout or sardines, at least two times a week, and for those at risk of coronary artery disease to talk to their doctor about supplements,” he said. “Our main concern here is the hyper-supplemented individual, who may be taking high-dose omega-3 supplements and eating four to five omega-3-enriched foods per day. This could potentially get someone to an excessive amount. As our paper indicates, there may be subgroups of those who may be at risk from consuming excess amounts of these fatty acids.”
Hord noted that there are currently no evidence-based standards for how much omega-3 is safe to consume, and that it isn’t possible to tell who could be at risk if they allow too much into their systems. He added that while they are not against taking fish oil supplements, overdoing it could potentially be harmful to a person’s health.
“We need to establish clear biomarkers through clinical trials. This is necessary in order for us to know who is eating adequate amounts of these nutrients and who may be deficient or eating too much,” Hord said. “Until we establish valid biomarkers of omega-3 exposure, making good evidence-based dietary recommendations across potential dietary exposure ranges will not be possible.”

Cryptolocker Virus Holds Your Files For Ransom

Michael Harper for redOrbit.com – Your Universe Online
A new piece of ransomware is giving Internet users one more reason to think twice before they click a link in an email. A virus known as Cryptolocker has been infecting PCs around the world and effectively holding the files within for ransom. Users who have their files locked up by the ransomware are currently paying $300 to $700 to the criminals who run the virus in order to regain access to their files.
So far, there have been no reports of the hackers reinfecting a machine once the ransom has been paid. However, PC owners who do not pay the ransom may have their files lost forever, especially if they do not perform regular and offsite backups. PC owners can protect themselves by being extremely cautious about which files they open in email, preventing certain applications from opening executable files, and backing up files as often as possible.
Cryptolocker first made an appearance last month and since then has been locking up individual computers as opposed to computers located on a network. The virus is usually spread via email through messages sent from an account claiming to be customer support for DHS, FedEx, UPS, etc. These emails have an executable file attached disguised as a PDF. The emails usually ask the recipient to download the form as a PDF, have it signed and then keep the form on file.
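As a rough illustration of that disguise, the short Python sketch below flags attachment names where an executable extension hides behind a document-style one; the filenames it tests are hypothetical examples, and real email scanners rely on far more signals than this.

    # Illustrative heuristic only: flag attachment names where an executable
    # extension hides behind a document-style one (e.g. "shipping_form.pdf.exe",
    # a made-up example of how an .exe can masquerade as a PDF).
    EXECUTABLE_EXTS = {"exe", "scr", "com", "bat"}
    DOCUMENT_EXTS = {"pdf", "doc", "docx"}

    def looks_disguised(filename: str) -> bool:
        parts = filename.lower().rsplit(".", 2)
        # Needs name + inner "document" extension + outer executable extension.
        return len(parts) == 3 and parts[2] in EXECUTABLE_EXTS and parts[1] in DOCUMENT_EXTS

    print(looks_disguised("shipping_form.pdf.exe"))  # True  (hypothetical attachment name)
    print(looks_disguised("invoice.pdf"))            # False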

PC owners who click to download what they think is an important form end up downloading the Cryptolocker virus. The virus then goes through the files on the computer and encrypts them using a method known as asymmetric encryption. This style of encryption requires a pair of keys – one public, one private – to lock and unlock the files. While the public key may reside on the user’s PC, the private key resides on the cybercriminal’s server and is not handed over until payment is received, according to the tech blog Malwarebytes.
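The asymmetry is the whole point: anything locked with the public key can only be unlocked with the matching private key, which the attacker withholds. The Python sketch below illustrates that split using the third-party cryptography package; it is a generic demonstration of public-key encryption, not Cryptolocker’s actual code, and real ransomware typically uses the asymmetric key only to lock a faster symmetric key that does the bulk of the file encryption.

    # Generic illustration of asymmetric (public/private key) encryption.
    # Requires the third-party "cryptography" package (pip install cryptography).
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    # The attacker generates the key pair and keeps the private half on a remote
    # server; only the public half ever needs to reach the victim's machine.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    secret = b"contents of an important file"
    ciphertext = public_key.encrypt(secret, oaep)   # encryption needs only the public key

    # Recovery is impossible without the private key -- the half the victim never holds.
    assert private_key.decrypt(ciphertext, oaep) == secret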

What’s more, the operators of Cryptolocker insist that payment be made within 96 hours, or four days. If the ransom isn’t paid by this time, both the private and public keys are destroyed, essentially rendering the encrypted files useless and irretrievable.

According to BleepingComputer.com, there is no way short of a brute force attack to decrypt the files without paying the ransom. The only way to restore the files without handing over $300 to $700 to a cybercriminal is to keep proper backups or Shadow Volume Copies of the files. PC owners can also use Software Restriction Policies to prevent certain software from opening or running executable files. This method can block the kind of executables that Cryptolocker emails deliver, but it could also restrict other pieces of software from operating normally.
Finally, while paying the ransom so far has unlocked the encrypted files, users should be aware they are essentially paying a criminal for their crime.
“If even a few victims pay then the cybercriminals will think they have got a viable business model and keep infecting people and asking for ransoms. If nobody pays, they will stop these campaigns,” said Dmitri Bestuzhev, a Kaspersky spokesperson in an interview with The Guardian.

Ski Apps Not Up To Snuff For Mountain Emergencies, Says CAC

Peter Suciu for redOrbit.com – Your Universe Online

While there are plenty of apps that will help users track ski conditions, find the best powder, and even log their runs on the slopes, a smartphone app probably isn’t the best way to call for help during a mountainside emergency.

With many skiers now hitting the slopes with smartphones, there has been a burgeoning market for apps that promise skiers a call for help when an accident or disaster occurs. However, the Canadian Avalanche Centre (CAC) has warned skiers should not rely on apps as a means to reach rescuers following an avalanche.

The CAC last week called out three European-made apps: iSis Intelligent (Mountain) Rescue System, Snog Avalanche Buddy and SnoWhere. These three apps are currently available for iOS and/or Android devices.

According to the CAC, these are presented as economical alternatives to avalanche transceivers – the devices that are recommended for backcountry skiers, and which can transmit a skier’s location should an accident or avalanche occur. With most backcountry transceivers, users can be tracked even if they are buried under the snow by an avalanche.

The CAC has stated there are a number of issues with the technology in the smartphone apps, and the two main issues are the compatibility and the frequency range.

All avalanche transceivers are required to conform to the international standard of 457 kHz and, regardless of brand, can be used to search for and find other transceivers. The apps, by contrast, apparently operate on proprietary systems.

“Not only are these new apps incapable of connecting with other avalanche transceivers, they are also incompatible between themselves, so one type of app can’t find another,” said CAC Executive Director Gilles Valade in a statement.

In addition, there is the issue of the range that mobile smartphones can reasonably provide, especially in remote areas where getting a signal can be a major problem. This is why the 457 kHz standard remains the de facto standard for transceivers. It is able to transmit in remote areas, can transmit very well through dense snow, and is not deflected by objects such as trees and rocks. It is further noted for being very reliable and accurate.

“None of the various communication methods used by these apps come close to that standard,” added Valade. “WiFi and Bluetooth signals are significantly weakened when passing through snow, and easily deflected by the solid objects we expect to see in avalanche debris. And the accuracy of a GPS signal is nowhere near the precision required for finding an avalanche victim.”

Other notable issues include the battery life of mobile devices, their ruggedness and reliability, as well as the outstanding issue of interference, something that is very common in the backcountry.

“These apps are being actively marketed as software that turns a smartphone into an avalanche transceiver but the CAC has serious concerns about their vulnerabilities,” says Valade. “We are warning all backcountry users to not use any of these apps in place of an avalanche transceiver.”

International Team Finds 11 New Alzheimer’s Genes

Brett Smith for redOrbit.com – Your Universe Online

In the biggest international study ever conducted on Alzheimer’s disease, the International Genomics Alzheimer’s Project (I-GAP) consortium has found eleven new regions of the genome involved in the onset of the neurodegenerative disorder.
The project involved searching the DNA of over 74,000 volunteers for new genetic risk factors linked to late-onset Alzheimer’s disease, the most frequently seen form of the disease. I-GAP, made up of four research consortia in the United States and Europe, made the discovery possible by pooling thousands of DNA samples and shared datasets.
“Collaboration among researchers is key to discerning the genetic factors contributing to the risk of developing Alzheimer’s disease,” said Dr. Richard J. Hodes, director of the National Institute on Aging (NIA). “We are tremendously encouraged by the speed and scientific rigor with which IGAP and other genetic consortia are advancing our understanding.”
One of the biggest finds was the HLA-DRB5/DRB1 region, which plays a role in the body’s immune system and inflammatory response. This region of the genome has also been connected with multiple sclerosis (MS) and Parkinson’s disease, indicating that diseases in which irregular proteins build up in the brain may share a common mechanism, and possibly a common target for treatment.
“The discovery of novel pathways is very encouraging considering the limited success of Alzheimer’s disease drugs tested so far,” said Dr. Margaret Pericak-Vance, Director of the Miami Institute of Human Genomics. “Our findings bring us closer toward identifying new drug targets for Alzheimer’s and other neurodegenerative diseases.”
The study also reported another 13 genetic variants that should be analyzed further.
“Interestingly, we found that several of these newly identified genes are implicated in a number of pathways,” said Gerard Schellenberg, from the University of Pennsylvania School of Medicine. “Alzheimer’s is a complex disorder, and more study is needed to determine the relative role each of these genetic factors may play. I look forward to our continued collaboration to find out more about these—and perhaps other—genes.”
“This study clearly demonstrates that there really is strength in numbers to identify genes that individually have a small effect on risk of Alzheimer’s,” said Lindsay A. Farrer, Chief of Biomedical Genetics at Boston University Medical Center. “But it’s not the magnitude of the odds ratio that’s really important.”
“Each gene we implicate in the disease process adds new knowledge to our understanding of disease mechanism and provides insight into developing new therapeutic approaches, and ultimately these approaches may be more effective in halting the disease since genes are expressed long before clinical symptoms appear and brain damage occurs,” he added.
“This landmark international effort has uncovered new pathways and new genes in old pathways that are definitely associated with Alzheimer dementia, but we need to do much work to better understand how exactly these genes work in health and disease, and to perhaps make drugs from these genes and molecules,” said Dr. Sudha Seshadri, professor of neurology at Boston University School of Medicine.
“We will continue to mine these results for new insights, even as we include more patients and use new technologies like whole genome sequencing to find more new pathways and genes.”

Eastern Steller Sea Lions Delisted As Endangered Species By NOAA

Lawrence LeBlond for redOrbit.com – Your Universe Online

Not since 1994 has the National Oceanic and Atmospheric Administration (NOAA) removed an animal from the list of species protected under the Endangered Species Act (ESA) because of effective recovery efforts.

The eastern North Pacific gray whale was taken off the list of threatened and endangered species nearly 20 years ago after evidence was found that these marine mammals had recovered to near their estimated original population size and were no longer in danger of extinction throughout most of their range. A subsequent review in 1999 suggested the delisting status should continue.

Last week, the NOAA moved to delist from the ESA another marine mammal species, the eastern Steller sea lion, due to effective recovery efforts. NOAA Fisheries has determined that the eastern distinct population of this species has recovered enough to be removed from the listing.

“We’re delighted to see the recovery of the eastern population of Steller sea lions,” said Jim Balsiger, Administrator of NOAA Fisheries’ Alaska Region. “We’ll be working with the states and other partners to monitor this population to ensure its continued health.”

NOAA concluded that the delisting is warranted because the sea lions have met the recovery criteria set in a 2008 recovery plan and no longer meet the definition of threatened or endangered under the ESA guidelines.

A threatened species is one that is likely to become endangered within a given period throughout all or most of its range. An endangered species is one that is in danger of becoming extinct throughout all or most of its range.

Using a wealth of scientific information, it has been found that the eastern Steller sea lion has increased in population from 18,040 individuals in 1979 to about 70,174 in 2010, the most recent year for which NOAA has available data. While being delisted under the ESA, the mammals will still get protection under provisions set forth in the Marine Mammal Protection Act (MMPA).
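Those two figures imply an average growth rate of roughly 4.5 percent per year; the back-of-the-envelope calculation below uses only the population counts quoted above and assumes steady compounding between the two survey years.

    # Average annual growth implied by the article's population figures,
    # assuming steady compounding between the 1979 and 2010 counts.
    start, end = 18_040, 70_174        # eastern Steller sea lion counts
    years = 2010 - 1979                # 31 years between the two estimates
    annual_rate = (end / start) ** (1 / years) - 1
    print(f"{annual_rate:.1%} average growth per year")   # roughly 4.5%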

NOAA had recommended the delisting earlier this year after action was sought from the states of Alaska, Washington and Oregon in June 2010. Commercial fishermen had also protested fishing regulations tied to the sea lion’s endangered listing. The species’ original decline had been blamed largely on fishermen, as well as on other boaters and people who shot the animals because they were seen as a nuisance that killed fish.

Due to this, the animals were first listed as threatened under the ESA in 1990. In 1997, NOAA scientists recognized two distinct population segments of this species: a western and an eastern segment. The eastern segment includes animals from Cape Suckling, Alaska, south to California’s Channel Islands. The western segment remains classified as endangered and the NOAA is not proposing any changes to that population, which exists from Cape Suckling to the western Pacific Russian waters.

Although NOAA Fisheries is no longer listing the eastern Steller sea lion under the ESA, the agency will continue to monitor the population to ensure that existing measures under the MMPA provide the protection necessary to maintain the recovered population. It is also proceeding carefully to ensure the eastern population segment remains strong.

Working with state and local agencies, NOAA has developed a 10-year plan to continue monitoring the sea lions, which is double what ESA guidelines call for. If this plan works as intended, the sea lion population should maintain its recovered status, according to Julie Speegle, NOAA Fisheries’ Juneau branch director.

The delisting will take effect 30 days after publication of the final rule in the Federal Register.

Nearly 90 Percent Of Children Treated For Bike Injuries Weren’t Wearing Helmets

redOrbit Staff & Wire Reports – Your Universe Online

Despite California state regulations mandating their use, only 11 percent of Los Angeles County children treated for bike-related injuries between 2006 and 2011 were wearing helmets, according to research presented Saturday at the American Academy of Pediatrics (AAP) National Conference and Exhibition.

Specifically, the analysis of 1,248 bicycle-related accidents found that children over the age of 12, as well as low-income and minority children, were less likely to wear the potentially life-saving headgear, study author Dr. Veronica F. Sullins of the University of California, Los Angeles (UCLA) and the Harbor-UCLA Medical Center said at the Orlando conference.

Dr. Sullins and her colleagues reviewed the records of pediatric patients involved in bike accidents from the Los Angeles County database from 2006 to 2011. The data reviewed included helmet use, age, gender, insurance status and race/ethnicity before checking to see if there was a link between helmet use and the need for emergency medical attention, morbidity, mortality and/or length of hospitalization, the AAP said in a statement.

The median age of the children was 13, and 64 percent of them were male, the researchers discovered. A total of 11.3 percent of patients wore helmets, including 35.2 percent of white children, seven percent of Asian children, six percent of black children, and four percent of Hispanic children.

In terms of insurance coverage, 15.2 percent of children with private insurance wore helmets at the time of injury, compared to 7.6 percent with public insurance. Children over age 12 were less likely to wear a helmet. A total of nine patients died, eight of whom were not wearing a helmet, while 5.9 percent of those injured required emergency surgery. A total of 34.1 percent of the children returned to their pre-injury capacity.

“Our study highlights the need to target minority groups, older children, and those with lower socioeconomic status when implementing bicycle safety programs in Los Angeles County,” Dr. Sullins said, noting that the study emphasizes the need to reinforce bicycle safety, especially in low-income and minority youngsters.

“Children and adolescents have the highest rate of unintentional injury and therefore should be a high priority target population for injury-prevention programs,” she added. Regional research, such as this study, can help identify at-risk populations in specific communities, allowing regions to use their safety-related resources more effectively.

According to the US Centers for Disease Control and Prevention (CDC), nearly 1,000 people die from injuries related to bicycle crashes each year, and another 550,000 men and women receive emergency care for these types of injuries annually. Furthermore, head injuries account for approximately 62 percent of bicycle-related deaths, 33 percent of bicycle-related emergency department visits, and 67 percent of bicycle-related hospital admissions.

Importance Of Caregivers Highlighted On World Stroke Day 2013

redOrbit Staff & Wire Reports – Your Universe Online

The role of caregivers in supporting those who have suffered a stroke, as well as the myths and misconceptions surrounding the disease, will be highlighted on Tuesday, October 29, as the global healthcare community marks World Stroke Day.

Someone, regardless of age or gender, suffers a stroke every two seconds, the European Society of Cardiology (ESC) said in a statement Friday. Six million people die as a result of a stroke every year, and another nine million suffer from the condition, which occurs when a lack of oxygen suddenly kills brain cells.

Approximately 30 million people worldwide have experienced a stroke, and many of those have residual disabilities, the society added. Furthermore, a study published last week by the British medical journal The Lancet reported a 25 percent increase in the number of stroke cases among 20-to-64-year-olds worldwide. The share of all stroke cases accounted for by this age group has risen from 25 percent before 1990 to 31 percent currently.

The slogan for this year’s World Stroke Day – “Because I care…” – is the same as that featured in 2012. According to the World Stroke Organization, the phrase was selected because “it can easily be adapted to all cultures and in any setting… the slogan attempts to address prevailing misinformation about the disease, e.g., stroke only happens later in life.”

“Moreover, caregivers and the role of family and close friends – as those in the frontlines providing the supportive care – will play an essential role in the campaign,” the organization added. “The campaign will celebrate their important contributions. Care givers are the conduits between the stroke community and the general public in correcting misinformation as they know first-hand what the reality is around stroke.”

The American Stroke Association is using the occasion to try to educate the public about the warning signs of a person suffering from a stroke. The organization uses the acronym “F.A.S.T.” to describe what to look for: “Face Drooping, Arm Weakness, Speech Difficulty, Time To Call 911.” They will also be hosting a Google Hangout on Monday and an education event in Washington DC on Tuesday to mark the occasion.

The ESC is emphasizing the steps that young, obese women can take in order to reduce the risk that they will experience a stroke during their lifetimes. Research presented during ESC Congress 2013 demonstrated that young, overweight females who do not suffer from high blood pressure, high cholesterol or other metabolic disorders do not have an increased risk of stroke compared to normal weight women without metabolic disorders.

However, those who did suffer from metabolic disorders and were overweight or obese were 3.5 times more likely to have a stroke. As the authors of that study pointed out, obesity can increase the risk that a young woman will develop diabetes, high blood pressure or high cholesterol, which in turn increases the likelihood that she will suffer a stroke or heart attack. Those women can protect themselves from those conditions by losing weight, they said.

“Overall women get more strokes than men each year, mainly because stroke occurs more frequently at older ages and women generally live longer than men,” ESC spokesperson Professor Gregory Lip said in a statement. “Thus, approximately 55,000 more women than men have strokes each year. Awareness of important risk factors, such as atrial fibrillation (an irregularity of the heart rhythm) and high blood pressure, is crucial. Of note, women are twice more likely to die from a stroke than breast cancer each year.”

“Women are at the same risk of stroke as men, and the level of risk is completely steered by the underlying risk factor pattern they have. The majority of people who have a stroke are disabled for the rest of their lives and may be paralyzed or lose their ability to speak. The devastating consequences of this disease for patients and their loved ones make prevention even more important,” added ESC spokesperson Professor Joep Perk.

Space Weather Causes Airline Pilots, Passengers To Be Exposed To Radiation

[ Watch the Video: Pilots And Passengers Absorb Cosmic Rays And Radiation ]
redOrbit Staff & Wire Reports – Your Universe Online
Thanks to space weather, airline pilots absorb approximately as much radiation over the course of a year as a nuclear power plant employee, NASA officials revealed on Friday.
In fact, according to the US space agency, pilots are classified as “occupational radiation workers” by the Federal Aviation Administration (FAA) because they fly at heights where there is little atmosphere to protect them from cosmic rays and solar radiation.
For example, NASA officials said that during a typical polar flight from Chicago to Beijing, pilots are exposed to roughly as much radiation as if they had received a pair of chest x-rays. Over the course of their career, this can increase their risk of developing cataracts or even cancer – and passengers could also be similarly affected.
“A 100,000 mile frequent flyer gets about 20 chest x-rays,” regardless of the latitude of those flights, explained Chris Mertens, a senior research scientist at NASA’s Langley Research Center. Of course, even non-flyers absorb some radiation from space weather, as cosmic rays and their by-products can reach Earth’s surface and expose people at sea level to the equivalent of about one chest x-ray every 10 days.
Flying on an airplane can increase the amount of radiation exposure 10-fold or more, NASA said. The exact amount of exposure depends on multiple factors, including the altitude of the plane, the latitude of the flight plan (polar routes expose passengers to more radiation), solar activity and the sunspot count, the agency added.
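Expressed in the article’s own “chest x-ray equivalent” yardstick, those figures work out roughly as follows; the sketch below uses only the numbers quoted above and makes no assumption about the actual dose delivered by a chest x-ray.

    # Back-of-the-envelope comparison using only the figures quoted in the article,
    # expressed in "chest x-ray equivalents" rather than actual dose units.
    xrays_per_100k_miles = 20                 # NASA's frequent-flyer example
    miles_per_xray = 100_000 / xrays_per_100k_miles
    print(f"about one chest x-ray equivalent per {miles_per_xray:,.0f} miles flown")  # ~5,000 miles

    sea_level_xrays_per_year = 365 / 10       # one equivalent every ~10 days at sea level
    print(f"sea-level background: roughly {sea_level_xrays_per_year:.1f} equivalents per year")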
[ Watch the Video: Space Weather Impacts On Aviation ]
“To help airline companies safeguard passengers and personnel, NASA is developing an experimental tool to predict exposures in real time,” the space agency said. The project, which is being headed up by Mertens, has been dubbed NAIRAS or “Nowcast of Atmosphere Ionizing Radiation for Aviation Safety.”
According to Mertens, the number of flights that travel over the poles has increased drastically in recent years. Polar routes make international flights shorter and there are fewer headwinds to deal with, he explained. As a result, these flights can save airlines up to $40,000 per flight in fuel costs.
However, as NASA officials point out, “Earth’s poles are where the radiation problem can be most severe. Our planet’s magnetic field funnels cosmic rays and solar energetic particles over the very same latitudes where airlines want to fly.  On a typical day when the sun is quiet, dose rates for international flights over the poles are 3 to 5 times higher than domestic flights closer to the equator.”
“If a flight controller wants to know the situation around the poles right now, NAIRAS can help,” they added. “It is, essentially, an online global map of radiation dose rates for different flight paths and altitudes.  Maps are produced in near real-time by a computer at Langley, which combines cutting-edge physics codes with realtime measurements of solar activity and cosmic rays.”
Currently, the project is in an experimental phase, Mertens said, but the goal is for NAIRAS to provide information comparable to land-based weather forecasts. In addition to the cost savings to airlines, the research team is hoping to help pilots better understand the radiation-related on-the-job hazards that they face.
A paper in which Mertens and his colleagues compare NAIRAS predictions with actual radiation measurements collected onboard airplanes will be published in the near future in the journal Space Weather. Mertens said that his team’s results “are encouraging, but we still have work to do.”

Is The Solar Maximum Peak Upon Us? Increase In Sun’s Activity May Tell

[ Watch the Video: Our Sun Is Flaring Up ]
Lawrence LeBlond for redOrbit.com – Your Universe Online
Our sun, the bright yellow disk that sits at the center of the Solar System providing us with light, energy and warmth, has a much darker side that also has the potential to disrupt, rather than preserve, life on Earth.
Experts know that the sun goes through a natural solar cycle about every 11 years, marked by an increase and decrease of sunspots – visible blemishes that appear on the photosphere (surface) of the sun. As sunspots increase in intensity, the solar cycle is said to be approaching a “solar maximum,” with a “solar minimum” occurring when fewer sunspots actively appear.
These sunspots, which have been noticeably increasing over the past few years, point to a solar maximum that could now be on our doorstep. With sunspots also comes other activity, such as solar flares and coronal mass ejections (CMEs). A CME is a solar phenomenon that can send billions of tons of energetic charged particles into space and, when pointed in the right direction, can affect spacecraft, satellites and electronic systems on Earth. However, the particles associated with CMEs cannot pass through Earth’s atmosphere to affect humans. Scientists tracking solar flares and CMEs note that they are most often associated with sunspots.
Such was the case this past week, as a flurry of activity on our neighboring star gave potential signs that we are now in the midst of a solar maximum peak.
[ Watch the Video: Path of October 22-23 Coronal Mass Ejections ]
On October 23, 2013, the sun produced an M9.4 solar flare that peaked at 8:30 p.m. EDT. This flare was at the upper limit of the M class, a scale that runs from M1 to M9.9. M-class flares are generally considered the weakest flares that can still cause some space weather disturbances near Earth; in the past, they have been known to cause brief radio blackouts at the poles. Flares that exceed the M-class rating are known as X-class flares and are more intense and more disruptive when directed toward Earth.
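These letter classes map onto a logarithmic scale of peak soft X-ray flux measured by the GOES satellites: each letter step is ten times stronger than the last, and the trailing number multiplies that base, so an M9.4 flare sits just under the X-class threshold. The short sketch below encodes that commonly cited scale as general background; it is not part of NASA’s analysis of these particular events.

    # Peak soft X-ray flux (watts per square meter) implied by a GOES flare class.
    # Commonly cited scale: A1 = 1e-8, B1 = 1e-7, C1 = 1e-6, M1 = 1e-5, X1 = 1e-4,
    # with the trailing number acting as a multiplier (so M9.4 = 9.4e-5 W/m^2).
    BASE_FLUX = {"A": 1e-8, "B": 1e-7, "C": 1e-6, "M": 1e-5, "X": 1e-4}

    def peak_flux(flare_class: str) -> float:
        letter, multiplier = flare_class[0].upper(), float(flare_class[1:])
        return BASE_FLUX[letter] * multiplier

    for label in ("M9.4", "X1.7", "X2.1"):    # the flares described in this article
        print(label, f"{peak_flux(label):.1e} W/m^2")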
Wednesday’s solar flare was also associated with a coronal mass ejection (CME). Experimental NASA research models showed that this CME left the sun at 9:48 p.m. EDT and may catch up with two other CMEs that left the sun on October 22, 2013. These CMEs were expected to reach Earth within one to three days and were not expected to pose a serious threat to electrical systems.
Less than two days later, on October 25, 2013, two more solar flares erupted from the same region of the sun. The first, classified as an X1.7 solar flare, peaked at 4:03 a.m. EDT; the second X2.1 class flare peaked at 11:03 a.m. EDT. In the past, solar flares of this intensity have caused radio blackouts on Earth for about an hour.
As the sun approaches the solar maximum, solar flares of such intensity have become quite common. Humans have been tracking this type of activity continuously since the solar cycle was discovered back in 1843. Experts note that during the sun’s peak activity, it is normal for several such flares to occur each day. In the current solar cycle, several X-class solar flares have erupted on the sun. The first was observed in February 2011 and the largest occurred on August 9, 2011, when the sun produced an X6.9 solar flare.
Several X-class flares were also observed throughout 2012, but the most notable event occurred earlier this year.
On May 12, the sun produced its first X-class solar flare of 2013 with an X1.7 event. While not the most significant of solar flares, what was notable was the fact that two more X-class flares occurred within 24 hours. After the first X1.7 flare peaked at 10 p.m. EDT on May 12, it was followed by an X2.8 and an X3.2 solar flare, both occurring during the morning of May 13, 2013.
As for this past week’s events, the NOAA’s Space Weather Prediction Center (SWPC) reported that a geomagnetic storm may occur on October 28, 2013 as a result of the CMEs produced by this week’s solar flares.
The SWPC said on Saturday morning, “The recent spectacular eruptions from Region 1882 belie the fact that the geomagnetic field is to be unaffected, at least for the next few days. Forecasters expect impacts from the first of the CMEs in about 72 hours, but things can change given the volatile nature of the three active centers on the solar disk. Possible G1 (Minor) Geomagnetic Storm levels are forecast. Updates here as conditions unfold.”
With the sun’s activity on the rise, leaving us to wonder if the solar maximum is really upon us, one event may actually confirm that the sun is on the verge of something big.
In late September, NASA’s Solar Dynamics Observatory (SDO), which constantly monitors the sun for any and all activity in a variety of wavelengths, captured what truly looks like a “canyon of fire” across the surface of the sun (see image below).
On Sept. 29-30, a magnetic filament of solar material erupted on the sun, leaving a 200,000-mile-long trail appearing in the sun’s corona. The event, which looked like a fiery canyon ripping open within the sun, was actually plasma erupting in the atmosphere due to a magnetic disturbance. NASA’s Goddard Space Flight Center created a short video of the event, which lasted two days.
As the sun moves closer to its solar maximum peak, which has been predicted to be upon us, it is likely that solar flares and CMEs will increase in frequency in the coming weeks and perhaps months. However, as has happened in past solar cycles, the experts could be off in their calculations and the solar maximum could still be far from peaking.

Images Below:

(LEFT): NASA’s Solar Dynamics Observatory, or SDO, captured this image of the M9.4-class solar flare that peaked at 8:30 pm EDT on Oct. 23, 2013. The image displays light in the wavelength of 131 Angstroms, which is good for viewing the intense heat of a solar flare and is typically colored teal. Credit: NASA/SDO

(RIGHT): A magnetic filament of solar material erupted on the sun in late September, breaking the quiet conditions in a spectacular fashion. The 200,000 mile long filament ripped through the sun’s atmosphere, the corona, leaving behind what looks like a canyon of fire. Credit: NASA/SDO

Satellite Data Reveals Slight Shrinking Of Antarctic Ozone Hole

[ Watch The Video: Daily Ozone Hole for 2013 ]
redOrbit Staff & Wire Reports – Your Universe Online
The Antarctic ozone hole, a seasonal phenomenon that starts forming during the month of August, was slightly smaller this year than it has been on average over the past several decades, NASA satellite data has revealed.
The average size of the ozone hole in September-October 2013 was 8.1 million square miles (21 million square kilometers), the US space agency reported on Friday.
In comparison, the average size since the mid-1990s (when the annual maximum size of the hole stopped growing) is 8.7 million square miles (22.5 million square kilometers). However, scientists caution that a single year’s change is not enough evidence to determine whether or not the phenomenon has started to heal.
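In relative terms, that puts the 2013 average about seven percent below the long-term figure; the quick calculation below uses only the two areas quoted above.

    # How 2013's average ozone hole compares with the post-mid-1990s average,
    # using the areas quoted in the article (millions of square miles).
    avg_2013, long_term_avg = 8.1, 8.7
    shrinkage = (long_term_avg - avg_2013) / long_term_avg
    print(f"about {shrinkage:.0%} below the long-term average")   # roughly 7%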
“There was a lot of Antarctic ozone depletion in 2013, but because of above average temperatures in the Antarctic lower stratosphere, the ozone hole was a bit below average compared to ozone holes observed since 1990,” Paul Newman, an atmospheric scientist and ozone expert at NASA’s Goddard Space Flight Center in Maryland, said in a statement.
The formation of the ozone hole begins during Antarctic spring each year, when the sun starts rising following several months of winter darkness. Cold air is trapped above the continent by polar-circling winds, and sunlight serves as a catalyst for reactions involving ice clouds and chlorine from manmade chemicals. Those reactions deplete the ozone until early December, when they wind down and allow the seasonal hole to eventually close.
“Levels of most ozone-depleting chemicals in the atmosphere have gradually declined as the result of the 1987 Montreal Protocol, an international treaty to protect the ozone layer by phasing out production of ozone-depleting chemicals,” NASA explained. “As a result, the size of the hole has stabilized, with variation from year to year driven by changing meteorological conditions.”
“The single-day maximum area this year was reached on Sept. 16 when the maximum area reached 9.3 million square miles (24 million square kilometers), about equal to the size of North America,” it added. “The largest single-day ozone hole since the mid-1990s was 11.5 million square miles (29.9 million square kilometers) on Sept. 9, 2000.”
NASA and National Oceanic and Atmospheric Administration (NOAA) scientists have been using a variety of ground, satellite-based and balloon-based instruments to monitor the ozone layer for approximately five decades. Each of those instruments collects data regarding different aspects of ozone depletion.
“The independent analyses ensure that the international community understands the trends in this critical part of Earth’s atmosphere. The resulting views of the ozone hole have differences in the computation of the size of the ozone hole, its depth, and record dates,” the US space agency said.
NASA’s 2013 observations of the ozone hole were obtained from data supplied by instruments on the agency’s Aura satellite, as well as the Ozone Monitoring and Profiler Suite instrument on the NASA-NOAA Suomi National Polar-orbiting Partnership satellite. Among the long-term satellite-based ozone-monitoring instruments used to monitor the hole are the Total Ozone Mapping Spectrometer, the second generation Solar Backscatter Ultraviolet Instrument, the Stratospheric Aerosol and Gas Experiment series of instruments, and the Microwave Limb Sounder.

Mozilla Lightbeam Digs Deep Into The Cookie Jar

Enid Burns for redOrbit.com – Your Universe Online
Mozilla has its hand in the cookie jar, and wants to share the contents. The developer of the open source Firefox browser released Lightbeam for Firefox, an add-on that lets you track tracking cookies.
The new browser add-on will show users what companies are behind each cookie stored in their browsers, and what information those companies are gathering. Lightbeam was introduced on the Mozilla Blog by Alex Fowler, who leads privacy and public policy for Mozilla.
Tracking cookies are hotly debated files stored within a browser that track browsing history on a single site or network of websites. Such cookies are often deployed by ad networks, and are able to identify a site visitor based on the cookie, and take note of the frequency and content of visits across a network. That information is often used to sell ads, and also to serve targeted ads to site visitors. Retailers also use cookies to store login information and track what products a consumer looks at and purchases.
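Mechanically, a tracking cookie is just an HTTP Set-Cookie header scoped to the ad network’s domain, so the browser sends it back with every ad request on any participating site. The Python sketch below builds such a header with the standard library; the network domain and identifier are made-up placeholders, not taken from any real ad network.

    # Shape of a third-party tracking cookie, built with Python's standard library.
    # "adnetwork.example" and the uid value are made-up placeholders.
    from http.cookies import SimpleCookie

    cookie = SimpleCookie()
    cookie["uid"] = "a1b2c3d4"                        # unique visitor identifier
    cookie["uid"]["domain"] = ".adnetwork.example"    # scoped to the ad network's domain
    cookie["uid"]["path"] = "/"
    cookie["uid"]["max-age"] = 60 * 60 * 24 * 365     # persists for about a year

    # Sent once in an ad response; the browser then returns it with every request to
    # that domain, letting the network recognize the same visitor across many sites.
    print(cookie.output())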
Lightbeam aims to show users just who is behind a cookie. An ad network might place the cookie, but publisher websites and specific advertisers can also be connected to the information it collects. In a video introducing the new add-on, Mozilla’s Fowler explains that he checks four websites each morning and, in doing so, interacts with over 120 companies.
Lightbeam is the follow-up to Collusion, an add-on tackling privacy issues for internet users that Mozilla introduced a year ago.
The add-on is accessible in the browser. It creates a snapshot that helps users see what companies are behind each cookie, and what information those companies are interested in tracking. The snapshot appears in the form of a meter or tag cloud to offer up the data in a visual manner that web users can understand.
Mozilla hopes that users will get a better idea of how first and third party sites are connected to each other. Users are also encouraged to contribute data to the Lightbeam database. “Call it a Wizard of Oz moment for the web, where users collectively provide a way to pull back the curtain and see its inner-workings,” Fowler wrote in the blog post.
The app also offers support for publishers. Mozilla worked with a number of online publishers during the development of Lightbeam for Firefox. The goal for collaborating with publishers was to determine the value of crowdsourced data.
The cookie is a sore subject in the online and advertising industries. Internet security software often identifies cookies and alerts users. Many users delete cookies or opt for anonymous browsing. Others allow all cookies, even ones that are pervasive in the data they collect.
While Lightbeam provides users with information on cookies and gives them the choice to block certain cookies, Mozilla has taken more drastic measures recently. Earlier this year the non-profit developer said it would automatically block certain tracking cookies in a version of Firefox the organization was developing. The move drew a sharp reaction from the ad industry, which relies on cookies to sell advertising.
The ongoing war on cookies has led some companies to seek alternatives. A recent report identified websites that use device fingerprinting to track users on computers as well as on mobile devices such as tablets and smartphones. Reports from earlier this month found that Microsoft is also developing an alternative to cookies, though the details of such a solution are still forthcoming.

FDA Recommends Putting Hydrocodone In Same Class As Morphine

Brett Smith for redOrbit.com – Your Universe Online

On Thursday, the Food and Drug Administration (FDA) suggested tighter controls on how doctors prescribe common, widely used painkillers containing the narcotic hydrocodone, which would make them as strictly controlled as powerful painkillers such as OxyContin.

In announcing its recommendations, the FDA said it had become “increasingly concerned about the abuse and misuse of opioid products, which have sadly reached epidemic proportions in certain parts of the United States.”

Janet Woodcock, who heads the FDA’s center for drug evaluation and research, said that the agency expects to submit its formal recommendation later this year to reclassify painkillers containing hydrocodone as “Schedule II” medications, an upgrade from their current “Schedule III” classification.

“We are announcing the agency’s intent to recommend to HHS (Health and Human Services) that hydrocodone combination products should be reclassified to a different and more restrictive schedule,” she said in a statement.

“This determination comes after a thorough and careful analysis of extensive scientific literature, review of hundreds of public comments on the issue and several public meetings, during which we received input from a wide range of stakeholders, including patients, health care providers, outside experts and other government entities.”

The change would come after the Drug Enforcement Administration (DEA) requested just such a move in 2009. According to federal data, doctors wrote over 130 million prescriptions for hydrocodone-containing drugs for about 47 million patients in 2011.

Created in a collaboration between the FDA and the DEA, the scheduling system ranks drugs according to their medical use, potential for abuse and international agreements, among other factors.

If hydrocodone-containing drugs were to become Schedule II substances, they would officially be considered to have one of the highest potentials for abuse and addiction of all legally prescribed medications. Other Schedule II substances include morphine, the attention-deficit hyperactivity disorder (ADHD) medications Adderall and Ritalin, and cocaine when it is used medically as a topical anesthetic.

Prescription drugs are responsible for about 75 percent of all drug overdose deaths in the US. According to federal statistics, the number of deaths from prescription narcotic painkillers has quadrupled since 1999. With drugs containing hydrocodone representing about 70 percent of all opioid prescriptions, and their current status as a Schedule III substance, abuse has skyrocketed, experts have said.

In an interview with The New York Times, Woodcock said FDA officials have considered how the new rules might affect patients. However, she said that the effect on public health caused by the abuse of these drugs has created a watershed moment.

“These are very difficult trade-offs that our society has to make,” she said. “The reason we approve these drugs is for people in pain. But we can’t ignore the epidemic on the other side.”

The new regulations would cut in half, to 90 days, the supply of the drug a patient can receive without a new prescription. Under current rules, a patient can refill a prescription for hydrocodone-containing drugs five times over a six-month period before needing a new prescription. Previous research has found that most patients use such medications for only 14 days.
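As a rough, back-of-the-envelope illustration of the change described above (a sketch only, restating this article’s figures; the actual supply limits are set by DEA scheduling rules):

```python
# Back-of-the-envelope comparison of the supply windows described above.
# These figures simply restate the article's numbers; they are not a
# statement of the regulation itself.
current_supply_days = 6 * 30    # one prescription refilled five times over roughly six months
proposed_supply_days = 90       # a new prescription required after 90 days of supply

reduction_factor = current_supply_days / proposed_supply_days
print(f"{current_supply_days} days -> {proposed_supply_days} days "
      f"(cut by a factor of {reduction_factor:.0f})")
```
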

Mercury Hot Spot Risks Reduced With New Low-cost, Nondestructive Technology

Smithsonian
Hot spots of mercury pollution in aquatic sediments and soils can contaminate local food webs and threaten ecosystems, but cleaning them up can be expensive and destructive. Researchers from the Smithsonian Environmental Research Center and University of Maryland, Baltimore County have found a new low-cost, nonhazardous way to reduce the risk of exposure: using charcoal to trap it in the soil.
Mercury-contaminated “Superfund sites” contain some of the highest levels of mercury pollution in the U.S., a legacy of the many industrial uses of liquid mercury. But despite the threat, there are few available technologies to decrease the risk, short of digging up the sediments and burying them in landfills—an expensive process that can cause significant ecological damage.
In a new study published in the journal Environmental Science & Technology, Cynthia Gilmour (SERC), Upal Ghosh (UMBC) and their colleagues show that adding activated carbon, a form of charcoal processed to increase its ability to bind chemicals, can significantly reduce mercury exposure in these highly contaminated sites. With funding and support from several industry and federal partners, the team tested the technology in the laboratory with mercury-contaminated sediments from four locations: a river, a freshwater lake and two brackish creeks. To reduce the harm from mercury, the sorbents also had to decrease the amount of methylmercury taken up by worms.
“Methylmercury is more toxic and more easily passed up food webs than inorganic mercury,” said Gilmour, the lead author on the study. “Unfortunately, methylmercury is produced from mercury contamination by natural bacteria. To make contaminated sites safe again, we need to reduce the amount of methylmercury that gets into animals.”
Added at only 5 percent of the mass of surface sediments, activated carbon reduced methylmercury uptake by sediment-dwelling worms by up to 90 percent. “This technology provides a new approach for remediation of mercury-contaminated soils—one that minimizes damage to contaminated ecosystems, and may significantly reduce costs relative to digging or dredging,” said Ghosh, co-author on the study. Activated carbon can be spread on the surface of a contaminated sediment or soil, without physical disturbance, and left in place to mix into the sediment surface. Called “in-situ remediation,” the use of sorbents like activated carbon has been proven to reduce the uptake of several other toxic pollutants. However, this is the first time activated carbon has been tested on mercury-contaminated soils.
The research group is now testing its effectiveness in the field at several Superfund sites across the country. If successful in the field, this approach of treating soil with activated carbon may be able to reduce the risk of mercury exposure in polluted sites and subsequent contamination of food webs.

Scientists Find Rare Gene Mutation For Slow Metabolism And Overeating

[ Watch the Video: KSR2 Mutations are Associated With Obesity ]

Michael Harper for redOrbit.com – Your Universe Online

Researchers from the University of Cambridge say they’ve discovered a possible genetic origin for obesity which could slow the metabolism and drive a person to eat fatty foods. Though they say this genetic condition affects less than one in one hundred people, they were able to observe similar conditions in mice with the same genetic switches flipped.

Those affected by the faulty gene, however, are likely to be severely obese by early childhood. Dr. I. Sadaf Farooqi of Cambridge University and team say the main suspect in their research is the gene KSR2 (Kinase Suppressor of Ras 2). When this gene is switched off, it may trigger longer bouts of eating and a slower metabolism, thereby leading to obesity. For the study, Farooqi and team sequenced the DNA of over 2,000 severely obese patients, finding several variations of a KSR2 mutation. Their results are now published in the journal Cell.

“You would be hungry and wanting to eat a lot, you would not want to move because of a slower metabolism and would probably also develop type 2 diabetes at a young age,” explained Farooqi in an interview with the BBC. “It slows the ability to burn calories and that’s important as it’s a new explanation for obesity.”

Farooqi is quick to admit, however, that their studies show this mutation is rare and likely only affects young children who already show signs of obesity. All told, the research suggests that less than one percent of the general population carries such a mutation, but about two percent of young children who are already obese likely have a KSR2 mutation.

Doctors and researchers are often quick to dismiss the idea that being born with a slow metabolism could be to blame for severe cases of obesity. Those who do have a slower metabolism are usually found to have an underactive thyroid gland. Yet when Farooqi and team began sequencing the DNA from 2,101 severely obese patients, they found their thyroid glands were behaving normally. It’s also been suggested that some people simply burn calories more slowly than others, a hypothesis which may be supported by Farooqi’s research.

“Up until now, the genes we have identified that control body weight have largely affected appetite. However, KSR2 is different in that it also plays a role in regulating how energy is used in the body. In the future, modulation of KSR2 may represent a useful therapeutic strategy for obesity and type 2 diabetes.”

With this new study, Farooqi says there may be good cause to look into creating a drug to flip the KSR2 switch back into normal operating mode to prevent a lifetime of obesity and health problems in children.

This isn’t the first time medical science has sought a drug to rid the developed world of obesity, however. Previously, researchers claimed they had found a hormone responsible for giving some people larger appetites than others. The biochemical is technically known as ghrelin but is commonly referred to as the ‘hunger hormone.’ It had been assumed that if a drug could suppress this hormone, it could help those who struggle with their weight to stop overeating.

Recent research has shown that while ghrelin may be responsible for triggering hunger, it’s also known to be released during long bouts of stress. This means drugs created to inhibit ghrelin may also one day be used to help those with post-traumatic stress disorder (PTSD).

HIV’s Secret Hideout Frustrates Efforts To Develop A Cure

[ Watch The Video: Barrier To HIV Cure Bigger Than Previously Thought ]

redOrbit Staff & Wire Reports – Your Universe Online

Although current treatments for human immunodeficiency virus (HIV) can keep the disease at bay, a larger-than-expected amount of hidden virus may complicate efforts to find a cure, according to the most detailed and comprehensive analysis to date of the latent reservoir of HIV proviruses.

The three-year study, published Thursday in the journal Cell, deals a painful blow to researchers working hard to find a cure for HIV/AIDS, a disease that kills nearly two million people per year according to the World Health Organization (WHO).

Infectious disease experts at Johns Hopkins found that the amount of potentially active, dormant forms of HIV hiding in infected immune T cells may actually be 60 times greater than previously thought.

This hidden HIV is part of the so-called latent reservoir of functional proviruses that remains long after antiretroviral drug therapy has successfully brought viral replication to a standstill. If antiretroviral therapy is stopped or interrupted, some proviruses can reactivate, allowing HIV to make copies of itself and resume infection of other immune cells, the researchers said.

Senior study investigator Robert Siliciano, M.D., Ph.D., who in 1995 first showed that reservoirs of dormant HIV were present in immune cells, said that while the current study’s results show most proviruses in the latent reservoir are defective, curing the disease will depend on finding a way to target all proviruses with the potential to restart the infection.

“These results indicate an increased barrier to cure, as all intact noninduced proviruses need to be eradicated,” Siliciano said. “Although cure of HIV infection may be achievable in special situations, the elimination of the latent reservoir is a major problem, and it is unclear how long it will take to find a way to do this.”

The study’s results showed that among 213 HIV proviruses that were isolated from the reservoirs of eight patients and that were initially unresponsive to highly potent biological stimuli, some 12 percent could later still become active and capable of replicating their genetic material and transmitting infection to other cells. All of these non-induced proviruses had previously been thought to be defective, with no possible role in resumption of the disease, said Siliciano, a professor at the Johns Hopkins University School of Medicine and a Howard Hughes Medical Institute investigator.

These disappointing findings pose a serious problem to prevailing hopes for the so-called “shock and kill” approach to curing HIV, he said. That approach refers to forcing dormant proviruses to “turn back on,” making them “visible” and vulnerable to the immune system’s cytolytic “killer” T cells, and then eliminating infected cells from the body while antiretroviral drugs prevent any new cells from becoming infected.

Siliciano said this new discovery could enhance support for alternative approaches to a cure, including renewed efforts to develop a therapeutic vaccine to stimulate immune system cells that attack and kill all HIV.

Lead study investigator and Johns Hopkins postdoctoral fellow Ya-Chi Ho, M.D., Ph.D., said the team’s investigation of “the true size” of the latent reservoir was prompted by a large discrepancy between the two established techniques for measuring how much provirus is in immune system cells.

The team’s original method of calculating only reactivated proviruses yielded numbers that were 300-fold lower than a DNA-based technique used to gauge how many total proviral copies, both dormant and reactivated, are present.

“If medical researchers are ever going to lure out and reactivate latent HIV, then we need to better understand exactly how much of it is really there,” Ho said.

In the current study, the researchers sequenced, or spelled out, the entire genetic code of HIV proviruses that reactivated and those that could not be induced to do so. Twenty-five of the 213 non-induced isolates, when sequenced, had fully intact genomes when compared to those that did reactivate. Analysis of the remaining 88 percent of non-induced proviruses showed that all were defective, possessing genetic deletions and mutations that would forestall viral replication.
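A quick arithmetic check of the proportions reported above, using only the figures given in this article:

```python
# Proportions of the 213 sequenced non-induced proviruses, per the figures above.
total_noninduced = 213
intact = 25

print(f"Intact genomes:    {intact / total_noninduced:.1%}")                       # ~11.7%, the "some 12 percent"
print(f"Defective genomes: {(total_noninduced - intact) / total_noninduced:.1%}")  # ~88 percent
```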

Additional experiments on the cloned proviruses showed that the intact, non-induced proviruses could be reconstructed to produce active virus, which in turn could replicate in human immune cells.

The researchers also found that cloned proviral DNA lacked a latency-inducing chemical methyl group. When researchers looked at where non-induced proviral DNA showed up in infected human immune cells, they found some 92 percent of the non-induced proviral DNA was located in actively transcribed regions of the human cell DNA. This finding suggests that non-induced proviral DNA is not permanently hidden in some inaccessible regions of the host chromosomes, but instead lies in regions where it could become reactivated, the researchers said.

Statistical modeling later revealed that these figures equated to a 60-fold increase in the potential size of the latent reservoir when compared to the team’s original method for counting only reactivated viruses.

Additional experiments showed that repeated chemical stimuli could reactivate proviruses that failed to respond to initial attempts at reactivation.

Ho says the study results, although disheartening, will galvanize experts to refine and improve methods for detecting proviruses capable of reactivation. Siliciano is currently working to organize the “What Will it Take to Achieve an AIDS-free World?” conference, which will take place November 3-5 in San Francisco.

Bee Sting Study Suggests That Allergic Reactions Could Be Defense Mechanisms

[ Watch the Video: The Evolution Of An Allergy ]

redOrbit Staff & Wire Reports – Your Universe Online

While bee stings could be fatal to those who are allergic to the insect’s venom, that type of adverse biological reaction is actually the body’s attempt to protect a person from potentially deadly toxins.

Writing in Wednesday’s edition of the journal Immunity, scientists from the Stanford University School of Medicine and their colleagues describe how they injected mice with a small dose of bee venom, discovering that later the rodents were able to resist a much larger dose of the toxin.

“Innate immune responses occur in subjects exposed to a foreign substance, such as a pathogen or a toxic material like venom, for the first time,” officials from the California-based institution said in a statement. “Immune cells called mast cells, which reside in most of the body’s tissues, are poised to unleash signals that turn on defense responses when a pathogen or toxin intrudes.”

They claim it is the first experimental evidence to suggest that the same immune response associated with allergic reactions could also have evolved in order to help protect an animal from poisons. Previous research conducted by the same scientists found that these mast cells produce enzymes capable of detoxifying snake venom, and that they can also enhance an individual’s natural resistance to bee venom.

Prior immunization is not required for these types of innate immune responses, nor is the development of specific antibodies. In contrast, during an adaptive immune response, the immune system creates antibodies that can recognize the toxin or pathogen invading the animal’s system. Adaptive immunity is typically a faster, more specific and more effective form of defense than innate immunity, the research team explained.

“Our study adds to the argument that allergy evolved to protect us from noxious factors in the environment – it protects us by making us sneeze, cough, vomit, and itch, by inducing a runny nose and tears,” said Ruslan Medzhitov of Yale University School of Medicine, one of the researchers involved in the study. “All of these reactions are designed to expel something harmful from the body. They are unpleasant, but they protect by being unpleasant.”

“Everyone who ever witnessed or even experienced an anaphylactic reaction to a bee or a wasp sting will wonder why evolution did not get rid of such a potentially deadly immune reaction,” added Martin Metz of Charité-Universitätsmedizin Berlin. “We have now shown in mice that the development of IgE antibodies to honeybee venom and also to the venom from a poisonous snake can protect mice to some degree from the toxic effects of the venoms.”

IgE antibodies, a type of immunoglobulin involved in allergic reactions, bind to the surface of mast cells and cause them to initiate an adaptive immune response when exposed to toxins, co-lead author and postdoctoral researcher Thomas Marichal said.

Despite assumptions that most IgE-related responses are negative in nature, the research team speculated that they might be positive in some situations because otherwise, they would have been eliminated through evolution. Marichal’s team hypothesized that IgE could be essential for protection against lethal stings, and that allergic reactions were “an extreme, and maladaptive, example of this type of defense.”

In order to test their hypothesis, the scientists began by injecting mice with a low dose of venom equal to just one or two bee stings. The mice began to develop more venom-specific immune cells, as well as higher IgE antibody levels, in comparison to rodents that had been injected with just a salt solution.

Three weeks later, both groups of mice were injected with a potentially fatal dose of venom equal to five bee stings. According to the researchers, the mice that had been immunized experienced less hypothermia and were three times more likely to survive than the control group. They were also less likely to experience the anaphylaxis characteristic of severe allergic reactions, the study authors said.

“To determine whether IgE antibodies were required for this protection, the team tested mice with three types of mutations: mice without IgE, mice without functional IgE receptors on their mast cells, and mice without mast cells,” Stanford said. “In all three groups of mutant mice, pre-immunization with a low dose of bee venom did not confer protection against a lethal dose, suggesting that the protection depends on IgE signaling and mast cell activation.”

Shifting Jellyfish Numbers Linked To Climate Fluctuations

Brett Smith for redOrbit.com – Your Universe Online

Every month, 8 to 12 days after each full moon, Waikiki Beach is invaded by large numbers of box jellyfish. After witnessing the phenomenon countless times, Honolulu lifeguard Landy Blair, in conjunction with researchers at the University of Hawaii at Mānoa, started tracking the numbers of jellyfish that invade the beach with each cycle.

More than 170 full moons after they began, the team’s findings, published in the latest issue of PLOS ONE, reveal the cycles of jellyfish behavior and how they relate to broader climatic conditions.

“Although there have been long-term studies of jellyfish abundance and climate in recent years, none have looked at box jellyfish species,” said Luciano Chiaverano, a member of the UH Mānoa study team. “This is quite surprising, as box jellyfish are among the most venomous animals in the world. Often their habitat overlaps with human recreation, resulting in painful, sometimes even lethal, stings and causing beach closures at various locations around the world.”

“Our box jellyfish collection data is the longest continual time-series census of a cubozoan species in the world, and provides a rich data set to analyze and assess physical and biological oceanographic correlations,” said Angel Yanagihara, a neurobiologist at the University of Hawaii.

The study essentially confirms Blair’s early observations: box jellyfish arrive in Waikiki with regular, predictable timing based on the lunar cycle. However, the size of the monthly influx varied substantially, with no predictable seasonality. In one section of the beach, an average of nearly 400 jellyfish arrived each lunar month, with numbers ranging from 5 to almost 2,400 individuals per event.

While the total number of jellyfish coming to Waikiki showed no overall trend during the past 14 years, it did follow an oscillating pattern of increases and decreases, with each phase lasting approximately four years. According to study researchers, climate fluctuations played a major role in food availability, ultimately affecting the numbers of invading jellyfish.

Study researchers Brenden Holland and Jerry Crow analyzed three major climatic indexes, 13 oceanographic variables and seven local weather parameters in an attempt to make sense of the cyclical jellyfish pattern. Although they did not see a noteworthy relationship between jellyfish counts and weather parameters, the counts showed a strong, positive relationship with a decadal-scale climatic measurement specific to the subtropical Pacific known as the North Pacific Gyre Oscillation index (NPGO), as well as with primary production and the abundance of small zooplankton.

The authors concluded that the number of box jellyfish arriving at the beach is probably determined by bottom-up processes: the higher the NPGO value, the greater the transport of nutrient-rich waters from the northern Pacific into the waters around Hawaii. This influx may boost regional primary production, resulting in more zooplankton for the jellyfish to feed on.
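For readers curious about how such a relationship might be examined, the minimal sketch below computes a Pearson correlation between monthly jellyfish counts and NPGO index values. The arrays are hypothetical, invented purely for illustration; the study’s actual data and statistical methods are not reproduced here.

```python
import numpy as np

# Hypothetical monthly box jellyfish counts for one beach section and the
# corresponding NPGO index values -- invented numbers for illustration only.
jellyfish_counts = np.array([120, 310, 450, 980, 2100, 1500, 600, 240, 90, 35, 400, 760])
npgo_index = np.array([-0.8, -0.2, 0.1, 0.9, 1.8, 1.4, 0.5, -0.1, -0.9, -1.2, 0.3, 0.8])

# Pearson correlation coefficient between the two series; a strongly positive
# value would be consistent with the bottom-up relationship described above.
r = np.corrcoef(jellyfish_counts, npgo_index)[0, 1]
print(f"Pearson r = {r:.2f}")
```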

“Jellyfish are known to have increased growth rates and reach larger adult sizes in response to increased food availability, and because body size positively correlates with fecundity in jellyfish, more eggs and more larvae are produced when food is readily available,” Chiaverano said.

Predicting changes in jellyfish populations over time is difficult due to sampling issues, the lack of historical records, and the bizarre characteristics of the jellyfish life cycle. Some research has indicated that jellyfish populations are affected by large-scale climate changes and regional environmental conditions.

Movie Of Sun’s Canyon Of Fire Released By NASA

[ Watch The Video: Canyon of Fire on the Sun ]
NASA
A magnetic filament of solar material erupted on the sun in late September, breaking the quiet conditions in a spectacular fashion. The 200,000-mile-long filament ripped through the sun’s atmosphere, the corona, leaving behind what looks like a canyon of fire. The glowing canyon traces the channel where magnetic fields held the filament aloft before the explosion. Visualizers at NASA’s Goddard Space Flight Center in Greenbelt, Md., combined two days of satellite data to create a short movie of this gigantic event on the sun.
In reality, the sun is not made of fire, but of something called plasma: particles so hot that their electrons have boiled off, creating a charged gas that is interwoven with magnetic fields.
These images were captured on Sept. 29-30, 2013, by NASA’s Solar Dynamics Observatory, or SDO, which constantly observes the sun in a variety of wavelengths.
Different wavelengths help capture different aspects of events in the corona. The red images shown in the movie help highlight plasma at temperatures of 90,000° F and are good for observing filaments as they form and erupt. The yellow images, showing temperatures of 1,000,000° F, are useful for observing material coursing along the sun’s magnetic field lines, seen in the movie as an arcade of loops across the area of the eruption. The browner images at the beginning of the movie show material at temperatures of 1,800,000° F, and it is here where the canyon of fire imagery is most obvious. By comparing this with the other colors, one sees that the two swirling ribbons moving farther away from each other are, in fact, the footprints of the giant magnetic field loops, which are growing and expanding as the filament pulls them upward.
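For reference, the Fahrenheit temperatures quoted above can be converted to kelvin, the unit usually used for coronal plasma, with the standard formula K = (F - 32) * 5/9 + 273.15. The short sketch below performs the conversion; the color labels are simply those named in this article, not official SDO channel designations.

```python
# Convert the coronal temperatures quoted above from Fahrenheit to kelvin.
# The labels are the colors named in the article, not official SDO channel names.
for label, temp_f in [("red", 90_000), ("yellow", 1_000_000), ("brown", 1_800_000)]:
    temp_k = (temp_f - 32) * 5 / 9 + 273.15
    print(f"{label}: {temp_f:,} F is roughly {temp_k:,.0f} K")
```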

Facing Adversity In Early Life And Its Connection To Fibromyalgia

Studies of the impact of early life events have really begun to blossom over the past few years.

How early childhood helps to shape a person’s long-term physiology and behavior is under constant investigation.

Fibromyalgia can be described as a disorder that increases the pain sensitivity of those it affects, in addition to causing a number of other co-morbidities. The root cause of the disorder is still unknown to medicine.

The causes or factors that may affect the development of fibromyalgia have not been verifiably identified. Many theories exist, but none is supported by enough evidence to be considered established fact. The disorder currently affects over two percent of the population and is more commonly found in women than in men.

The symptoms often associated with fibromyalgia include muscular stiffness, chronic fatigue, mood disturbances, sleep disorders such as insomnia, and tenderness at specific locations on the body. Several risk factors, along with their proposed pathophysiologies, have been documented that may lead to the development of fibromyalgia later in life.

Risk Factors

One risk factor that has been under quite a bit of examination is painful experiences during infancy. Such experiences of pain could cause long-lasting alterations in the way the brain processes pain. Infants who are born with illnesses or under tremendous birthing stresses, such as being born prematurely or needing immediate surgery, may have to be hospitalized for treatment.

In neonatal intensive care, newborns can be subjected to many painful procedures, some of which may be performed daily as part of routine monitoring.

Additionally, hospitalized newborns in these settings could face up to fourteen very painful procedures per day. One recent study reported that these newborns faced a daily average of sixteen painful or stressful procedures, and that close to 80 percent of those newborns were not given proper analgesia for these types of procedures.

Because these procedures may continue for months at this high frequency, affected children can experience significant changes in how their brains process pain. These abnormalities could help explain the abnormal pain processing seen in patients with fibromyalgia.

Physiological disturbances can be prevented if appropriate pain management is provided to neonatal patients. Because neonates cannot tell the doctors tending to them how much pain they are in or what they are experiencing, however, pain management can be quite difficult to accomplish.

The treatments available today include morphine and some benzodiazepines, which can be used for post-surgical pain and for general sedation in neonates. There are also some nonpharmacological techniques that can be administered to neonates, including oral sucrose.

However, the long-term effects of chronic exposure to these treatments have not been studied in much depth. It can still be argued that adequate pain management for neonates may reduce various pain syndromes, including fibromyalgia, later in childhood and adult life.

Maternal Deprivation

Detailed tests and experiments have highlighted maternal deprivation as a major risk factor for the future development of fibromyalgia. The quality of a child’s relationship with the primary caregiver, often the mother, shapes the child’s emotional reactivity throughout life and the types of attachments he or she will form with others.

Many studies have shown that a secure bond between mother and child is very beneficial to an infant’s development. Studies also suggest a consistent link between disordered attachment styles and both chronic pain and difficulty coping with chronic pain. In some studies, chronic pain patients with high levels of what is called avoidant attachment tended to rate their pain intensity more highly.

Patients with fearful attachment styles tended to report higher levels of pain, unbearable in some cases, while adults with secure attachment styles taking acute pain tests tended to rate their pain as less intense. The formation of secure attachment has been theorized to be directly linked to the opioidergic and dopaminergic systems, a link suggesting that when a parent and infant are securely attached, the child could be protected against developing fibromyalgia later in life.

Physical and Psychological Trauma in Early Childhood

Many studies have claimed that children who were subjected to physical or sexual abuse are at greater risk of developing fibromyalgia. These and other forms of early life abuse often burden those who suffer through them with a range of behavioral problems and pathological issues, such as severe depression, post-traumatic stress disorder, substance abuse, alcoholism, obesity and even suicide.

Perhaps not by coincidence, these same problems are frequently found in, and attributed to, fibromyalgia. Studies have found links between early childhood abuse and other early traumas and the development of fibromyalgia. These links appear to run through the disruption of neurotransmitter systems that affect how well someone is able to manage stress.

When considering the connections between early childhood traumas, abuse, neonatal complications, the bonds shared between a child and his or her caretaker, and fibromyalgia, there is much to be discussed. While the evidence has not mounted to the point where any of these theories can be proven as fact, these studies and the connections they draw can serve as starting points for further research.

The search for the primary factors in the development of fibromyalgia is constantly pressing forward. Researchers are working tirelessly to answer the questions that individuals suffering from fibromyalgia are asking. Once the causes of fibromyalgia have been identified, researchers may be able to develop a cure for the disorder.

Early-Life Exposure Of Frogs To Herbicide Increases Mortality From Fungal Disease

The herbicide atrazine increased mortality from chytridiomycosis, a disease causing worldwide amphibian declines

The combination of the herbicide atrazine and a fungal disease is particularly deadly to frogs, shows new research from a University of South Florida laboratory, which has been investigating the global demise of amphibian populations.

USF Biologist Jason Rohr said the new findings show that early-life exposure to atrazine increases frog mortality but only when the frogs were challenged with a chytrid fungus, a pathogen implicated in worldwide amphibian declines. The research is published in the new edition of Proceedings of the Royal Society B.

“Understanding how stressors cause enduring health effects is important because these stressors might then be avoided or mitigated during formative developmental stages to prevent lasting increases in disease susceptibility,” Rohr said.

The study was conducted by Rohr and Lynn Martin, Associate Professors of USF’s Department of Integrative Biology; USF researchers Taegan McMahon and Neal Halstead; and colleagues at the University of Florida, Oakland University, and Archbold Biological Station.

Their experiments showed that a six-day exposure to environmentally relevant concentrations of atrazine, one of the most common herbicides in the world, increased frog mortality 46 days after the atrazine exposure, but only when frogs were challenged with the chytrid fungus. This increase in mortality was driven by a reduction in the frogs’ tolerance of the infection.

Moreover, the researchers found no evidence of recovery from the atrazine exposure and the atrazine-induced increase in disease susceptibility was independent of when the atrazine exposure occurred during tadpole development.

“These findings are important because they suggest that amphibians might need to be exposed to atrazine only briefly as larvae for atrazine to cause persistent increases in their risk of chytrid-induced mortality,” Rohr said. “Our findings suggest that reducing early-life exposure of amphibians to atrazine could reduce lasting increases in the risk of mortality from a disease associated with worldwide amphibian declines.”

Until this study, scientists knew little about how early-life exposure to stressors affected the risk of infectious diseases for amphibians later in life.

“Identifying which, when, and how stressors cause enduring effects on disease risk could facilitate disease prevention in wildlife and humans, an approach that is often more cost-effective and efficient than reactive medicine,” Rohr said.

The findings are also the latest chapter in research Rohr and his lab have conducted on the impact of atrazine on amphibians. They are consistent with earlier studies concluding that, while the chemical typically does not directly kill amphibians and fish, there is consistent scientific evidence that it negatively affects their biology, including their growth and their immune and endocrine systems.

Children Who Are Taught An Art May Lead Future Of Innovation

[ Watch the Video: Your Little Picasso Could Become The Next Edison ]

Brett Smith for redOrbit.com – Your Universe Online

While politicians and policy experts seem to be pushing for more students to embrace math and science at the expense of the arts, a new study from Michigan State University (MSU) supports the notion that an artistic education has so-called ‘real world’ value.

According to the study, which was published in the journal Economic Development Quarterly, people who participated in arts activities as a child were more likely to generate patents and launch businesses as adults.

The interdisciplinary team of study authors reached their findings by tracking MSU Honors College graduates from 1990 to 1995, who majored in science, technology, engineering or mathematics (STEM). The research team discovered that those who own businesses or patents in their study cohort received up to eight times more experience with the arts as children than the average person.

“The most interesting finding was the importance of sustained participation in those activities,” said Rex LaMore, director of MSU’s Center for Community and Economic Development. “If you started as a young child and continued in your adult years, you’re more likely to be an inventor as measured by the number of patents generated, businesses formed or articles published. And that was something we were surprised to discover.”

Music lessons seem to be particularly prevalent among the group: the researchers found that 93 percent of the STEM graduates had some kind of musical training at some point in their lives. According to the National Endowment for the Arts, only 34 percent of adults have ever received musical training.

The MSU cohort also reported higher-than-average participation in the arts, such as acting, dance and creative writing. Those who had experience with metal work and electronics during childhood had a 42 percent greater chance of owning a patent than those without that type of experience. Those with architectural experience were nearly 88 percent more likely to form a company, and children with exposure to photography were 30 percent more likely to have a patent filed.

According to study researchers, an artistic background sets the stage for non-conventional thinking. In fact, the study cohort said they had used ‘artistic’ skills – such as analogies and imagination – to solve problems in their chosen field.

“The skills you learn from taking things apart and putting them back together translate into how you look at a product and how it can be improved,” said Eileen Roraback, of MSU’s Center for Integrative Studies in the Arts and Humanities (CISAH). “And there’s creative writing. In our study, a biologist working in the cancer field, who created a business, said her writing skills helped her to write business plans and win competitions.”

The researchers went on to say that their findings could be useful in the context of future education policy as the worldwide economy undergoes an innovation-based revolution.

“Inventors are more likely to create high-growth, high-paying jobs in our state, and that’s the kind of target we think we should be looking for,” LaMore said. “So we better think about how we support artistic capacity, as well as science and math activity, so that we have these outcomes.”

First Ever Baby Cured Of HIV Still In Remission 18 Months Later

Lawrence LeBlond for redOrbit.com – Your Universe Online

An HIV case first documented in a Mississippi baby 18 months ago continues to show that early antiviral treatment is effective not only in treating the virus that causes AIDS, but possibly in curing it.

Earlier this year, researchers from Johns Hopkins University (JHU) reported that a child born with HIV and treated with a series of antiviral drugs showed signs of remission within days of initial treatment. The child was administered antiretroviral therapy (ART) for the next 18 months before ultimately being taken off the drugs.

The researchers, led by Deborah Persaud, MD, of JHU, conducted a follow-up in late 2012 when the child was 23 months of age and found that, after conducting a battery of tests, the infant was in full remission with no visible signs of HIV in the body. They presented their findings at the 2013 Conference on Retroviruses and Opportunistic Infections in Atlanta, Georgia.

REMISSION CONTINUES

Now, more than 6 months later, the same researchers have conducted another follow-up and are happy to report that the child, now 3 years old, is still free of active infection 18 months after all treatment ceased. A new report on the case has been published in the Oct. 23 issue of the New England Journal of Medicine (NEJM).

“Our findings suggest that this child’s remission is not a mere fluke but the likely result of aggressive and very early therapy that may have prevented the virus from taking a hold in the child’s immune cells,” says Dr. Persaud, a virologist and pediatric HIV expert at Johns Hopkins Children’s Center (JHCC), who has been handling the case since the child was born.

Persaud has worked on this case with immunologist Katherine Luzuriaga, MD, of the University of Massachusetts Medical School, and pediatrician Hannah Gay, MD, of the University of Mississippi Medical Center, who identified and treated the baby and continues to see the child.

“We’re thrilled that the child remains off medication and has no detectable virus replicating,” Gay says. “We’ve continued to follow the child, obviously, and she continues to do very well. There is no sign of the return of HIV, and we will continue to follow her for the long term.”

The child was born to an HIV-infected mother and was administered ART within 30 hours of birth. A series of tests in the subsequent days and weeks showed the ART was continuing to diminish the overall presence of the virus in the child’s blood, until it reached undetectable levels 29 days after birth. At 18 months of age, the child was lost to follow-up for nearly five months, and ART stopped; but when checked after another five months, testing could still not detect virus in the bloodstream.

Persaud and colleagues say this one child’s experience provides “compelling evidence” that infants infected with HIV can achieve remission when ART begins within hours or days of birth or whenever infection begins. Based on the information from this case, a federally-funded study is set to begin in 2014 to test early-treatment ART to determine whether the approach is feasible for all HIV-infected newborns.

Persaud and colleagues maintain that swift intervention likely led to the child’s remission because it halted the formation of hard-to-treat viral reservoirs – dormant HIV hiding in immune cells that can reignite in  patients within weeks of halting drug therapy.

“Prompt antiviral therapy in newborns that begins within hours or days of exposure may help infants clear the virus and achieve long-term remission without the need for lifelong treatment by preventing such viral hideouts from forming in the first place,” Persaud says.

BERLIN PATIENT

As previously reported by redOrbit, as well as other media sources, this is not the first case of remission of HIV in a person.

Timothy Brown, widely known as the “Berlin patient,” is the only other known person to be effectively cured of HIV. However, unlike with the Mississippi child, who was administered 18 months of ART, Brown was cured through a different means altogether.

In 2006, while receiving treatments for his HIV infection, Brown was diagnosed with leukemia. His cancer was subsequently treated with a stem-cell transplant from a person who was born with a genetic mutation causing immunity to HIV infection. Following that transplant, Brown was able to halt all HIV treatments without experiencing a return of HIV infection.

However, Brown’s case was unique, and it is unlikely such procedures would be feasible for the broader population of HIV patients due to the cost of such treatments. Also, a stem-cell transplant is typically viewed as necessary only in a cancer scenario where oftentimes the only other possible outcome is death, whereas HIV can usually be managed through antiviral therapy, allowing patients to lead otherwise normal lives.

In the Mississippi child, continuing tests for HIV-specific antibodies remain negative to date, as do tests to detect the presence of cytotoxic (killer) cells that the body deploys to destroy viral invaders and whose presence indicates active infection. Highly sensitive tests designed to hunt down trace amounts of the virus have detected intermittent viral footprints in the child’s system, says Persaud. However, these footprints appear incapable of forming new viral infections.

Furthermore, the child also exhibits none of the immune characteristics seen in the so-called “elite controllers,” a tiny percentage of HIV-infected people whose immune systems allow them to naturally keep the virus in check without treatment. Such people have immune systems that are revved up to suppress viral replication. So it is clear that early ART, rather than natural immune mechanisms, led to the child’s remission, the authors report.

PREEMPTIVE MEASURES

Currently, newborns at high risk of HIV infection, either because the mother’s infection is poorly controlled or because the mother is diagnosed with HIV at the time of delivery, are administered a preemptive combination of antivirals to prevent infection. ART treatment does not start until an actual infection is confirmed. While this prophylactic approach is significant in preventing at-risk infants from acquiring HIV, it does nothing for those who are already infected with the virus. These infants would likely stand to benefit greatly from prompt ART, as did the Mississippi baby.

“This case highlights the potential of prompt therapy to lead to long-term remission in those already infected by blocking the formation of the very viral reservoirs responsible for rekindling infection once treatment ceases,” says Luzuriaga, senior author of the NEJM report. “This may be particularly true in infants, whose developing immune systems may be less amenable to the formation of long-lived virus-infected immune cells.”

“Indeed, recent studies in HIV-infected infants have shown a marked reduction in the numbers of circulating virus-infected cells when babies are treated during the first few weeks of infection. Research has also shown that many hard-to-eradicate viral reservoirs begin to form very early, within weeks of infection. Taken together, these findings mean that the window of opportunity to achieve remission may close very quickly,” according to a JHCC statement.

Persaud and her colleagues maintain that despite the significance of treating HIV-infected newborns early on, preventing mother-to-child transmission remains the primary public health goal. The report authors caution that the approach is still considered preliminary and more research is needed to confirm the efficacy of such treatments. Also, they advise that children with confirmed HIV infection should not be taken off antiviral treatment.

According to background information in the study, nearly 3.3 million children around the world live with HIV infection. More than 260,000 acquire the virus from their mothers during delivery despite advances in preventing mother-to-child infection.

Chimpanzees Use Long-Term Memory To Forage

Lee Rannals for redOrbit.com – Your Universe Online

Chimpanzees rely on their long-term memory when searching for food throughout the rainforest, according to a new study published in the journal Animal Behavior.

Researchers studying whether chimpanzees aim their travel to particular rainforest trees to check for fruit found that the animals use long-term memory to remember the size and location of fruit trees and feeding experiences from previous seasons.

The team recorded the behavior of five chimpanzee females for continuous periods of four to eight weeks throughout multiple fruiting seasons in the Taï National Park, Côte d’Ivoire. The research period lasted for a total of 275 complete days. They also analyzed nearly 16,000 potential food trees with different crown sizes that were actually approached by the chimpanzees.

The scientists found that the animals fed on significantly larger trees than on other reproductively mature trees of the same species, especially if their fruits emitted an obvious smell.

Chimpanzees in the study checked most trees along the way during travel, but 13 percent were approached in a goal-directed manner. These target approaches were not initiated by visual cues and they occurred more often when females foraged alone and when trees were large.

Researchers determined that the chimpanzees were being guided by long-term memory of the location of large potential food trees. One observation in particular showed a chimpanzee was able to remember feeding experiences across fruiting seasons. The team said that long-term phenological data on individual trees indicated that the interval between successive fruiting seasons and the “memory window” of chimpanzees required for effective monitoring activities varied from two months to three years.

“The present study on chimpanzees is the first to show that our close relatives use long-term memory during their search for newly produced tropical fruit, and remember feeding experiences long after trees have been emptied,” said Karline Janmaat from the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany.

Christophe Boesch of the Max Planck Institute for Evolutionary Anthropology added that for a long time, people claimed that animals cannot remember the past. However, the researchers’ work shows this just isn’t true.

“This study helps us to understand why chimpanzees and other primates should remember events over long periods in time. And guess what? It also shows they do,” Boesch concluded.

Higher Blood Sugar Levels Linked To Impaired Memory

redOrbit Staff & Wire Reports – Your Universe Online

Individuals who have blood sugar at the lower end of the normal range perform better on memory tests than non-diabetics with higher glucose levels, according to new research published online Wednesday in the peer-reviewed medical journal Neurology.

Lead author Agnes Flöel of Charite University Medicine in Berlin and her colleagues recruited 141 individuals with an average age of 63, according to Nanci Hellmich of USA Today. None of those men and women had type 2 diabetes or prediabetes, and none of them had demonstrated any signs of cognitive or memory-related impairments.

“The study participants took a series of memory tests and had their blood sugar tested. They also had brain scans to measure the size of the hippocampus area, which plays an important role in memory,” Hellmich said. “The findings showed that chronically higher blood glucose levels exert a negative influence on memory.”

Specifically, the subjects were asked to review a list of 15 words, and then recall them 30 minutes later, according to Los Angeles Times reporter Mary MacVean. The ability to recall fewer words was associated with higher blood sugar levels, she said, and those individuals also were found to have less volume in the hippocampus – a region of the brain associated with both short- and long-term memory.
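As an illustration of the kind of association being described, the sketch below fits a simple linear trend between blood glucose and word recall using numpy. The values are hypothetical, invented for illustration only; the study’s actual analysis, which also adjusted for other factors, is not reproduced here.

```python
import numpy as np

# Hypothetical data: blood glucose (mg/dL) and words recalled out of 15
# after a 30-minute delay -- invented values for illustration only.
glucose = np.array([82, 86, 90, 95, 99, 103, 108, 112])
recall = np.array([11, 12, 10, 9, 9, 8, 7, 7])

# Least-squares linear fit: a negative slope would mirror the reported
# association between higher glucose levels and fewer words recalled.
slope, intercept = np.polyfit(glucose, recall, 1)
print(f"slope = {slope:.2f} words per mg/dL, intercept = {intercept:.1f}")
```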

“Flöel says the findings suggest that even for people within the normal range of blood sugar, lowering their levels might be a possible way to prevent memory problems as they age,” Hellmich said. “She points out that the study is relatively small and doesn’t prove cause and effect. There’s a need for large clinical trials to test whether lowering glucose will help with the prevention of dementia.”

“Earlier research has shown ‘deleterious effects of diabetic glucose levels on brain structure, particularly the hippocampus,’ the researchers wrote,” according to MacVean. She added that they also explained that “impaired glucose tolerance and Type 2 diabetes also have been associated with lower cognitive function and a higher incidence of dementia, including Alzheimer’s disease.”

Robert Ratner, the chief scientific and medical officer with the American Diabetes Association, told USA Today that the results of the study only show an association between glucose levels and memory, not necessarily a causal relationship.

He explained that they have not shown that memory loss is caused by higher glucose levels, or that reducing blood sugar would improve recall. Even so, Ratner noted that it was “not surprising that glucose levels can potentially have these kinds of negative impacts. The risk of dementia is higher in people with diabetes. It has been well established that elevated glucose impacts brain function and recovery in people following a stroke.”

Big Story Weather – October 24, 2013

redOrbit Meteorologist Joshua Kelly
Big Story Weather from October 23:
A few snow showers moved through the Northern Plains and the Great Lakes yesterday. The biggest story of the day, however, was the cold air that plunged into the Southeast and also portions of the Gulf Coast.

Big Story Weather Discussion for October 24:
Surface Map 11AM: A clipper system will move through the Great Lakes bringing light snowfall to the region. High pressure will build in over the Plains bringing partly cloudy skies and cold air to the region. Another area of high pressure will settle in over the Northern Rockies bringing cool air to the region. Mild air will be over the West Coast. The Gulf Coast will also see partly cloudy skies and cool weather.
Surface Map 11PM: The clipper system will advance into the Northeast bringing a few rain showers to the region. High pressure will settle in over the rest of the US bringing cold air all the way down into the Gulf Coast and Southeast. The Plains will be cold as well and the West Coast will see partly cloudy skies and mild weather.
Severe Weather: No severe weather for today.
Winter Weather: There will be light snow over the Great Lakes. Snowfall rates should remain below 2 inches. The frost and freeze line will extend all the way down into Northern Mississippi and Alabama along with the Southeast. The frost and freeze line will then extend up into the Mid-Atlantic and back towards the Ohio River Valley.
Flooding: No flooding expected today across the country.
Tropical Weather: Tropical Depression Lorenzo has winds around 30kts and pressure near 1009mb and will continue to weaken and move away from land.
Winds: Strong gradient winds will be over the Great Lakes today with winds around 30-40mph creating large waves over the Western Lakes especially.

Today’s Spotlight Cities Forecast:
Baltimore MD: Partly cloudy with a high near 51F and overnight lows near 33F. Friday partly cloudy with a high near 52F and overnight lows near 32F. Saturday partly cloudy with high temps near 53F and overnight lows near 37F. Sunday partly cloudy with high temps near 55F and overnight lows near 37F. Monday partly cloudy with high temps around 58F and overnight lows near 40F.
Gulfport MS: Partly cloudy with a high near 74F and overnight lows near 49F. Friday partly cloudy with high temps around 66F and overnight lows near 45F. Saturday partly cloudy with a high near 68F and overnight lows near 55F. Sunday partly cloudy with a high near 74F and overnight lows near 64F. Monday partly cloudy with a high near 78F and overnight lows near 69F.
Omaha NE: Partly cloudy with a high near 47F and overnight lows near 32F. Friday partly cloudy with high temps around 56F and overnight lows near 40F. Saturday partly cloudy with high temps near 49F and overnight lows near 33F. Sunday partly cloudy with high temps around 60F and overnight lows near 38F. Monday mostly cloudy with showers. High temps around 54F and overnight lows near 47F. Total rainfall around 0.25 inches along with slight impacts over the area.
Billings MT: Partly cloudy with high temps around 55F and overnight lows near 42F. Friday partly cloudy with high temps near 60F and overnight lows near 38F. Saturday partly cloudy with high temps near 66F and overnight lows near 40F. Sunday mostly cloudy with moderate snowfall. High temps around 59F and overnight lows near 30F. Total snowfall around 1-3 inches with moderate impacts expected. Monday mostly cloudy with periods of heavy snowfall. High temps around 28F and overnight lows near 9F. Total snowfall around 2-4 inches which will lead to moderate impacts.
Seattle WA: Partly cloudy with a high near 70F and overnight lows near 46F. Friday partly cloudy with high temps around 64F and overnight lows near 40F. Saturday partly cloudy with high temps around 65F and overnight lows near 40F. Sunday partly cloudy with high temps around 60F and overnight lows near 37F. Monday partly cloudy with high temps around 57F and overnight lows near 33F. There will be a slight impact in the area for areas of frost.
Ask The Weatherman for October 23, 2013:
Question: What is meant by the term “upwelling?”
Answer: When we talk about upwelling, we are usually referring to an ocean process in which colder water from the deeper ocean is forced upward to the surface, replacing the warmer water there. This process is most common along the west coasts of continents in both hemispheres, where prevailing winds push the warm surface water away from shore and allow the colder water below to rise.
*** To have your question of the day answered or have your city spotlighted for the day make sure to visit redOrbit on Facebook. ***

October 24, 2013 Storm Tracker Update:
Latest Image: Typhoon Francisco has winds around 65kts and is moving northwest. The bottom image is Super-Typhoon Lekima with winds around 130kts. It is currently tracking to the west-northwest.
Eastern Pacific Ocean: Tropical Storm Raymond will start to move away from Western Mexico with strong winds and heavy rainfall. Winds are now around 40kts and pressure near 1003mb. Low pressure moving into Alaska will bring strong winds, heavy rainfall and mountain snows to the region. A clipper system will move through the Plains bringing light snow to the region.
Atlantic Ocean: Tropical Depression Lorenzo has winds around 30kts and pressure near 1009mb. The storm will remain over the open Atlantic and will have no impacts. Low pressure moving into the Northeast will bring a few showers to the region. Low pressure moving through Western Europe will bring showers to the region. Low pressure moving through Western China will bring showers and snow showers to the region. We are also watching an area in the tropics that has pushed off the coast of Africa for possible tropical development.
Western Pacific: Low pressure moving through Eastern Russia will bring snow showers to the region. Super Typhoon Lekima has winds around 130kts and pressure near 926mb and will continue to move towards the west-northwest and be near the island of Iwo Jima (Iwo To). Typhoon Francisco has winds around 60kts and pressure of 978mb and will continue to move along the Eastern shores of Japan bringing strong winds, heavy rain and storm surge to the region. There is another wave that is beginning to develop to the northeast of Guam that will have to be watched for possible tropical development.
Indian Ocean: There is an area of convection over the Bay of Bengal that is being watched for possible tropical development along with another area over the Central Indian Ocean.
Southern Hemisphere: Low pressure moving through Southern Argentina will bring showers and gusty winds to the region. Low pressure to the southeast of South Africa will bring strong winds to the region. A third low pressure area moving over Southern New Zealand will bring showers and strong winds to the region.
Major Weather Impacts Discussion for October 24, 2013:
Day 1-3: A fast moving clipper system will push across the Great Lakes and into the Northeast bringing light snowfall to the Great Lakes and a few showers to portions of the Northeast. High pressure will build in over the Northern Plains and extend into the Southeast and Gulf Coast bringing some cooler weather to the region. High pressure over the Rockies will keep the West dry and the West Coast mild and dry. Day two will have more of the same with high pressure dominating most of the country. A weak frontal boundary will clip through the Great Lakes bringing a few snow showers to the region. The period will end with the system moving into the Northeast bringing rain and snow showers to the area. High pressure will dominate the weather from the Northern Plains to the Southeast and Gulf Coast. A weak upper level feature will bring a few showers to portions of Texas. The West Coast will remain dry.
Day 4-7: The period will start with high pressure extending from the Northeast all the way back into the Gulf Coast and Southern Plains. The next winter storm will begin to push into the Northern Rockies bringing rain and heavy snowfall to the region. Day five high pressure will be over the Northeast extending back into the Gulf Coast, while the storm system moves into the Central Plains bringing heavy rainfall and also heavy snowfall to the Plains and the Northern Rockies. Day six the storm system will move into the Great Lakes bringing rain showers to the region. The cold front will extend into the Gulf Coast and Texas bringing rain and thunderstorms to the area. Some severe weather may be possible. Snowfall will be found over Colorado. The period will end with the low pressure in the Great Lakes bringing snow to the north, heavy rain to the Ohio River Valley and a strong squall line to the Gulf Coast.
Day 8-12: The period will start with the low pressure pushing into the Northeast bringing rainfall to the region. Some thunderstorms will be possible in the Carolinas. High pressure will be over the Great Lakes to the Gulf Coast. Another weak system will bring some light snowfall to the Northern Rockies. Day ten will have a few snow showers moving over the Great Lakes, otherwise high pressure will dominate most of the country again with partly cloudy skies and cool weather. The period will finish with high pressure over both coasts and a weak frontal boundary moving through the Plains bringing another shot of showers and snow showers to the Northern Plains and Western Lakes.
Long Range Outlook: The period will start with a frontal boundary moving across the Great Lakes and Ohio River Valley. Look for showers and a few snow showers to dominate the area. A weak storm system will enter the Northern Rockies bringing another shot of rain and snow to the area. During the middle of the period high pressure will be along the East Coast. A new strong storm system will be entering the Plains with showers and thunderstorms over the Southern Plains and rain and possibly heavy snowfall over the Northern Plains. The period will end with a storm system moving into the Great Lakes and also the Northeast bringing rain and some snowfall to the region. Another strong burst of cold air will plunge into the Plains as high pressure builds back in. A new storm system will enter the Pacific Northwest bringing with it showers and higher elevation snows.

Green Energy Report for October 24:
Wave Energy: There will be moderate amounts of energy along the Mid-Atlantic to the Northeast ahead of the cold front. There will be slight to moderate amounts of energy for both the Southeast and into the Gulf Coast. The entire West Coast will see moderate amounts of energy as the pressure gradient between the high and the next low keeps strong winds over the waters.
Hydro Energy: There will be some periods of light rain over the Eastern Lakes and the Northeast which will bring some short term energy to the region.
Solar Energy: There will be ample amounts of solar energy from the Southeast to the Southern Plains, further west into the Southwest and all along the West Coast, back into the Rockies and the Northern Plains.
Wind Energy: There will be moderate amounts of wind energy over the Great Lakes behind the frontal boundary and also over portions of Northern Florida near the temperature gradient of the waters and the cold dense high pressure center.

Weather and Your Wallet for New York NY:
A clipper system to the west will bring increased clouds by later in the afternoon along with moderate westerly winds.
Dining: There will be no issues with enjoying a lunch at the park or eating outside at the many diners.
Transportation: There will be no weather issues along the major roadways or at the airports.
Shopping: Today will be a nice day to hit the stores and enjoy the fall sales.
Electricity: There will be some demand for heating beginning later this evening and lasting through the overnight hours. Around 21 HDDs (heating degree days) are forecast for the day; a quick calculation sketch follows this list.
Yard Work: There will be no forecasted impacts to enjoying time in the back yard today.
Construction: There should be no issues with outdoor projects; however, winds may be a bit stronger on high-rise projects.
Outdoor Venues: A nice day to take a walk in the park or down to the WTC museum.
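For context on the HDD figure above: a heating degree day is conventionally computed against a 65F base using the day's mean temperature. The minimal Python sketch below illustrates only that convention; the sample high and low temperatures are placeholders for illustration, not the official New York forecast.

# Minimal sketch of the standard heating-degree-day (HDD) convention:
# HDDs for a day = max(0, 65F - mean temperature).
# The sample high/low below are placeholders, not the official forecast.

BASE_TEMP_F = 65.0

def heating_degree_days(high_f, low_f):
    """Return HDDs for one day from its high and low temperatures in Fahrenheit."""
    mean_temp = (high_f + low_f) / 2.0
    return max(0.0, BASE_TEMP_F - mean_temp)

# Example: a 54F high and a 34F low give a 44F mean, or about 21 HDDs.
print(heating_degree_days(54, 34))  # -> 21.0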

The Pressure On Oregon To Cover Fibromyalgia Treatments

Oregon is currently the only state whose Medicaid program does not cover the fibromyalgia disorder and the treatment of its symptoms.

In recent weeks, many Oregonians have been trying to change this state Medicaid policy.

On October 10th, doctors and other advocates of the cause will attempt to convince the state’s health policymakers to add fibromyalgia to the Prioritized List, which currently includes 498 conditions and disorders covered by Oregon’s Medicaid program.

One of the reasons that fibromyalgia has not already been included on the Prioritized List in Oregon is that the disorder has a long history of misdiagnosis and misunderstanding. For a long time, fibromyalgia was mistaken for severe cases of depression.

The disorder was long regarded as a psychological problem of sorts. That thinking has changed over the past decade, however. The World Health Organization now recognizes the disorder, which has gained enough prominence that it will be included on the International Classification of Diseases list next year.

Most private insurance companies cover treatment of fibromyalgia, and the Social Security Administration has declared that the disorder can stand as grounds for disability assistance.

The disorder now stands as the third most common pain-related condition. Fibromyalgia affects nearly five percent of women and about one percent of men, a disproportionate skew that has not been explained.

As of today, nearly 117,000 Oregonians suffer from the disorder. Its exact cause is not known, but the central nervous system is known to play at least some role in its development.

The symptoms include extreme fatigue, sleep disorders, severe depression, anxiety, severe headaches and migraines, and, most commonly, widespread chronic pain.

Currently, three medications are administered to fibromyalgia patients; these treatments act on specific areas of the central nervous system. Other, less invasive treatments, including exercise and various forms of physical therapy, are also recommended.

The severity of symptoms may worsen if they are not treated properly. Advocates believe it is important to cover fibromyalgia treatment because proper diagnosis and ongoing management of symptoms are vital to patients’ health.

Patients who rely on Medicaid to help with medical costs have limited options for treating their symptoms.

These patients usually receive opiates as treatment, an approach that is very ineffective and can be dangerous for some patients.

Physical therapy is sometimes recommended, but patients often skip therapy sessions because the cost is too high for their budgets. Medicaid does not cover physical therapy if the reason a patient is receiving it is fibromyalgia.

This has become very frustrating for many Oregonians, who feel the state is living in the past.

Regular Exercise Boosts Teens’ Academic Performance, Especially Girls

[ Watch the Video: Teenagers’ Grades Could Benefit From Regular Exercise ]

Brett Smith for redOrbit.com – Your Universe Online

While the physical benefits of regular activity are well known and widely publicized, some research has found that there are cognitive benefits to regular exercise as well.

In a new study published Tuesday in the British Journal of Sports Medicine, UK researchers found an increase in regular exercise is linked to improved academic performance amongst teens. The association was particularly notable for the apparent benefits that exercise confers on girls in science subjects.

The academic improvements were seen over the long term, with the results indicating a dose-response effect, meaning more intensive exercise produced greater effects on test results.

In the study, which included about 5,000 British children, for every extra 17 minutes boys exercised and every extra 12 minutes girls exercised, researchers saw a marked academic improvement. Children who exercised regularly not only performed better at English, math and science at age 11, but also at 13 and in their exams at 16, the study researchers found.

While the additional physical activity appeared particularly beneficial to girls’ performance in science subjects, the authors said this could simply be a random finding or it could represent gender differences in the impact of physical activity on the brain.

“This is an important finding, especially in light of the current UK and European Commission policy aimed at increasing the number of females in science subjects,” the authors wrote in their report.

In reaching their conclusion, the UK scientists accounted for confounding factors including the children’s birth weight, the mother’s age at delivery, fish oil intake and smoking during pregnancy, whether the child had reached puberty, current weight, and socioeconomic status.

In considering their findings, the authors wondered what might happen if the children increased the amount of moderate to vigorous physical activity they did to the recommended 60 minutes per day.

“If moderate to vigorous physical activity does influence academic attainment this has implications for public health and education policy by providing schools and parents with a potentially important stake in meaningful and sustained increases in physical activity,” they concluded.

Since the study found that every additional 15 minutes of exercise was associated with an average improvement of about a quarter of a grade, the researchers said 60 minutes of daily exercise could improve scores by a full grade, for example from a B to an A.

The researchers concede this was pure speculation, since very few children approached the recommended amount of daily exercise.
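To make the arithmetic behind that extrapolation explicit, the sketch below simply scales the reported quarter-grade-per-15-minutes figure linearly; the linearity is the speculative assumption the researchers acknowledge, not an established result.

# Back-of-the-envelope sketch of the linear extrapolation described above.
# The 0.25-grade-per-15-minutes figure is from the study as reported; treating
# it as linear out to 60 minutes is the researchers' own speculation.

GRADE_GAIN_PER_15_MIN = 0.25

def estimated_grade_gain(extra_minutes_per_day):
    """Estimate the academic gain, in fractions of a grade, for extra daily exercise."""
    return (extra_minutes_per_day / 15.0) * GRADE_GAIN_PER_15_MIN

print(estimated_grade_gain(17))  # extra minutes observed for boys -> roughly 0.28 of a grade
print(estimated_grade_gain(60))  # recommended daily total -> 1.0 full grade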

“Physical activity is more than just important for your physical health,” study author Josie Booth, a psychologist from Dundee University, told BBC News. “There are other benefits and that is something that should be especially important to parents, policy-makers and people involved in education.”

The study author called for additional research that might be able to shed more light on their findings, which could have important implications for public health and education policies.

Molecular Clouds, Larson’s Law And The Mechanics Of Star Formation

Brett Smith for redOrbit.com – Your Universe Online

The molecular clouds that float about the universe hold the ingredients for star formation, and a new study from researchers at the University of California, San Diego has confirmed the mechanics behind three observed relationships describing the internal forces acting within these clouds, called Larson’s Laws.

First postulated in 1981 by Richard Larson, a professor of astronomy at Yale, Larson’s Laws detail the observation-based relationships of the structure and supersonic internal movements of molecular clouds. The new study is based on recent observational data and the results of six computer simulations of the interstellar medium, including impacts of self-gravity, turbulence and magnetic fields. The researchers said their simulations support a turbulent interpretation of Larson’s relationships.
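The article does not write the three relationships out. For readers who want their general shape, the forms most commonly quoted in the literature (descending from Larson’s 1981 paper, whose fitted exponents differ slightly from modern values) can be summarized roughly as:

\sigma_v \propto L^{0.38\ \mathrm{to}\ 0.5}   (internal velocity dispersion grows with cloud size)
\frac{2GM}{\sigma_v^{2} L} \approx 1          (clouds sit close to virial equilibrium)
\bar{n} \propto L^{-1.1}                      (mean density falls with size, implying near-constant column density)

Here \sigma_v is the velocity dispersion, L the cloud size, M its mass and \bar{n} its mean number density; these are approximate textbook forms, not the exact expressions fitted in the new study.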

According to their report in the journal Monthly Notices of the Royal Astronomical Society, the UC San Diego researchers found that all three correlations are due to the same underlying physics: the properties of supersonic turbulence.

“After decades of inconclusive debate about the interpretation of the correlations among molecular cloud properties that I published in 1981, it’s gratifying to see that my original idea that they reflect a hierarchy of supersonic turbulent motions is well supported by these detailed new simulations showing that the debated complicating effects of gravity, magnetic fields, and multiphase structure do not fundamentally alter the basic picture of a turbulent cascade,” said Larson about the new study’s findings.

“This paper is essentially the culmination of seven years of research, aided by the use of large-scale supercomputer simulations conducted at SDSC and elsewhere,” said study author Alexei Kritsuk, a research physicist at UC San Diego. “Molecular clouds are the birth sites for stars, so this paper relates also to the theory of star formation.”

“None of these new findings and insights would have been possible without the tremendous advances in supercomputer simulations that allow not only cosmologists but scientists in countless other domains an unprecedented level of resolution and data-processing speed to further their research,” said study author Michael Norman, who has pioneered the use of advanced computational methods to explore the universe and its beginnings. “We believe that this paper paints the complete picture, drawing from earlier published works of ours as well as presenting new simulations that have not been published before.”

Earlier this month the Hubble Space Telescope discovered another so-called stellar nursery packed with molecular clouds. The discovery was particularly notable because it showed an effect called gravitational lensing.

First predicted by Albert Einstein, a gravitational lens occurs when light from one galaxy is bent, or distorted, by gravity. In the case of the new discovery, the light from the young “starburst dwarf” galaxy is being bent as it passes a nearer galaxy. The first observation of this effect was in 1979, a confirmation of Einstein’s earlier theory. The effect also gives researchers another tool to observe galaxies, the effect of gravity and dark matter.

The image captured by Hubble also showed an “Einstein Ring,” described as “a perfect circle of light that is the projected and greatly magnified image of the distant light source.”

In the image, the closer galaxy is in the center while the projected light forms a ring around it, an extremely rare phenomenon according to Hubble researchers.
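For a simple point-mass lens, an idealization used here only to illustrate the geometry rather than a description of the Hubble analysis, the angular radius of an Einstein ring is given by the standard expression

\theta_E = \sqrt{\frac{4GM}{c^{2}} \, \frac{D_{LS}}{D_L D_S}}

where M is the mass of the foreground lens and D_L, D_S and D_{LS} are the angular-diameter distances to the lens, to the source, and between lens and source. A more massive lens, or a more favorable arrangement of distances, produces a wider ring.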

Breast Milk Protein May Protect Babies From HIV Infection

Duke University Medical Center

A substance in breast milk that neutralizes HIV and may protect babies from acquiring HIV from their infected mothers has been identified for the first time by researchers at Duke Medicine.

The protein, called Tenascin-C or TNC, had previously been recognized as playing a role in wound healing, but had not been known to have antimicrobial properties. The discovery could lead to potential new HIV-prevention strategies.

Reporting in the journal Proceedings of the National Academy of Sciences during the week of Oct. 21, 2013, the researchers describe how the TNC protein in breast milk binds to and neutralizes the HIV virus, potentially protecting exposed infants who might otherwise become infected from repeated exposures to the virus.

“Even though we have antiretroviral drugs that can work to prevent mother-to-child transmission, not every pregnant woman is being tested for HIV, and less than 60 percent are receiving the prevention drugs, particularly in countries with few resources,” said senior author Sallie Permar, M.D., Ph.D., assistant professor of pediatrics, immunology and molecular genetics and microbiology at Duke. “So there is still a need for alternative strategies to prevent mother-to-child transmission, which is why this work is important.”

Worldwide in 2011, an estimated 330,000 children acquired HIV from their mothers during pregnancy or birth, or through breastfeeding, according to UNICEF. As international health organizations have set a goal of eliminating mother-to-child infections, researchers have worked to develop safe and affordable alternatives to antiretroviral therapy that can be used to block HIV transmission to infants.

Permar and colleagues focused on breast milk, which has long been recognized as having some protective quality that inhibits mother-to-child transmission despite multiple daily exposures over months and even years of nursing. Earlier studies had identified some antiviral properties in breast milk, but the majority of the HIV-neutralizing activity of breast milk remained unexplained. More recent studies pointed to a large protein that had yet to be identified.

In their study, the Duke team screened mature milk samples from uninfected women for neutralizing activity against a panel of HIV strains, confirming that all of the detectable HIV-neutralization activity was contained in the high molecular weight portion. Using a multi-step protein separation process, the researchers narrowed the detectable HIV-neutralization activity to a single protein, and identified it as TNC.

“TNC is a component of the extracellular matrix that is integral to how tissues hold themselves together,” Permar said, noting that co-author Harold Erickson, Ph.D., professor of cell biology at Duke, was among the first to identify and describe TNC in the 1980s. “This is a protein involved during wound healing, playing a role in tissue repair. It is also known to be important in fetal development, but its reason for being a component of breast milk or its antiviral properties had never been described.”

Further analysis described how TNC works against HIV by blocking virus entry. The protein is uniquely effective at capturing virus particles and neutralizing the virus, binding specifically to the HIV envelope. These properties provide broad protection against infection.

“It’s likely that TNC is acting in concert with other anti-HIV factors in breast milk, and further research should explore this,” Permar said. “But given TNC’s broad-spectrum HIV-1-binding and neutralizing activity, it could be developed as an HIV-prevention therapy, given orally to infants prior to breastfeeding, similar to the way oral rehydration salts are routinely administered to infants in developing regions.”

Permar said TNC would also appear to be inherently safe, since it is a naturally occurring component of breast milk, and it may avoid the problem of HIV resistance to antiretroviral regimens that complicate maternal/infant applications.

“The discovery of the HIV inhibiting effect of this common protein in breast milk provides a potential explanation for why nursing infants born to HIV-infected mothers do not become infected more often than they do,” said Barton F. Haynes, M.D., director of the Duke Human Vaccine Institute. “It also provides support for inducing inhibitory factors in breast milk that might be even more protective, such as antibodies, that would completely protect babies from HIV infection in this setting.”

Seasonal Drying In The Amazon Has Greater Impact Than Previously Thought

redOrbit Staff & Wire Reports – Your Universe Online

New research suggests that the southern part of the Amazon Rainforest faces a higher risk of dieback (the gradual dying of plant shoots beginning at the tip) as a result of seasonal drying than reported by climate models used by the most recent Intergovernmental Panel on Climate Change (IPCC).

Furthermore, University of Texas at Austin Jackson School of Geosciences professor Rong Fu and her colleagues report that severe loss of rainforest could result in the release of massive amounts of carbon dioxide into the atmosphere, while also disrupting plants and animals in one of the regions of highest biodiversity on Earth.

Fu’s team reviewed 30 years of ground-based rainfall measurements and found that since 1979, the dry season in the southern part of the rainforest has increased by about one week per decade. During this same period, the annual fire season has also increased in length, and the researchers blame global warming for the longer dry season.

“The dry season over the southern Amazon is already marginal for maintaining rainforest. At some point, if it becomes too long, the rainforest will reach a tipping point,” Fu said in a statement. “The length of the dry season in the southern Amazon is the most important climate condition controlling the rainforest. If the dry season is too long, the rainforest will not survive.”

The new paper, published this week in the journal Proceedings of the National Academy of Sciences (PNAS), contradicts forecasts made by the climate models used in the IPCC report. Those models project that the dry season in southern Amazonia will lengthen by no more than 10 days by the end of the century, even under scenarios that predict dramatic increases in greenhouse gas levels. By those calculations, there should be a low risk of rainforest dieback caused by climate change.

“The researchers say the most likely explanation for the lengthening dry season in the southern Amazon in recent decades is human-caused greenhouse warming, which inhibits rainfall in two ways,” the university explained. “First, it makes it harder for warm, dry air near the surface to rise and freely mix with cool, moist air above. And second, it blocks cold front incursions from outside the tropics that could trigger rainfall.”

According to Fu and her associates, the climate models used by the IPCC do not do an adequate job representing the processes that result in prolonged dry seasons, potentially explaining why they foresee just a slight increase in those periods.

Typically, the rainforest removes carbon dioxide from the atmosphere, but during a severe drought eight years ago, it released a significant amount of the greenhouse gas, and the researchers believe that the phenomenon could occur again if the annual dry season continues to grow longer.

In fact, “if dry seasons continue to lengthen at just half the rate of recent decades, the Amazon drought of 2005 could become the norm rather than the exception by the end of this century,” the university said.
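As a purely illustrative restatement of those numbers (not the study’s model output), the trend arithmetic looks like this:

# Rough illustration of the dry-season trend described above. The rates and
# time horizon restate the article's figures; this is not the study's model.

OBSERVED_RATE_DAYS_PER_DECADE = 7.0               # "about one week per decade" since 1979
HALVED_RATE = OBSERVED_RATE_DAYS_PER_DECADE / 2.0

decades_observed = (2013 - 1979) / 10.0           # roughly 3.4 decades of records
decades_to_2100 = (2100 - 2013) / 10.0            # roughly 8.7 decades remaining

print(decades_observed * OBSERVED_RATE_DAYS_PER_DECADE)  # ~24 days of lengthening already observed
print(decades_to_2100 * HALVED_RATE)                      # ~30 more days even at half the observed rate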

Fu added, “Because of the potential impact on the global carbon cycle, we need to better understand the changes of the dry season over southern Amazonia.”

Researchers from Duke University, the Universidad de Antioquia in Colombia, Columbia University’s International Research Institute for Climate and Society, the National Oceanic & Atmospheric Administration (NOAA), the National Center for Atmospheric Research (NCAR) and Boston University were also involved in the study. Their work was sponsored by the National Science Foundation (NSF) and the NOAA Climate Program Office Modeling, Analysis, Prediction and Projection Program.

Image Below: During the 2005 and 2010 droughts, satellites detected decreased vegetation greenness — or a lower Normalized Vegetation Index (NDVI) — over the southern Amazon rainforest (orange and red regions). NDVI is derived from MODIS instruments on NASA’s Terra and Aqua satellites. Credit: Image courtesy of Ranga Myneni, Jian Bi and NASA.

Unbounded Robotics Introduces One-Armed, Open-Source Robot

[ Watch the Video: Open-Source Robot From Unbounded Robotics ]
Brett Smith for redOrbit.com – Your Universe Online
When robotics research lab Willow Garage began losing employees to video-conferencing company Suitable Technologies in August, tech observers lowered their expectations for an open source robot coming to market anytime soon.
However, a Willow Garage spinoff called Unbounded Robotics may prove any doubters wrong as the fledgling robotics developer announced Sunday it will begin shipping a one-armed, open source robot called UBR-1 in mid-2014.
“With decades of robotic hardware and software experience, we have developed a mobile manipulation platform that offers advanced software and a sophisticated hardware exterior,” reads a statement on the Unbounded Robotics website. “The robot offers mobility, dexterity, manipulation, and navigation in a human-scale, ADA-compliant model.”
The robot is expected to cost $35,000, which is 10 percent of the cost of Willow Garage’s PR2 robot. The UBR-1 comes with some upgrades on the capabilities of the PR2, which debuted in 2010.
“As Willow Garage alumni, we realize that UBR-1 will undoubtedly be compared to the PR2 robot from Willow Garage,” the Unbounded Robotics team continued. “The comparison is logical in some ways.”
“While UBR-1 is not specifically designed as the heir apparent for the PR2, we take pride in the comparison,” the company added. “UBR-1 offers a far more sophisticated platform than the PR2, however, which was originally designed more than five years ago. At a list price of $35,000, UBR-1 is also approximately one-tenth the cost of the PR2. UBR-1 is also capable of being deployed in business automation scenarios.”
The humanoid robot is equipped with a 3D sensor on its head, stereo microphones, a stereo speaker, an arm with seven degrees of freedom, a lift that raises the bot 14 inches, two kinds of gripping capability, as well as a differential drive and a two-dimensional laser scanner that enable it to wheel around and ‘see’ where it’s going.
The company is emphasizing how UBR-1’s mobility and ability to navigate its environment give it an edge over some existing commercial competitors. The UBR-1 runs on ROS, the open source Robot Operating System, which opens the door to potential uses by enterprising developers, the company said.
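As a hint of what that openness looks like in practice, below is a minimal ROS 1 (rospy) node of the sort the ROS community typically writes for mobile bases. The /cmd_vel topic, node name and speed are illustrative assumptions, not documented UBR-1 interfaces.

#!/usr/bin/env python
# Minimal rospy sketch: publish a slow forward-drive command to a mobile base.
# The /cmd_vel topic and node name are assumptions typical of ROS mobile
# bases, not documented UBR-1 interfaces.
import rospy
from geometry_msgs.msg import Twist

def drive_forward():
    rospy.init_node('ubr1_demo_driver')
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10)
    rate = rospy.Rate(10)  # publish at 10 Hz
    cmd = Twist()
    cmd.linear.x = 0.1     # 0.1 m/s forward; all other velocity components stay zero
    while not rospy.is_shutdown():
        pub.publish(cmd)
        rate.sleep()

if __name__ == '__main__':
    try:
        drive_forward()
    except rospy.ROSInterruptException:
        pass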
“As a platform for robotics, we are looking forward to seeing how UBR-1 is put to use in both R&D and commercial markets,” Unbounded’s statement continued. “Similar to an iPhone without any third-party apps, the greatest contribution of UBR-1 will be the output from the robotics community that is able to take advantage of this sophisticated mobile manipulation platform.”
“Unbounded’s UBR-1 is the natural heir apparent to the PR2 community, but at one-tenth the cost I anticipate strong uptake in the research and academic communities,” the startup told TechCrunch. “At the same time the UBR-1 robot is also capable of commercial deployments similar to Baxter, but with advanced navigation capabilities. Finally, it’s a great addition to the growing ROS community.
“Commercially, Baxter comes closest to competition. But Baxter works great when the robot doesn’t need to be mobile. Unbounded’s robot is able to move and navigate its environment,” it added.

Spanking Your Five Year Old May Result In More Aggressive Behaviors

Michael Harper for redOrbit.com – Your Universe Online

A new study conducted at Columbia University in New York finds five-year-old children who have been spanked are more likely to be aggressive and get into even more trouble. This research echoes prior studies which have found spanking to have quite the opposite of its intended effect on children.

Used as a disciplinary tool for many generations, spanking has only recently been questioned as an effective means of punishment. The Columbia study analyzed data from a long-term study which followed children born in one of 20 US cities between 1998 and 2000.

According to their research, children who were spanked at age five were more likely to become rule breakers in school and act aggressively towards their teachers and peers. The data revealed, however, that parents were more likely to spank their children when they were three years old rather than when they were five. Researchers also noted a difference in the child’s behavior depending on which parent was administering the spanking. The paper is now published in the journal Pediatrics.

“Most kids experience spanking at least some point in time,” explained lead author Michael MacKenzie with Columbia University in an interview with Reuters‘ Genevra Pittman.

After examining the data collected from nearly 2,000 children, MacKenzie and team say 57 percent of mothers and 40 percent of fathers spanked their child when they were three years old. This number dropped slightly by the time the child had grown to five years old; 52 percent of mothers and 33 percent of fathers spanked their kid at this age. Though there had been an average decrease in spankings, the behavior problems in these children were more pronounced when they were spanked at age five. Researchers did not find a direct link between behavioral problems and spanking at age three. Furthermore, these children were even more likely to act out if they had been spanked by their mother, even if it was only occasionally.

According to the research, mothers who spanked their children at least twice a week increased their child’s problem behaviors by two percent. This kind of behavior was measured on a 70-point scale and took family lifestyle and past behaviors into account.

Kids who were regularly spanked at age five by their fathers were also more likely to score lower in vocabulary tests, says MacKenzie’s research. The average vocabulary test score of nine-year-old children is 93. When the child was regularly spanked by their father at age five, however, these scores went down an average of four points. MacKenzie admits this decrease could be due to chance instead of a direct link to spanking.

Parents often spank their children as a way to get their attention and change their behavior quickly in the moment. While this is often effective in the short-term, MacKenzie says parents don’t think of the long-term ramifications of spanking.

“The techniques that are designed to promote positive behaviors … oftentimes take more effort and time to put into place,” said MacKenzie.

A 2005 study found that children who had been spanked, particularly children who lived in a culture where spanking was not widely embraced, were more likely to act out aggressively and display signs of anxiety. This aggression was not observed in children who were spanked in a culture that embraces the punishment.