Acid Reflux, Functional Dyspepsia Have Significant Impact On Disordered Sleep

Baclofen, esomeprazole investigated as treatments

The impact of upper GI conditions, like gastroesophageal reflux disease (GERD) and functional dyspepsia, on sleep, and treatments aimed at providing relief to heartburn/acid reflux patients who suffer from disordered sleep, were explored in three new studies related to sleep dysfunction presented today at the American College of Gastroenterology’s (ACG) 75th Annual Scientific Meeting in San Antonio, Texas.

Functional dyspepsia is a common, but poorly understood, upper GI condition affecting approximately 10 percent of U.S. adults. The condition is described as chronic abdominal pain and a sensation of fullness, pressure or discomfort in the upper abdomen. This sensation is associated with eating as symptoms usually worsen after meals.

While the prevalence of disordered sleep in patients with functional dyspepsia is unknown, a new study unveiled today found that disordered sleep is significantly more common in functional dyspepsia patients than in healthy controls.

Patients with functional dyspepsia were 3.25 times more likely to have disordered sleep compared to healthy controls, according to the study, “Functional Dyspepsia: A Risk Factor for Disordered Sleep,” which also found that women with functional dyspepsia were 2.3 times more likely to have disordered sleep than men with the same condition. While gender tended to be associated with disordered sleep, age, tobacco use and alcohol use were not factors. The study also found that mental and physical factors were related to disordered sleep in patients with functional dyspepsia.

Routine exercise, for instance, appeared to decrease the likelihood of a patient suffering from sleep disorders. Functional dyspepsia patients also had higher scores for anxiety and depression, according to the study, suggesting that depression may be a contributing factor to functional dyspepsia symptom generation.

“Fatigue changes the sensation for pain,” said Brian Lacy, M.D., Ph.D., associate professor of medicine at Dartmouth Medical School, who presented the results of the study. “The key finding here is that disordered sleep may affect nerve function in the upper GI tract, which could lead to worsening dyspepsia, creating a vicious cycle leading to more pain and more insomnia,” said Dr. Lacy. He added that “future clinical trials for functional dyspepsia should include validated measures of sleep, as improvements in functional dyspepsia symptoms may be mirrored by improvements in sleep.”

Esomeprazole Reverses Driving Impairment in GERD Induced Sleep Disorders

GERD-induced sleep dysfunction has a previously unrecognized and significantly adverse effect on simulated driving performance, which improved with esomeprazole, according to the results of another study, “GERD-Induced Sleep Disorders and a Reversible Driving Impairment with Esomeprazole: A Prospective Pilot Study.”

Dr. David A. Johnson, Chief of Gastroenterology and Professor of Medicine at Eastern Virginia Medical School in Norfolk, Va., presented the findings from this prospective pilot study that evaluated 11 healthy patients with well-established GERD with nocturnal symptoms.

Testing was done in a validated commercial driving simulator that responds to driver inputs (steering, throttle, brake) and generates realistic roadway images. Driving performance, measured as the standard deviation of lane position (SDLP), was compared across six consecutive 10-minute driving periods while subjects were on and off the drug.
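
For readers unfamiliar with the metric, the short sketch below illustrates how an SDLP value could be computed for each driving period from simulator output; the 10 Hz sampling rate, centimeter units, and variable names are illustrative assumptions, not details reported in the study.

```python
import numpy as np

def sdlp_per_period(lateral_position_cm, samples_per_second=10,
                    period_minutes=10, n_periods=6):
    """Illustrative SDLP calculation: split a run of lateral-position samples
    (deviation from lane center, in cm) into consecutive periods and return
    the standard deviation of lane position for each. Higher SDLP = more weaving."""
    samples_per_period = samples_per_second * 60 * period_minutes
    x = np.asarray(lateral_position_cm, dtype=float)
    sdlp = []
    for i in range(n_periods):
        chunk = x[i * samples_per_period:(i + 1) * samples_per_period]
        if chunk.size < 2:
            break
        sdlp.append(chunk.std(ddof=1))  # sample standard deviation for this 10-minute period
    return sdlp

# Example: a simulated 60-minute drive in which lane-keeping worsens over time,
# mimicking the reported increase in SDLP across consecutive periods.
rng = np.random.default_rng(0)
drive = np.concatenate([rng.normal(0, 20 + 5 * i, 6000) for i in range(6)])
print([round(s, 1) for s in sdlp_per_period(drive)])
```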

According to the study, SDLP increased over time (p=0.0002) and improved with esomeprazole. Patients on esomeprazole had an overall average 62.5 percent decrease in the number of sleep-disordered nights vs. a 9.5 percent decrease without the drug. The Epworth Sleepiness Scale, used to determine the level of daytime sleepiness, decreased to 5.9 and 3.5 from 7.9 and 2.5, and the GERD symptom score decreased from 2.10 to 0.33.

“The improved ESS score suggests that reduced sleepiness contributed to improved performance,” said Dr. Johnson. “We know that GERD impairs sleep quality and next day function as measured by quality of life and work productivity assessments. Furthermore, sleep dysfunction (such as sleep apnea) has been linked to impaired psychomotor function including worsening driving simulator performance.

“Therefore, appropriate treatment for patients with GERD and nocturnal symptoms may have potentially new and life-saving implications.”

Dr. Johnson also noted that further prospective blinded controlled trials are warranted to validate these findings.

Baclofen Decreases Reflux, Improving Sleep Quality for Nighttime Heartburn Sufferers

Nighttime heartburn sufferers also may get relief, and better sleep quality, from the muscle relaxant and antispastic drug baclofen, according to results of another new study unveiled today, “Baclofen Decreases Reflux and Improves Sleep Quality in Individuals with Nighttime Heartburn.”

While baclofen has been shown to reduce episodes of GERD, this new study found that in addition to reducing the number of reflux events during sleep, baclofen significantly improved several measures of sleep in patients with documented GERD and sleep disturbances.

“About 70 percent of individuals who have GERD also suffer from nighttime heartburn, and 40 percent of those people say they experience disturbed sleep at night,” said study co-author Dr. William Orr, president and CEO of the Lynn Health Science Institute and a clinical professor of medicine at the University of Oklahoma Health Sciences Center. “They don’t feel good the next day and they don’t perform as well.”

Approved by the FDA in 1977, baclofen is typically used by neurologists to treat uncontrolled movements, such as shakes and tremors. The drug inhibits nerve activity within the part of the brain that controls the contraction and relaxation of skeletal muscles.

“In this study, we found that baclofen significantly reduces the amount of waking which occurs after the onset of sleep,” said Dr. Orr. “Baclofen addresses the physiological causes of reflux by preventing the relaxation of the lower esophageal sphincter and preventing stomach acid from entering the esophagus. Few drugs inhibit the occurrence of reflux, and 40 to 50 percent of those taking PPIs don’t get satisfactory relief, especially at nighttime.”

Baclofen reduced the number of reflux events compared to a placebo (4 events vs. 1.3). Patients on baclofen also had more sleep time (434 minutes vs. 379 minutes) and greater sleep efficiency (91 percent vs. 79 percent), according to the study. “The results of this study suggest that baclofen could be a useful adjunct therapy to proton pump inhibitors in patients with nighttime heartburn and sleep disturbance,” said Dr. Orr.

Yale University Researchers Find Key Genetic Trigger Of Depression

Yale University researchers have found a gene that seems to be a key contributor to the onset of depression and is a promising target for a new class of antidepressants, they report Oct. 17 in the journal Nature Medicine.

“This could be a primary cause, or at least a major contributing factor, to the signaling abnormalities that lead to depression,” said Ronald S. Duman, professor of psychiatry and pharmacology at Yale and senior author of the study.

Scientists have had a difficult time pinning down the cause of depression, which afflicts almost 16 percent of Americans in any given year and carries an annual economic burden of $100 billion.

Symptoms of depression vary widely among individuals. Most researchers now believe that multiple physiological processes are involved in major depressive disorder. That explains why people respond differently to the most commonly prescribed antidepressants, which work by manipulating the uptake of the neurotransmitter serotonin. However, as many as 40 percent of depressed patients do not respond to currently available medications, which take weeks to months to produce a therapeutic response.

Duman’s team did whole genome scans on tissue samples from 21 deceased individuals who had been diagnosed with depression and compared gene expression levels to those of 18 individuals who had not been diagnosed with depression. They found that one gene called MKP-1 was increased more than two-fold in the brain tissues of depressed individuals.

This was particularly exciting, say the researchers, because the gene inactivates a molecular pathway crucial to the survival and function of neurons and its impairment has been implicated in depression as well as other disorders. Duman’s team also found that when the MKP-1 gene is knocked out in mice, the mice become resilient to stress. When the gene is activated, mice exhibit symptoms that mimic depression.

The finding that a negative regulator of a key neuronal signaling pathway is increased in depression also identifies MKP-1 as a potential target for a novel class of therapeutic agents, particularly for treatment resistant depression.

Project Guides European Farmers From Space

Farmers traditionally keep a close eye on their fields, but a new ESA-led project seeks to build on their vigilance with monitoring from space.

The TalkingFields initiative is now showing how to combine satellite observation with satellite navigation to benefit European farmers.

Sustainable food production and food security are critical challenges. TalkingFields will help by using precision farming methods to produce crops more efficiently. For instance, by optimizing farmers’ use of fertilizer and giving early warning of plant disease risks, both costs and environmental impacts can be reduced.

“There are existing services variously employing Earth observation data, satellite navigation, farm management software and crop growth models, but TalkingFields is the first to combine them all,” said ESA’s Tony Sephton.

“We’re setting up an end-to-end service that is simple to use and sufficiently cost-effective to be self-sustaining.”

How does it work? The farmer requests the service for an area defined using satnav. Satellites gather information on the land’s potential (observations over several years can reveal variations in crop growth through soil changes) as well as its current crop status.

These results are combined with information from field sensors such as weather conditions and soil moisture. The farmer adds in his own knowledge, and in return receives detailed satnav instructions on where and how much fertilizer to spray, for example.
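
As a rough illustration of this kind of data fusion (a minimal sketch, not the actual TalkingFields software, whose models and data formats are not described in the article), the snippet below combines a multi-year satellite-derived growth-potential index, the latest crop-status observation, and a field-sensor soil-moisture reading into a simple per-zone fertilizer recommendation; all names, thresholds, and weighting factors are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ZoneData:
    """Inputs for one management zone within a field (all values illustrative)."""
    zone_id: str
    biomass_index: float   # relative growth potential from multi-year satellite imagery (0-1)
    current_ndvi: float    # current crop status from the latest satellite image (0-1)
    soil_moisture: float   # field-sensor reading, fraction of field capacity (0-1)

def fertilizer_recommendation(zone: ZoneData, target_rate_kg_ha: float) -> float:
    """Toy recommendation: scale the farmer's target nitrogen rate by the zone's
    long-term potential, reduce it where the crop is already vigorous, and hold
    back when the soil is too dry for efficient uptake."""
    rate = target_rate_kg_ha * zone.biomass_index       # weaker zones get proportionally less
    rate *= max(0.2, 1.0 - zone.current_ndvi * 0.5)     # already-vigorous crop needs less now
    if zone.soil_moisture < 0.3:                        # too dry: postpone most of the dose
        rate *= 0.5
    return round(rate, 1)

# Example: two zones of the same field, one historically strong and one weak.
zones = [ZoneData("A1", 0.9, 0.7, 0.6), ZoneData("A2", 0.5, 0.4, 0.25)]
for z in zones:
    print(z.zone_id, fertilizer_recommendation(z, target_rate_kg_ha=120.0), "kg N/ha")
```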

A variety of satellites can be employed, although priority will be given to free data sources such as Landsat and ESA’s forthcoming Sentinel-2 satellites, due for launch in 2012.

“Ideally, we might have weekly satellite acquisitions, but cloud cover makes that unfeasible,” explained Dr Sephton.

“Instead, we need only two to four satellite images per growing season, which are fed into a sophisticated crop growth model.

“With TalkingFields the emphasis is on service: we’re not giving raw satellite data straight to farmers. Instead, we advise them directly on actions to be taken throughout the growing season.”

Following a 2009 feasibility study, TalkingFields is now being demonstrated in real fields, led for ESA by German Earth observation company VISTA with partners PC-Agrar, a German company specializing in providing farm management information software, and Ludwig Maximillians University Munich, which developed the hydrological and agricultural production model.

Farmers access TalkingFields via familiar farm management systems. “The quality of farming advice improves dramatically when all the available information is used,” said Heike Bach of VISTA.

“Factors like crop variety, seeding date, row distance and fertilization measures conducted so far are stored in the farm management system.

“Since TalkingFields is integrated with this software, we also have access to this information, improving our crop growth models.”

Large intensive farms across Germany and Russia are participating in the demonstration. Customers can choose from a portfolio of services, such as estimating a crop’s yield some two to four weeks before harvest.

Even before a farmer decides to use precision farming, he can obtain a detailed cost-benefit analysis for each field. Daily information on biomass and density will help to protect crops by revealing the onset of plant disease.

TalkingFields is being supported through the Integrated Applications Promotion (IAP) program of ESA’s Telecommunications and Integrated Applications Directorate.

IAP builds services for new groups of users by combining different space and terrestrial systems in novel ways.

Image 1: TalkingFields is an initiative of ESA’s Integrated Application Promotion program, aimed at producing an end-to-end precision farming information service. The farmer requests advice for an area defined using satnav. Satellites gather information on the land’s potential (observations over several years can reveal variations in crop growth through soil changes) as well as current crop status. These results are combined with information from field sensors such as weather conditions and soil moisture. The farmer adds in his own knowledge, and in return receives detailed satnav instructions on where and how much fertilizer to spray, for example. In principle these instructions can be fed straight into tractors or other farm equipment. Credits: TalkingFields

Image 2: Sustainable food production and food security are critical challenges. An initiative of ESA’s Integrated Application Promotion program, TalkingFields will help by using space-based precision farming methods to produce crops more efficiently. For instance, by optimizing farmers’ use of fertilizer and giving early warning of plant disease risks not just on a field-by-field basis but within individual fields, both costs and environmental impacts can be reduced. Credits: TalkingFields

New CPR Guidelines Emphasize Chest Compressions

The American Heart Association released new cardiopulmonary resuscitation (CPR) guidelines on Monday, placing a greater emphasis on the use of chest compressions and advising would-be rescuers to not stop and listen for breathing before beginning the procedure.

The new guidelines, which have been published in Circulation: Journal of the American Heart Association, move away from the organization’s long-utilized “ABCs of CPR” method. The “ABC” method, which stood for Airway-Breathing-Compressions, should now be replaced with the “CAB” method, which places chest compressions before the airway and breathing steps.

“For more than 40 years, CPR training has emphasized the ABCs of CPR, which instructed people to open a victim’s airway by tilting their head back, pinching the nose and breathing into the victim’s mouth, and only then giving chest compressions,” Michael Sayre, co-author of the guidelines and the chairman of the American Heart Association’s Emergency Cardiovascular Care (ECC) Committee, said in a statement on Monday.

“This approach was causing significant delays in starting chest compressions, which are essential for keeping oxygen-rich blood circulating through the body. Changing the sequence from A-B-C to C-A-B for adults and children allows all rescuers to begin chest compressions right away,” Sayre added.

Furthermore, the new CPR guidelines suggest that lay or professional rescuers should begin compressions immediately on anyone–except newborns–who is not responsive and/or not breathing normally. They also recommend that the rate of chest compressions be increased to 100 times per minute and that compressions be deeper, with a compression depth of 1.5 inches for infants and at least two inches for children and adults. They also advise rescuers to avoid leaning on the chest in between compressions, in order to allow the chest to return to its normal position.

“In the first few minutes of a cardiac arrest, victims will have oxygen remaining in their lungs and bloodstream, so starting CPR with chest compressions can pump that blood to the victim’s brain and heart sooner,” the American Heart Association said in the October 18 press release. “Research shows that rescuers who started CPR with opening the airway took 30 critical seconds longer to begin chest compressions than rescuers who began CPR with chest compressions.”

Earlier this month, a University of Arizona study found that hands-only CPR could be more effective than mouth-to-mouth resuscitation. The researchers looked at more than 4,400 instances of adult cardiac arrest that occurred in non-hospital settings between 2005 and 2009, and discovered that 13% of those who received hands-only resuscitation survived, compared to 8% who were given conventional CPR. The Arizona study was published in Journal of the American Medical Association (JAMA).

Kilimanjaro Climb Deadly For The Unprepared

Following the successful ascent of Mount Kilimanjaro by nine UK celebrities during a 2009 charity event, more and more people are feeling compelled to challenge Africa’s highest mountain–a decision which, according to a new University of Edinburgh study, could be fatal if they don’t prepare correctly.

According to BBC News, travel agencies have reported an increase in bookings from those looking to scale the over 19,000-foot Tanzanian peak following the March 2009 climb successfully completed during the Comic Relief charity event. That climb saw celebrities Gary Barlow, Ronan Keating, Chris Moyles, Ben Shephard, Cheryl Cole, Kimberley Walsh, Denise Van Outen, Fearne Cotton, and Alesha Dixon reach the summit, apparently inspiring many others to attempt to duplicate the feat.

However, Edinburgh scientist Stewart Jackson warns that many of those looking to climb Kilimanjaro “know little or nothing” about the dangers of high-altitude climbing. In the study, Jackson, his colleagues, and more than 200 would-be climbers camped out on the African mountain for three weeks. The team ascended to a height of approximately 15,500 feet and used the Lake Louise Consensus guide to diagnose the different symptoms of altitude illness in the subjects.

As it turns out, nearly half of those who participated (47%) began showing signs of altitude sickness, including vomiting, headaches, fatigue, coordination problems, breathing problems and insomnia. Furthermore, Jackson and his colleagues discerned that the climbers were ascending too high, too quickly–not giving their bodies enough time to acclimate to the conditions.

“We found that many climbers knew little or nothing about altitude sickness and did not have previous experience of being at high altitude,” Jackson told the BBC, adding that he believed that these findings emphasized “the need to increase awareness of the risks of altitude sickness and the importance of taking your time to acclimatize.”

Jackson, who published his findings in the journal High Altitude Medicine and Biology, said that undergoing a climbing expedition on a smaller mountain before attempting to scale Kilimanjaro “offers climbers the best chance of a safe, successful summit.” Otherwise, individuals can face such health concerns as fluid buildup on the lungs or brain–known respectively as high altitude pulmonary edema (HAPE) and high altitude cerebral edema (HACE)–due to the lack of oxygen.

UN Biodiversity Conference Begins With Call To Action

Delegates from the United Nations (UN) met to discuss ways to protect plant and animal life as a 12-day international conference on biodiversity kicked off Monday in Nagoya, Japan.

The meetings feature over 8,000 representatives from all 193 member nations of the UN Convention on Biological Diversity (CBD), and according to Associated Press (AP) writer Malcom Foster, they are attempting to iron out a list of 20 specific target goals to achieve over the next decade.

Ahmed Djoghlaf, executive secretary of the CBD, opened the event by calling the summit a “defining moment” for mankind, according to the AFP’s Kyoko Hasegawa–especially in light of past failures to live up to promises of protecting biodiversity.

“Let’s have the courage to look into the eyes of our children and admit that we have failed individually and collectively to… to substantially reduce the rate of loss of biodiversity by 2010,” he said. “Let us look into the eyes of our children and admit that we continue to lose biodiversity at unprecedented rates.”

“The time to act is now and the place to act is here,” Djoghlaf added. “Business as usual is no more an option when it comes to life on Earth… we need a new approach, we need to reconnect with nature and live in harmony with nature.”

According to Hasegawa, the International Union for Conservation of Nature (IUCN) reports that approximately 25% of mammals, 33% of amphibians, 12% of birds, and 20% of plant species currently face the threat of extinction. Furthermore, the AFP reporter says that international conservation group WWF states that people are currently living at a level exceeding the Earth’s biocapacity by more than 50 percent, and that “by 2030 humans will effectively need the capacity of two Earths.”

Likewise, Foster reports that scientists estimate that our planet’s species are dying out at a rate somewhere between 100 and 1,000 times the historical average, and that Harvard University biologist E.O. Wilson and other experts assert that the Earth is experiencing “a man-made environmental crisis” which is currently leading the planet en route to “its sixth big extinction phase, the greatest since the dinosaurs were wiped out 65 million years ago.”

In addition to setting tangible goals to protect plant and animal life and to save habitats from deforestation and pollution, the delegates will also discuss a proposed Access and Benefits Sharing Protocol (ABS) plan that would reimburse countries for medically-useful natural resources discovered within their borders. The ABS plan would require scientists to pay a so-called “gene fee” for any plant or animal life for which a medical or scientific use is discovered.

Babies Treat ‘Social Robots’ As Sentient Beings

Babies are curious about nearly everything, and they’re especially interested in what their adult companions are doing. Touch your tummy, they’ll touch their own tummies. Wave your hands in the air, they’ll wave their own hands. Turn your head to look at a toy, they’ll follow your eyes to see what’s so exciting.

Curiosity drives their learning. At 18 months old, babies are intensely curious about what makes humans tick. A team of University of Washington researchers is studying how infants tell which entities are “psychological agents” that can think and feel.

Research published in the October/November issue of Neural Networks provides a clue as to how babies decide whether a new object, such as a robot, is a sentient being or an inanimate object. Babies who watched a robot interact socially with people were four times as likely to be willing to learn from the robot as babies who did not see the interactions.

“Babies learn best through social interactions, but what makes something ‘social’ for a baby?” said Andrew Meltzoff, lead author of the paper and co-director of the UW’s Institute for Learning and Brain Sciences. “It is not just what something looks like, but how it moves and interacts with others that gives it special meaning to the baby.”

The UW researchers hypothesized that babies would be more likely to view the robot as a psychological being if they saw other friendly human beings socially interacting with it. “Babies look to us for guidance in how to interpret things, and if we treat something as a psychological agent, they will, too,” Meltzoff said. “Even more remarkably, they will learn from it, because social interaction unlocks the key to early learning.”

During the experiment, an 18-month-old baby sat on its parent’s lap facing Rechele Brooks, a UW research assistant professor and a co-author of the study. Sixty-four babies participated in the study, and they were tested individually. They played with toys for a few minutes, getting used to the experimental setting. Once the babies were comfortable, Brooks removed a barrier that had hidden a metallic humanoid robot with arms, legs, a torso and a cube-shaped head containing camera lenses for eyes. The robot, controlled by a researcher hidden from the baby, waved, and Brooks said, “Oh, hi! That’s our robot!”

Following a script, Brooks asked the robot, named Morphy, if it wanted to play, and then led it through a game. She would ask, “Where is your tummy?” and “Where is your head?” and the robot pointed to its torso and its head. Then Brooks demonstrated arm movements and Morphy imitated. The babies looked back and forth as if at a ping pong match, Brooks said.

At the end of the 90-second script, Brooks excused herself from the room. The researchers then measured whether the baby thought the robot was more than its metal parts.

The robot beeped and shifted its head slightly, enough of a rousing to capture the babies’ attention. The robot turned its head to look at a toy next to the table where the baby sat on the parent’s lap. Most babies who had watched the robot play with Brooks (13 out of 16) followed the robot’s gaze. In a control group of babies who had been familiarized with the robot but had not seen Morphy engage in games, only three of 16 turned to where the robot was looking.

“We are using modern technology to explore an age-old question about the essence of being human,” said Meltzoff, who holds the Job and Gertrud Tamaki Endowed Chair in psychology at the UW. “The babies are telling us that communication with other people is a fundamental feature of being human.”

The study has implications for humanoid robots, said co-author Rajesh Rao, UW associate professor of computer science and engineering and head of UW’s neural systems laboratory. Rao’s team helped design the computer programs that made Morphy appear social. “The study suggests that if you want to build a companion robot, it is not sufficient to make it look human,” said Rao. “The robot must also be able to interact socially with humans, an interesting challenge for robotics.”

The study was funded by the Office of Naval Research and the National Science Foundation. Aaron Shon, who graduated from UW with a doctorate in computer science and engineering, is also a co-author on the paper.

Image Caption: Andrew Meltzoff, co-director of the University of Washington’s Institute for Learning and Brain Sciences, and Rajesh Rao, University of Washington associate professor of computer science and engineering, with the humanoid robot used to demonstrate “social” interactions to babies. Credit: University of Washington

Campaign To Raise Funds For Analytical Engine Underway

An online campaign to build the analytical engine–a massive device similar in design to a modern day computer that was first described by British mathematician Charles Babbage in the late 1830s–has drawn donations from over 1,600 people, according to BBC News reports.

The campaign, which was the brainchild of computer programmer and author John Graham-Cumming, is seeking donations from 50,000 people. If the goal is met, Graham-Cumming has vowed to build the engine, which pre-dated the first completed general-purpose computer by approximately 100 years.

Babbage was never able to complete the analytical engine before his 1871 death. Others, including Babbage’s son, Henry Prevost Babbage, have successfully completed parts of the machine, but a full, working model has never been completed.

“It’s an inspirational piece of equipment,” Graham-Cumming, Vice-President of Engineering for the software company Causata and author of a travel book for scientists called The Geek Atlas, told BBC Technology Reporter Jonathan Fildes on Thursday. “A hundred years ago, before computers were available, [Babbage] had envisaged this machine.”

“What you realize when you read Babbage’s papers is that this was the first real computer,” he added. “It had expandable memory, a CPU, microcode, a printer, a plotter and was programmable with punch cards… It was the size of a small lorry and powered by steam but it was recognizable as a computer.”

If the fundraising drive is successful, Graham-Cumming will use a design known as Plan 28 in order to complete the analytical engine. To do so, Babbage’s papers and designs, which are currently housed at the London Science Museum, would first have to be digitized. Then, he would need to develop a 3D computer simulation of the device. The actual, finished product would be “bigger than a steam locomotive,” Graham-Cumming told Fildes.

Image Caption: Trial model of a part of the Analytical Engine, built by Babbage, as displayed at the Science Museum (London). Credit: Bruno Barral – Wikipedia

Next-Gen Liquid Scanner Undergoes Airport Testing

Homeland security officials demonstrated a “potential next-generation” liquid and gel scanner on Wednesday that could allow airline passengers to once again bring items like shampoo bottles and carbonated beverages with them during their flights.

Developed at the Los Alamos National Laboratory (LANL), the Magnetic Vision Bottled Liquid Scanner, or MagViz BLS, was put to the test at the Albuquerque International Sunport Wednesday morning. According to a LANL press release, the device employs “ultra-low-field nuclear magnetic resonance technology for the checkpoint detection of liquid explosives.”

Associated Press (AP) writer Susan Montoya Bryan, who was in attendance at the event, said that “everything from bottled water and champagne to shampoo and pink liquid laxatives were scanned to make sure explosives weren’t hiding inside.”

“The device, about the size of a small refrigerator, uses magnetic resonance to read the liquids’ molecular makeup, even when the substances are in metal containers,” she added. “Within 15 seconds, a light on top of the simple-looking metal box flashes red or green, depending on whether there’s danger… The device is so sensitive it can tell the difference between red and white wine, and between different types of soda.”

The MagViz Bottle Liquid Scanner “distinguishes potential-threat liquids from the harmless shampoos and sodas a regular traveler might take aboard an aircraft,” the LANL said in a statement, adding that nearly all liquids are currently limited to one container per person per item, with containers no larger than 3.4 ounces in size.

According to Bryan, the device is still several years away from actually being utilized in American airports. Homeland Security Advanced Research Projects Agency (HSARPA) Program Manager Stephen Surko told the AP that the LANL must still reach a deal with a manufacturer for the MagViz, and will then have to submit the machines for testing and certification.

US Has Most Botnet-Infected PCs

A report has found that the U.S. is the country with the most botnet-infected PCs.

The study, which was performed by Microsoft, said over 2.2 million PCs in the U.S. were found to be part of botnets in the first six months of 2010. Brazil came in second place with 550,000 botnet-infected PCs.

The research found that 14.6 out of every 100 machines in South Korea were part of botnets.

Cliff Evans, head of security and identity at Microsoft U.K., said the research was done to alert people of the growing threat of botnets.

“Most people have this idea of a virus and how it used to announce itself,” he told BBC. “Few people know about botnets.”

Cyber criminals use botnets in order to find information that can be sold on underground auction sites and markets found online.

“Once they have control of the machine they have the potential to put any kind of malicious code on there,” said Evans. “It becomes a distributed computing resource they then sell on to others.”

The study found that a botnet known as Lethic sent out 56 percent of all botnet spam between March and June, even though it accounted for only 8.3 percent of all known botnet IP addresses.

“It’s phenomenal the amount of grip that thing has,” said Mr Evans.

In three months between April and June 2010, Microsoft cleaned up over 6.5 million infections, which is twice as much as the same time period in 2009.

Some 600 million PCs that use Microsoft’s various update services, or its Essentials and Defender security packages, were enrolled in the study.

Evans told BBC that defending against malware was a straightforward battle.

He said people should use automatic updates, as well as make sure they regularly use anti-virus software and run a firewall.

Microsoft has just issued its largest list of fixes for flaws in Windows, Internet Explorer and a range of other software.

The update includes security patches for 49 vulnerabilities.

“With the significant number of holes identified on the same day, businesses will be racing against time to fix them all,” Alan Bentley, senior vice president at security firm Lumension, told BBC.

“Not only is this Microsoft’s largest patch load on record, but 23 of the vulnerabilities are rated at the most severe level,” he added.

Cyber Threats Very Real For Britain: Official

The head of Britain’s electronic spying agency warned Wednesday that the country is facing a “real and credible” threat of cyber attacks from hostile criminals abroad which could potentially damage its critical infrastructure.

Iain Lobban, director of the Government Communications Headquarters (GCHQ), said Britain’s infrastructure — such as emergency services and power grids — was at an increased risk as the rapid growth of the Internet made communications systems more vulnerable.

“We already provide expert advice and incident response to the operators of critical services,” he said. “We must continue to strengthen these capabilities and be swifter in our response, aiming to match the speed at which cyber events happen.”

Speaking at the International Institute for Strategic Studies in London, Lobban said he didn’t want to go into great detail about the threat to the UK’s “critical national infrastructure.” But he said the threat posed by terrorists, organized criminals and hostile foreign governments was “real and credible,” and he demanded a quicker response to match the speed with which cybercrime happened.

He warned that Britain’s economy could be at risk if effective protection measures against cyber attacks were not further developed.

Putting such protection in place would help “the UK’s continuing economic prosperity,” he said. “A knowledge economy needs to protect from exploitation the intellectual property at the heart of the creative and high-tech industry sectors.”

Lobban conceded his comments came as the coalition government prepares to give full details of sweeping cuts to defense and public sector spending next week. But, he argued, the cyber attack risk was not just a “national security or defense issue.”

“It goes right to the heart of our economic well-being and national interest.”

While GCHQ is more readily associated with electronic intelligence-gathering, Lobban stressed that it also has a role in security, referred to as “information assurance.” He said significant disruptions have already been caused in government systems by worms and viruses, though not all of them were deliberate.

Each month there were more than 20,000 malicious emails on government computer networks, of which 1,000 were deliberately targeted at them. He said that intellectual property theft was also taking place on a “massive scale.”

Lobban said that while 80 percent of the threats could be dealt with through good information assurance practice, the remaining 20 percent was more complex and could not simply be solved by building better security walls.

While cyberspace presents potential security risks to the UK, Lobban said that it also offered opportunities if the UK could get its defenses right.

“There’s a clear defensive angle. In order to flourish, a knowledge economy needs to protect from exploitation the intellectual property at the heart of the creative and high-tech industry sectors. It needs to maintain the integrity of its financial and commercial services,” Lobban told BBC News.

“There is an opportunity which we can seize if government and the telecommunications sector, hardware and software vendors, and managed service providers can come together.”

It provides an opportunity to develop a “holistic approach to cyber security that makes UK networks intrinsically resilient in the face of cyber threats,” he said.

“That will lead to a competitive advantage for the UK. We can give enterprises the confidence that by basing themselves here they gain the advantages of access to a modern internet infrastructure while reducing their risks.”

USDA Funds Study Of Psychology In School Lunch Lines

The U.S. Department of Agriculture announced on Tuesday a $2 million initiative to help food behavior scientists find new ways to use psychology to fight childhood obesity and improve the federal school lunch program.

“Findings from this emerging field of research, behavioral economics, could lead to significant improvements in the diets of millions of children across America,” said Agriculture Secretary Tom Vilsack in a statement.

“Across the nation, many schools are already taking steps to provide students with healthier meals and the nutrition knowledge to make healthier choices. However, it is well recognized that understanding the value of a healthy diet does not always translate into healthy choices.”

Previous studies have shown that even subtle changes, such as displaying fruits and vegetables in pretty baskets instead of steel bins, or having a cash-only policy for desserts, can help children make healthier school lunch choices. But more research is needed, Vilsack said.

“Research has shown that good intentions may not be enough: when choosing what or how much to eat, we may be unconsciously influenced by how offers are framed, by various incentives, and by such factors as visual cues.”

“The emerging field of behavioral economics draws on research from the fields of economics and social psychology to better understand behavior. This research can suggest practical, cost-effective ways that the school environment can better support healthful choices.”

About one-third of U.S. children are classified as overweight or obese. Bans on soda and junk food have backfired in some places, and some students have entirely abandoned school meal programs that tried to force them to consume healthier foods.

The Associated Press (AP) cited one school district that put fruit on every lunch tray, only to find most of it discarded in the trash.

These schools are now looking for a new approach, betting that children may be more likely to eat healthier foods if they feel it is something they have selected for themselves.

“It’s not nutrition till it’s eaten,” USDA researcher Joanne Guthrie told AP.

Part of the USDA’s initiative will establish a child nutrition center at Cornell University, which has long conducted this type of research.

Cornell researchers have already found that some measures, such as keeping ice cream in freezers without a glass display top, improve the food choices of children.  Other measures, such as moving salad bars next to checkout registers, also work well.

Last year, the USDA sought advice from the Institute of Medicine on how to improve its school lunch and breakfast programs, which provide free or subsidized meals each day to more than 31 million U.S. schoolchildren.

The Institute advised offering more fruit, vegetables and whole grains, while limiting fat, salt and calories.  However, it was clear this wouldn’t work without the children first accepting these healthier foods, Guthrie told AP.

“We can’t just say we’re going to change the menu and all of our problems will be solved,” she explained.

The USDA requested proposals from researchers on how to get kids to actually select and eat the healthier foods.

The agency selected Cornell scientists Brian Wansink and David Just, who will receive $1 million to establish the Center for Behavioral Economics in Child Nutrition Programs at Cornell University in Ithaca, New York.  

The remaining $1 million will fund 14 other research projects in Connecticut, Iowa, Louisiana, Minnesota, Oklahoma, Pennsylvania, South Carolina, Texas, Utah, West Virginia, and Wisconsin, the USDA said.

Cornell will focus on developing “smart lunchrooms” that guide children to make good food choices even when faced with more tempting ones.

“We’re not taking things away from kids,” said Wansink, a prominent food science researcher known for his studies on the depiction of food in paintings of the Last Supper.

“It’s making the better choice the easier, more convenient choice,” he told AP.

Christine Wallace, food service director for the Corning City School District near Cornell University, met Wansink a few years ago and invited him to use her fourteen schools as a laboratory.

“We tend to look at what we’re offering and to make sure it’s well prepared and in the correct portion size, and not the psychology of it. We’re just not trained that way,” Wallace told the AP.

She recounted a time when some Corning schools had express lines for a la carte products such as snacks and ice cream, in an attempt to reduce bottlenecks caused by full tray lunches that took longer to ring up.

The unintended results were disastrous.

“We were making it very convenient for them to quickly go through the line and get a bunch of less nutritious items,” she said.

After reviewing studies conducted by Wansink, the elementary schools renamed some of their healthier foods with names such as “x-ray vision carrots” and “lean, mean green beans”, and saw consumption increase.

Cafeteria workers also began changing how they interacted with the children, asking “would you rather have green beans or carrots today?” instead of waiting for a child to specifically request them.

They also found that simply asking a child whether they wanted a salad with their lunch during pizza day raised salad consumption 30 percent, Wansink said.

Apple iPads Available Through Walmart On Friday

Walmart will begin selling Apple’s iPad tablet computers online this Friday, and in more than 2,300 of its retail stores by mid-November.

“We’re pleased to give customers access to one of the most sought-after consumer electronics products this year,” a spokesman for the retailer said in an interview with the AFP news agency.

The Arkansas-based retail chain will sell the iPads for the same prices offered at Apple’s retail and online shops.  The lowest-priced iPad costs $499, while the top model is priced at $829.

Walmart rivals Best Buy and Target, along with online retailer Amazon.com, are already selling the iPad, which Apple launched in April.

Shares of Apple’s stock approached a record peak of $300 on Tuesday, rising $3.18 (1.08 percent), to close at $298.54.

The Cupertino, Calif.-based company sold 3.3 million iPads in the first quarter it was available, nearly triple the number of iPhones sold in the first full quarter they became available in 2007.

A few other factors are boosting investors’ sentiment for Apple. The company is planning a big push in the Chinese market, with 25 retail stores expected to open there by the end of next year. There are also hopes that the new lineup of iPods, which Apple unveiled in September, will be popular during the upcoming holiday shopping season.

Investors may also be anticipating that Verizon Wireless will get its own version of the iPhone next year. The iPhone is currently available only on AT&T Inc.’s network within the United States.

Apple is set to report its quarterly results next Monday. The company’s forecast is for net income of $3.44 per share, although Apple is well known for beating its own guidance by a comfortable margin. Analysts are expecting net income of $4.05 per share, representing growth of about 46 percent.

Rates Of Blood Transfusions For CABG Surgery Vary Widely Among US Hospitals

A study that includes data on more than 100,000 patients who underwent coronary artery bypass graft surgery, published in the October 13 issue of JAMA, finds that there is wide variability among hospitals in the U.S. in the use of blood transfusions, without a large difference in the rate of death, suggesting that many transfusions may be unnecessary. Another study in the same issue of JAMA examines the effect of a restrictive transfusion strategy on outcomes after cardiac surgery.

“Patients who undergo cardiac surgery receive a significant proportion of the 14 million units of allogeneic [taken from a different individual] red blood cells (RBCs) transfused annually in the United States,” the authors write. “Perioperative [around the time of surgery] blood transfusions are costly and have safety concerns. As a result, there have been multiple initiatives to reduce transfusion use. However, the degree to which perioperative transfusion rates vary among hospitals is unknown.”

Elliott Bennett-Guerrero, M.D., of Duke University Medical Center, Durham, N.C., and colleagues conducted a study to assess the use of RBC, fresh-frozen plasma, and platelet transfusions in coronary artery bypass graft (CABG) surgery in current practice, and to determine the degree to which transfusion practices vary among U.S. hospitals. The study included 102,470 patients undergoing CABG surgery during 2008 at 798 sites in the United States.

The researchers found significant variability in the observed hospital-specific transfusion rates for all 3 blood products among the patients and hospitals included in the study. To ensure that between-center differences would not be dominated by random statistical variation, the researchers also analyzed the subset of hospitals performing at least 100 eligible on-pump CABG operations during the year. At these 408 sites (n = 82,446 cases), the frequency of blood transfusion rates ranged from 7.8 percent to 92.8 percent for RBCs, 0 percent to 97.5 percent for fresh-frozen plasma, and 0.4 percent to 90.4 percent for platelets.

“Multivariate analysis including data from all 798 sites (102,470 cases) revealed that after adjustment for patient-level risk factors, hospital transfusion rates varied by geographic location, academic status, and hospital volume. However, these 3 hospital characteristics combined only explained 11.1 percent of the variation in hospital risk-adjusted RBC usage,” the authors write.

There was no significant association between hospital-specific RBC transfusion rates and all-cause mortality.

“As is the case in other areas of medicine, the degree of variability in clinical practice we observed represents a potential quality improvement opportunity. This is particularly complex in relation to transfusion practice in CABG surgery. The decision to transfuse has multiple triggers, resulting from a wide array of clinical scenarios and the consequent inability to apply standardized algorithms. The multiplicity of health care practitioners in CABG surgery care generates differences of opinion about safety and efficacy. Transfusion thresholds will change during the course of care; the threshold for a rapidly bleeding patient is different than for a stable patient postoperatively. Improvement in quality related to transfusion practice in CABG surgery is a multifactorial, complex but critically important, challenge,” the researchers write.

(JAMA. 2010;304[14]:1568-1575. Available pre-embargo to the media at www.jamamedia.org)

Editor’s Note: Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Editorial: Blood Transfusion as a Quality Indicator in Cardiac Surgery

In an accompanying editorial, Aryeh S. Shander, M.D., of Englewood Hospital and Medical Center, Englewood, N.J., and Lawrence T. Goodnough, M.D., of the Stanford University School of Medicine, Stanford, Calif., write that this study provides a snapshot of transfusion practices in a subset of patients undergoing cardiac surgery across the United States.

“The data showing highly variable transfusion rates are disconcerting. Yet despite magnitudes of differences between hospitals in terms of RBC transfusion rates, there were no significant differences in mortality rates between the hospitals. The absence of differences in mortality among centers with varying transfusion rates strongly suggests inappropriate transfusions.”

(JAMA. 2010;304[14]:1610-1611. Available pre-embargo to the media at www.jamamedia.org)

Editor’s Note: Please see the article for additional information, including financial disclosures, funding and support, etc.

Long-Lasting Mechanical Heart Implanted For The First Time In Canada In Heart-Failure Patient

In a Canadian first, the Peter Munk Cardiac Centre used a new kind of left ventricular assist device (LVAD) to treat a patient with advanced heart failure. The new device is longer lasting than older generation LVADs and may eliminate the need for a second LVAD, a major drawback with the old technology.

The patient, 61-year-old Marva Lorde of Mississauga, suffered a heart attack in 2007 and underwent several treatments for heart failure, including a 10-day intensive care unit stay, angioplasty and pacemaker implantation, culminating in a cardiac arrest in June 2008.

“I would have died in my sleep if the nurse hadn’t checked my heart monitor,” says Marva who is at home after receiving her LVAD on July 29, 2010. “Before the LVAD, I had trouble climbing the stairs to my bedroom, but now I’m back to my usual activities, including regular walks, and the stairs don’t give me trouble anymore.”

Marva’s LVAD, known as the DuraHeart, is designed for long-term cardiac support and reduces the risk of complications including strokes, infection and device failure, all of which can happen in mechanical heart devices. The device’s central pump is powered by magnetic levitation technology, which means its moving parts are held in place with magnets instead of bearings, eliminating wear and tear on the device. This technology enables blood to flow through the pump smoothly, which extends the life of the device and the life of the patient.

“With access to eight different types of cardiac assist devices, including the Duraheart, we have the most diverse array of circulatory support technology in Canada to best manage patients with end stage heart failure,” says Dr. Vivek Rao, Surgical Director of the Heart Transplant Program at the Peter Munk Cardiac Centre (PMCC), Toronto General Hospital.

Heart failure is the most common cause of hospitalization in North American adults and over 50,000 are treated for advanced heart failure annually. Transplantation is the only long-term treatment for end-stage heart failure patients and the long wait times for a matching donor organ make it necessary to find other alternatives.

Marva is currently on the heart transplant waiting list along with 36 other GTA residents, and her LVAD is her lifeline since her weak heart needs support until a new heart becomes available. This “bridge to life” is an innovative approach pioneered in Canada by Dr. Vivek Rao.

“Part of the centre’s mandate is to evaluate new technologies such as the Duraheart to help determine which patients are best suited to a specific device,” says Dr. Rao, whose team has implanted approximately 70 LVADs since 2004, giving hope to many patients and families.

The Duraheart device has been used in patients in Germany and the United States.

New Findings On Autoimmune Diseases

A deficiency in one of the immune system’s enzymes affects the severity of autoimmune diseases such as MS, and explains why the course of these diseases can vary so much. New findings give an insight into how this enzyme deficiency can be diagnosed, and could lead to new medicines, reveals a thesis from the Sahlgrenska Academy.

Multiple sclerosis (MS) and Guillain-Barré syndrome (GBS), the two autoimmune diseases covered by the thesis, can follow vastly different courses, with symptoms ranging from insignificant to life-threatening, the reason for which has been largely unknown. In the thesis the researchers have now found a factor in the immune defence that can explain this mechanism.

The immune system’s white blood cells play an important role in the fight against invading micro-organisms. They contain an enzyme called NADPH oxidase, which converts oxygen into reactive oxygen radicals. It has long been known that these oxygen radicals stop infections by breaking down micro-organisms. New studies using animal models have shown that inadequate production of oxygen radicals can lead to the development of autoimmune diseases, where a patient’s immune system attacks the body’s own tissues. This would indicate that oxygen radicals are important for preventing the occurrence of autoimmune diseases.

“We wanted to look at this in humans, and examined the NADPH oxidase in the white blood cells of patients with MS, GBS and recurring GBS (RGBS),” says Natalia Mossberg, doctoral student at the Institute of Neuroscience and Physiology at the Sahlgrenska Academy. “The results show that patients with more severe forms of the illness have lower levels of oxygen radical production in their white blood cells as a result of deficient NADPH oxidase function.”

The researchers discovered that the body’s ability to produce reactive oxygen radicals at an early stage in the immune defence against infections has a major impact on how these illnesses develop. “We’ve shown that a strong but controlled production of oxygen radicals by the immune system is important for subduing illnesses such as MS and GBS,” says Mossberg.

The researchers think that this method of measuring oxygen radical production in white blood cells can be used for investigating other autoimmune diseases and for diagnosing the severity of these illnesses. The discovery could also lead to a new approach to the treatment of MS in its early stages with medicines that trigger the production of NADPH oxidase or a vaccination for people at risk of developing this type of illness.

Physical Symptoms Prevalent No Matter What Stage Of Cancer Including Remission

Twenty-two physical symptoms associated with cancer, symptoms often unrecognized and undertreated, are prevalent in all types of cancers regardless of whether the patient is newly diagnosed, undergoing treatment or is a cancer survivor, according to researchers from the Regenstrief Institute and the Indiana University schools of medicine and nursing.

Common symptoms include fatigue, pain, weakness, appetite loss, dry mouth, constipation, insomnia and nausea. These physical symptoms are associated with substantial functional impairment, disability and diminished quality of life.

The study of 405 patients was reported in the Oct. 11, 2010, issue of the Archives of Internal Medicine. Numerous physical symptoms, rather than just a few, were prevalent in patients with cancer and this prevalence did not diminish after completion of therapy.

“We found that regardless of where they are in the course of their diseases, many individuals with cancer have a high symptom burden,” said Kurt Kroenke, M.D., the study’s principal investigator and first author. Dr. Kroenke is a Regenstrief Institute investigator and a Chancellor’s Professor of Medicine in the IU School of Medicine.

“These symptoms impact them at home and at work throughout their lives,” he said.

Study participants, all of whom had pain, depression or both, experienced substantial disability, reporting on average 17 of the past 28 days as either bed days or days in which they had to cut down on activities by at least 50%. Almost all patients reported feeling tired (97.5%) and most (78.8%) were bothered “a lot” by this symptom. Of the 22 symptoms studied, 15 were reported by more than half of the study participants.

In spite of high symptom prevalence, the researchers did not uncover greater use of the health care system. There may be several explanations for this including patients’ inclinations to focus on cancer treatment while with their physicians or to accept the symptoms as an inevitable result of the disease or its treatment. Alternatively, the explanation may lie with the fact that those in the study, as cancer patients or former patients, were already frequently interacting with many parts of the health care system.

“Patients and their families should be encouraged to bring up symptoms like pain or insomnia with physicians. But because oncologists are necessarily focused on treatment of the cancer itself, they often have insufficient time to optimally evaluate and manage symptoms and other factors impacting quality of life. We have shown in an earlier study that one effective solution might be a partnership between a telephone-based symptom management team and community-based oncology practices,” said Dr. Kroenke, who is a research scientist with the Center for Implementing Evidence-Based Practice at the Richard Roudebush VA Medical Center and an Indiana University Melvin and Bren Simon Cancer Center member.

The previous study, published earlier in 2010 in the Journal of the American Medical Association, reported that an economical, centralized approach is feasible to conduct and significantly improved symptoms of pain and depression in patients in any phase of cancer. That approach gave patients, many of whom lived in underserved rural areas, one-stop assistance they probably wouldn’t have had access to unless they went to a major cancer center, Kroenke said.

Recognizing and managing physical symptoms such as fatigue, pain, nausea, and insomnia may make a significant difference regardless of type or phase of cancer. The researchers plan to investigate medical and behavioral strategies and combinations of both approaches to control these symptoms.

On the Net:

Researchers Confirm Black Death Killer Bacteria

Black Death, one of the deadliest pandemics in human history, has been confirmed by anthropologists to have been caused by a germ called Yersinia pestis.

Researchers studied tooth and bone samples from 76 skeletons discovered in “plague pits” in France, Germany, Italy and the Netherlands and found DNA evidence that Y. pestis was to blame for the plague that wiped out nearly a third of Europe’s population during the Middle Ages.

It was believed for more than a century that Y. pestis was the source of the so-called Black Death, which plagued Europe from the 1300s to the 1700s and possibly early 1800s. But scientific data to prove the bacterium was the actual culprit has been sketchy at best.

Because of this, many rival theories have come about, including opinions that an Ebola-style virus or the anthrax germ was to blame.

The new evidence also sheds light on the geographical route the germ took, which is believed to have originated in central or southern Asia before making its way into Europe through trade routes.

In the samples in which researchers found Y. pestis genes, they tested 20 DNA markers to identify the particular bacterial strain.

They wanted to know if it matched either of the other two Y. pestis strains that are floating around today, mostly in Africa, America, the Middle East and Russia. But neither of the two modern types, Orientalis and Medievalis, showed up.

Instead, two previously unknown types were found, both of them older than today’s strains and different from each other.

“The history of this pandemic is much more complicated than we had previously thought,” said Stephanie Haensch, a co-leader of the research, at Johannes Gutenberg University in Mainz, Germany.

Y. pestis showed up in Western Europe in November 1347, believed to have been carried by fleas living on rats that came ashore from a merchant ship docked at the Mediterranean French port of Marseille.

After making landfall, it took six years for it to spread through western France to northern France and then over to England, again through commerce.

However, a different strain was discovered in a mass burial site in Bergen op Zoom, a port in the southern Netherlands, suggesting that Y. pestis also made entry from the north, perhaps from Norway via the Dutch province Friesland.

After its initial surge in 1347, the pandemic progressively spread around Europe, reducing the population by as much as 60 percent over its four-century run and causing far-reaching social and political impacts.

The findings were published in the open-access journal PLoS Pathogens.

Image Caption: A scanning electron microscope micrograph depicting a mass of Yersinia pestis bacteria. Credit: Rocky Mountain Laboratories, NIAID, NIH

On the Net:

The Largest National Study on Sexual Behavior

(Ivanhoe Newswire) — A recent study pulls down the covers on contemporary Americans’ sexual behaviors. The findings, based on 5,865 adolescents and adults ages 14 to 94, include a description of more than 40 combinations of sexual acts that people perform when the lights go out, patterns of condom use by adolescents and adults, and the percentage of Americans participating in same-sex encounters.

The National Survey of Sexual Health and Behavior (NSSHB), one of the most inclusive studies on sexual behavior in roughly two decades, was conducted by researchers from the Center for Sexual Health Promotion (CSHP) in Indiana University’s School of Health, Physical Education, and Recreation (HPER).

“This survey is one of the most expansive nationally representative studies of sexual behavior and condom use ever conducted, given the 80-year span of ages,” Michael Reece, director of the Center for Sexual Health Promotion, was quoted as saying. “These data about sexual behaviors and condom use in contemporary America are critically needed by medical and public health professionals who are on the front lines addressing issues such as HIV, sexually transmissible infections and unintended pregnancy.”

The initial findings of the survey show that one in four acts of vaginal intercourse is condom protected in the United States (one in three among singles).

“These data, when compared to other studies in the recent past, suggest that although condom use has increased among some groups, efforts to promote the use of condoms to sexually active individuals should remain a public health priority,” Reece added.

“People are often curious about others’ sex lives,” Debby Herbenick, associate director of the CSHP, was quoted as saying. “They want to know how often men and women in different age groups have sex, the types of sex they engage in, and whether they are enjoying it or experiencing sexual difficulties. Our data provide answers to these common sex questions and demonstrate how sex has changed in the nearly 20 years since the last study of its kind.”

The study helps promote awareness of patterns of condom use across different stages of people’s relationships and across ages, for both the public and professionals alike. Herbenick added that these “findings show that condoms are used twice as often with casual sexual partners as with relationship partners, a trend that is consistent for both men and women across age groups that span 50 years.”

Furthermore, there is vast variability in the sexual repertoires of U.S. adults today, and adult men and women seldom engage in just one sexual act when they have sex. When it comes to sex, responsibility appears to be as important as pleasure: condom use was reported to be highest among African-Americans and Hispanic-Americans, compared with Caucasians and other racial groups.

“Many surveys of adolescent sexual behavior create an impression that adolescents are becoming sexually active at younger ages, and that most teens are sexually active,” Dennis Fortenberry, M.D., professor of pediatrics in the IU School of Medicine, was quoted as saying. “Our data show that partnered sexual behaviors are important but by no means pervasive aspects of adolescents’ lives. In fact, many contemporary adolescents are being responsible by abstaining or by using condoms when having sex.”

Additional key findings highlighted in the collection of papers include:

• There is enormous variability in the sexual repertoires of U.S. adults, with more than 40 combinations of sexual activity described at adults’ most recent sexual event.

• Many older adults continue to have active pleasurable sex lives, reporting a range of different behaviors and partner types; however, adults over the age of 40 have the lowest rates of condom use. Although these individuals may not be as concerned about pregnancy, this suggests the need to enhance education efforts for older individuals regarding STI risks and prevention.

• About 85 percent of men report that their partner had an orgasm at the most recent sexual event; this compares to the 64 percent of women who report having had an orgasm at their most recent sexual event. (A difference that is too large to be accounted for by some of the men having had male partners at their most recent event.)

• While about 7 percent of adult women and 8 percent of men identify as gay, lesbian or bisexual, the proportion of individuals in the U.S. who have had same-gender sexual interactions at some point in their lives is higher.

• At any given point in time, most U.S. adolescents are not engaging in partnered sexual behavior. While 40 percent of 17-year-old males reported vaginal intercourse in the past year, only 27 percent reported the same in the past 90 days.

• Adults using a condom for intercourse were just as likely to rate the sexual event positively in terms of arousal, pleasure and orgasm as when having intercourse without one.

SOURCE: The Journal of Sexual Medicine, October 2010

Unemployment Linked to Abuse of Children

(Ivanhoe Newswire) — Poverty and the stresses that come with it have long been linked to neglect and child abuse, but this study shows that rising unemployment leads to increased child maltreatment just one year later. With unemployment rates climbing in the U.S., it is important not to let the stress of hard times fall on innocent children.

The researchers reviewed state-level unemployment statistics from the Bureau of Labor Statistics, and compared them with child maltreatment data from the National Child Abuse and Neglect Data System (NCANDS), during the years 1990 to 2008. Each 1 percent increase in unemployment was associated with at least a 0.50 per 1,000 increase in confirmed child maltreatment reports one year later. In addition, higher levels of unemployment appeared to raise the likelihood of child maltreatment, as it was not only the lagged change in unemployment, but also the previous year’s unemployment level that influenced the number of child abuse cases.
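
To put the reported association in concrete terms, here is a minimal back-of-the-envelope sketch in Python. The 0.50-per-1,000 coefficient and the one-year lag come from the study as described above; the state child population and the size of the unemployment increase are hypothetical figures chosen purely for illustration, not data from the report.

    # Illustrative arithmetic only -- not the authors' statistical model.
    # Reported association: each 1-point rise in the unemployment rate was linked
    # to at least a 0.50 per 1,000 increase in confirmed maltreatment reports
    # one year later.
    COEFF_PER_1000 = 0.50

    def extra_reports_next_year(unemployment_rise_pct: float, child_population: int) -> float:
        """Estimate the additional confirmed reports implied by the association."""
        return COEFF_PER_1000 * unemployment_rise_pct * child_population / 1000

    # Hypothetical state with 1.5 million children and a 2-point rise in unemployment:
    # roughly 1,500 additional confirmed reports the following year.
    print(extra_reports_next_year(2.0, 1_500_000))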

According to the study, a rise in unemployment that lasts for years affects not only the country’s economy but also the physical and mental health of children. These children suffer the immediate physical and emotional abuse associated with a lost job, dealing with physical injury, mental and emotional abuse, neglect, and sometimes even death. Research has shown that they are at an increased risk of physical and mental health effects that usually last their entire lives.

“When times are bad, children suffer,” study author Robert Sege, MD, PhD, FAAP, professor of pediatrics, Boston University School of Medicine, and director, Division of Ambulatory Pediatrics, Boston Medical Center was quoted as saying. “These results suggest that programs to strengthen families and prevent maltreatment should be expanded during economic downturns.”

SOURCE: The American Academy of Pediatrics (AAP) National Conference and Exhibition, held in San Francisco, October 2010

Vitamin D Deficiency Rampant In Patients Undergoing Orthopedic Surgery, Damaging Patient Recovery

Doctors provide strategy to improve outcomes

Almost 50 percent of patients undergoing orthopedic surgery have vitamin D deficiency that should be corrected before surgery to improve patient outcomes, based on a study by researchers at Hospital for Special Surgery (HSS) in New York City. Vitamin D is essential for bone healing and muscle function and is critical for a patient’s recovery. The study appears in the October issue of The Journal of Bone and Joint Surgery.

“In the perfect world, test levels, fix and then operate,” said Joseph Lane, M.D., professor of Orthopedic Surgery and chief of the Metabolic Bone Disease Service at HSS, who led the study. “If you put people on 2,000-4,000 [IU] of vitamin D based on what their deficient value was, you can usually get them corrected in four to six weeks, which is when you are really going to need the vitamin D. If you are really aggressive right before surgery, you can correct deficient levels quickly, but you have to correct it, measure it, and then act on it.”

According to Dr. Lane, bone remodeling or bone tissue formation, a part of the healing process, occurs about two to four weeks after surgery. This is the critical stage when your body needs vitamin D.

For their study, investigators conducted a retrospective chart review of 723 patients who were scheduled for orthopedic surgery between January 2007 and March 2008 at HSS. They examined the vitamin D levels, which had been measured in all patients before their surgery, and found that 43 percent had insufficient vitamin D and 40 percent had deficient levels.

Vitamin D inadequacy was defined as any level below 32 ng/mL, with insufficiency defined as 20 to <32 ng/mL and deficiency as <20 ng/mL. Problems were more prevalent in younger patients, men and individuals with dark skin (blacks and Hispanics).
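
For readers who want the cutoffs at a glance, the following short Python sketch encodes the study’s definitions as a simple classifier. The thresholds are taken from the article; the function name and the assumption that inadequacy covers everything below 32 ng/mL are illustrative choices, not code from the researchers.

    # Illustrative classifier based on the cutoffs reported in the HSS study.
    def vitamin_d_status(level_ng_per_ml: float) -> str:
        """Classify a serum vitamin D level (ng/mL) using the study's definitions."""
        if level_ng_per_ml < 20:
            return "deficient"      # < 20 ng/mL
        if level_ng_per_ml < 32:
            return "insufficient"   # 20 to < 32 ng/mL
        return "adequate"           # >= 32 ng/mL; anything below 32 counted as inadequate

    print(vitamin_d_status(15))  # deficient
    print(vitamin_d_status(25))  # insufficient
    print(vitamin_d_status(40))  # adequate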

The highest levels of deficiency were seen in patients in the trauma service, where 66 percent of patients had insufficient levels and 52 percent had deficient levels. Of the patients undergoing foot and ankle surgery, 34 percent had inadequate levels and of patients undergoing hand surgery, 40 percent had insufficient levels.

In the Sports Medicine Service, 52.3 percent had insufficient levels; of these, one-third (17 percent of the total) had deficient levels. “We frequently see stress fractures in the Sports Medicine Service and if you want to heal, you have to fix the calcium and vitamin D,” Dr. Lane said.

In the Arthroplasty Service, which conducts hip and knee replacements, 38 percent had inadequate levels and 48 percent had deficient levels. “With arthroplasty, there is a certain number of patients in whom, when you put in the prosthesis, it breaks the bone adjacent to the prosthesis, which can really debilitate patients.” This could be prevented or minimized by correcting vitamin D levels. Dr. Lane also explained that they now perform procedures in which they grow bone into a prosthesis without using cement. “In those people, it would be an advantage to have adequate vitamin D, because it matures the bone as it grows in; it is really healing into the prosthesis,” he said.

“The take home message is that low vitamin D has an implication in terms of muscle and fracture healing, it occurs in about 50 percent of people coming in for orthopedic surgery, and it is eminently correctable,” Dr. Lane said. “We recommend that people undergoing a procedure that involves the bone or the muscle should correct their vitamin D if they want an earlier, faster, better result. What we are saying is ‘wake up guys, smell the coffee; half of your patients have a problem, measure it, and if they are low, then fix it.’”

In recent years, vitamin D deficiency has been recognized as a common phenomenon with several causes. First, the vitamin is difficult to obtain from foods, with exceptions such as cod liver oil and fish. Second, until recently the recommended daily allowance was set too low, so foods were not supplemented with adequate doses. And third, while people can make vitamin D from sunlight, they now often work long hours indoors and use sunscreen that impedes vitamin D production.

On the Net:

U.S. Life Expectancy Lowered By Poor Health Care

Researchers from Columbia University have found that the life expectancy of Americans is falling behind those of other countries, and the reason why might come as a bit of a surprise.

It’s not obesity rates, or smoking, or traffic accidents, or even homicides. Rather, the primary culprit, according to Peter Muennig and Sherry Glied of the university’s Mailman School of Public Health, is the poor quality of health care in the United States.

In the Commonwealth Fund-sponsored study, Muennig and Glied examined health spending, a wide array of behavioral risk factors, and 15-year survival rates for men and women ages 45 to 65 in the U.S., Australia, Austria, Belgium, Canada, France, Germany, Italy, Japan, the Netherlands, Sweden, Switzerland, and the United Kingdom.

They found that, although the U.S. achieved gains in survival rates in each decade from 1975 through 2005, many of the other countries outperformed it in those gains, even as American per capita health care spending increased twice as rapidly as that of those other nations.

The life expectancy for 45-year-old American males dropped from third in 1975 to 12th in 2005, but according to an October 7 press release from the Commonwealth Fund, a private foundation supporting independent research on health policy reform, “Forty-five year old U.S. white women fared the worst–by 2005 their 15-year survival rates were lower than that of all the other countries. Moreover, the survival rates of this group in 2005 had not even surpassed the 1975 15-year survival rates for Swiss, Swedish, Dutch or Japanese women.”

“It was shocking to see the U.S. falling behind other countries even as costs soared ahead of them,” Muennig, an assistant professor at Columbia and the lead author of a paper that has been published in Health Affairs, said in a statement. “But what really surprised us was that all of the usual suspects… are not the culprits.”

“The U.S. doesn’t stand out as doing any worse in these areas than any of the other countries we studied, leading us to believe that failings in the U.S. health care system, such as costly specialized and fragmented care, are likely playing a large role in this relatively poor performance on improvements in life expectancy,” he added.

According to Reuters Health and Science Editor Maggie Fox, American citizens spend an average of nearly $7,300 per person annually–twice as much as residents of other developed nations–but tend to receive “lower quality and less efficiency” in exchange. Each of the 12 other nations studied in the survey provided universal health care for their people, Fox reports.

On the Net:

From Eye To Brain

Salk researchers map functional connections between retinal neurons at single-cell resolution

By comparing a clearly defined visual input with the electrical output of the retina, researchers at the Salk Institute for Biological Studies were able to trace for the first time the neuronal circuitry that connects individual photoreceptors with retinal ganglion cells, the neurons that carry visual signals from the eye to the brain.

Their measurements, published in the Oct. 7, 2010, issue of the journal Nature, not only reveal computations in a neural circuit at the elementary resolution of individual neurons but also shed light on the neural code used by the retina to relay color information to the brain.

“Nobody has ever seen the entire input-output transformation performed by complete circuits in the retina at single-cell resolution,” says senior author E.J. Chichilnisky, Ph.D., an associate professor in the Systems Neurobiology Laboratories. “We think these data will allow us to more deeply understand neuronal computations in the visual system and ultimately may help us construct better retinal implants.”

One of the essential elements that made the experiments possible was the unique neural recording system developed by an international team of high-energy physicists from the University of California, Santa Cruz; the AGH University of Science and Technology, Krakow, Poland; and the University of Glasgow, UK. This system is able to record simultaneously the tiny electrical signals generated by hundreds of the retinal output neurons that transmit information about the outside visual world to the brain. These recordings are made at high speed (over ten million samples each second) and with fine spatial detail, sufficient to detect even a locally complete population of the tiny and densely spaced output cells known as “midget” retinal ganglion cells.

Retinal ganglion cells are classified based on their size, the connections they form, and their responses to visual stimulation, which can vary widely. Despite their differences, they all have one thing in common: a long axon that extends into the brain and forms part of the optic nerve.

Visual processing begins when photons entering the eye strike one or more of the 125 million light-sensitive nerve cells in the retina. This first layer of cells, which are known as rods and cones, converts the information into electrical signals and sends them to an intermediate layer, which in turn relays signals to the 20 or so distinct types of retinal ganglion cells.

In an earlier study, Chichilnisky and his team found that each type of retinal ganglion cell forms a seamless lattice covering visual space that transmits a complete visual image to the brain. In the current study, postdoctoral researcher and co-first author Greg D. Field, Ph.D., and his collaborators zoomed in on the pattern of connectivity between these retinal ganglion cells and the full lattice of cone receptors.

The Salk researchers simultaneously recorded hundreds of retinal ganglion cells, and based on density and light response properties, identified five cell types: ON and OFF midget cells, ON and OFF parasol cells, and small bistratified cells, which collectively account for approximately 75 percent of all retinal ganglion cells.

To resolve the fine structure of receptive fields, the small, irregularly shaped windows through which neurons in the retina view the world, the authors used stimuli with tenfold smaller pixels. “Instead of a diffuse region of light sensitivity, we detected punctate islands of light sensitivity separated by regions of no light sensitivity,” he says.

When combined with information on spectral sensitivities of individual cones, maps of these punctate islands not only allowed the researchers to recreate the full cone mosaic found in the retina, but also to conclude which cone fed information to which retinal ganglion cell.

“Just by stimulating input cells and taking a high density recording from output cells, we can identify all individual input and output cells and find out who is connected to whom,” says Chichilnisky.
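
The following short Python sketch illustrates, in schematic form, the kind of bookkeeping this mapping implies: each ganglion cell’s measured sensitivity at the candidate cone locations is thresholded to read off which cones feed it. The cell names, weights and threshold below are invented for illustration; this is not the authors’ analysis code.

    # Illustrative only: infer cone-to-ganglion-cell connections by thresholding
    # measured sensitivity weights from fine-grained receptive-field maps.
    weights = {
        "ON_midget_1":   {"cone_A": 0.90, "cone_B": 0.05, "cone_C": 0.40},
        "OFF_parasol_1": {"cone_A": 0.10, "cone_B": 0.70, "cone_C": 0.65},
    }
    cone_types = {"cone_A": "L (red)", "cone_B": "M (green)", "cone_C": "S (blue)"}

    THRESHOLD = 0.3  # arbitrary cutoff separating genuine input from noise

    for rgc, inputs in weights.items():
        connected = [f"{c} [{cone_types[c]}]" for c, w in inputs.items() if w >= THRESHOLD]
        print(rgc, "receives input from:", ", ".join(connected))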

Chichilnisky and his team discovered that populations of ON and OFF midget and parasol cells each sampled the complete population of cones sensitive to red or green light, with midget cells sampling these cones in a surprisingly non-random fashion. Only OFF midget cells frequently received strong input from cones sensitive to blue light.

The research was funded in part by the Helen Hay Whitney Foundation, the German Research Foundation, the National Institutes of Health, the Chapman Foundation, the Miller Institute for Basic Research in Science, the Polish Ministry of Science and Higher Education, the Burroughs Wellcome Trust, the McKnight Foundation, the National Science Foundation, the Sloan Foundation, the Engineering and Physical Sciences Research Council and The Royal Society of Edinburgh.

Researchers who also contributed to the work include co-first author Jeffrey L. Gauthier, Ph.D., Martin Greschner, Timothy A. Machado, Lauren H. Jepson, and Jonathon Shlens in the Systems Neurobiology Laboratory at the Salk Institute, co-first author Alexander Sher and Alan Litke at the Santa Cruz Institute for Particle Physics at the University of California, Santa Cruz, Deborah E. Gunning and Keith Mathieson in the Department of Physics and Astronomy at the University of Glasgow, Wladyslaw Dabrowski at the Faculty of Physics and Applied Computer Science at the AGH University of Science and Technology in Krakow, and Liam Paninski in the Department of Statistics and Center for Theoretical Neuroscience at Columbia University, New York.

Image 1: Being able to record from hundreds of retinal ganglion cells, E.J. Chichilnisky and his team could trace the connections of individual photoreceptors (red, green and blue dots) to individual retinal ganglion cells. Credit: Dr. E.J. Chichilnisky and the journal Nature

Image 2: A unique neural recording system developed by an international team of high energy physicists, which is able to record simultaneously the tiny electrical signals generated by hundreds of the retinal output neurons, is one of the essential elements of the study. Recording electrodes are shown in the foreground and retinal ganglion cells in the background. Credit: Dr. E.J. Chichilnisky, Salk Institute for Biological Studies

Image 3: Photoreceptors in the retina convert visual information into electrical signals and send them to an intermediate layer, which in turn relays signals to the 20 or so distinct types of retinal ganglion cells. Credit: Jamie Simon, Salk Institute for Biological Studies

On the Net:

2009 H1N1 Pandemic — What Went Right And What Went Wrong?

In this week’s PLoS Medicine, Gabriel Leung from the Government of the Hong Kong SAR and Angus Nicoll from the European Centre for Disease Prevention and Control offer their reflections on the international response to the 2009 H1N1 influenza pandemic, including what went well and what changes need to be made on the part of global and national authorities in anticipation of future flu pandemics.

On the Net:

Over 200 New Species Found In Papua New Guinea

Scientists found over 200 new species in the Pacific Islands of Papua New Guinea, including a long-snouted frog and a white-tailed mouse.

Scientists found an exciting assortment of new mammals, amphibians, insects and plants through a survey of remote New Britain island and the Southern Highlands ranges.

“To find a completely new genus of mammal in this day and age is pretty cool,” said lead researcher Steve Richards of the new mouse species discovery.

“I mean, people have heard of birds of paradise and tree-climbing kangaroos and stuff, but when you look even closer at the small things you just realize that there’s a staggering diversity out there that we really know nothing about,” he told AFP.

Papua New Guinea’s jungles are one of three wild rainforest areas left in the world, along with the Amazon and the Congo basin.  Richards told AFP that the islands were a vast “storehouse” of biodiversity, with scores of new species found by his Conservation International team.

He said that the “very, very beautiful mouse,” the 0.8-inch long-snouted frog and another frog with bright yellow spots were among the highlights of the discovery.

“I would say that pretty much no matter where you go in New Guinea you’re guaranteed to pick up new or poorly known spectacular species,” Richards, an expert in frogs and reptiles who is based in Cairns, Australia, told AFP.

“For some lesser known groups only half of the things that we document actually have names, we aren’t even a fraction of the way there,” he added.

Biologists were unable even to enter some areas because of the mountainous terrain.

Richards said there were “large areas of New Guinea that are pretty much unexplored biologically.”

He said that samples were taken from a number of species, and genetic testing confirmed that the new mouse was not closely related to any known creature.

“These kind of discoveries are almost kind of a good news story amongst all the gloom,” he told AFP, referring to the creeping extinction of other creatures.

“There really are spectacular species still out there and there really is a potential for things to survive.”

On the Net:

New Graphene Fabrication Method Uses Silicon Carbide Templates To Create Desired Growth

Smoothing the edges

Researchers at the Georgia Institute of Technology have developed a new “templated growth” technique for fabricating nanometer-scale graphene devices. The method addresses what had been a significant obstacle to the use of this promising material in future generations of high-performance electronic devices.

The technique involves etching patterns into the silicon carbide surfaces on which epitaxial graphene is grown. The patterns serve as templates directing the growth of graphene structures, allowing the formation of nanoribbons of specific widths without the use of e-beams or other destructive cutting techniques. Graphene nanoribbons produced with these templates have smooth edges that avoid electron-scattering problems.

“Using this approach, we can make very narrow ribbons of interconnected graphene without the rough edges,” said Walt de Heer, a professor in the Georgia Tech School of Physics. “Anything that can be done to make small structures without having to cut them is going to be useful to the development of graphene electronics because if the edges are too rough, electrons passing through the ribbons scatter against the edges and reduce the desirable properties of graphene.”

The new technique has been used to fabricate an array of 10,000 top-gated graphene transistors on a 0.24 square centimeter chip, believed to be the largest density of graphene devices reported so far.

The research was reported Oct. 3 in the advance online edition of the journal Nature Nanotechnology. The work was supported by the National Science Foundation, the W.M. Keck Foundation and the Nanoelectronics Research Initiative Institute for Nanoelectronics Discovery and Exploration (INDEX).

In creating their graphene nanostructures, de Heer and his research team first use conventional microelectronics techniques to etch tiny “steps,” or contours, into a silicon carbide wafer. They then heat the contoured wafer to approximately 1,500 degrees Celsius, which initiates melting that polishes any rough edges left by the etching process.

They then use established techniques for growing graphene from silicon carbide by driving off the silicon atoms from the surface. Instead of producing a consistent layer of graphene one atom thick across the surface of the wafer, however, the researchers limit the heating time so that graphene grows only on the edges of the contours.

To do this, they take advantage of the fact that graphene grows more rapidly on certain facets of the silicon carbide crystal than on others. The width of the resulting nanoribbons is proportional to the depth of the contour, providing a mechanism for precisely controlling their width. To form complex graphene structures, multiple etching steps can be carried out to create a complex template, de Heer explained.
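
As a way of seeing what “width proportional to depth” buys you in practice, here is a tiny Python sketch. The proportionality itself is what the article describes; the numerical constant is hypothetical, since in reality it would be set by the silicon carbide facet geometry and the growth conditions.

    # Illustrative only: the article states ribbon width scales with etched step depth.
    K_WIDTH_PER_DEPTH = 2.0  # hypothetical nm of ribbon width per nm of step depth

    def step_depth_for_width(target_width_nm: float) -> float:
        """Etch depth implied by a target nanoribbon width under width = K * depth."""
        return target_width_nm / K_WIDTH_PER_DEPTH

    # A 40 nm ribbon (the narrowest mentioned in the article) would need a 20 nm
    # step under this made-up constant.
    print(step_depth_for_width(40))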

“By using the silicon carbide to provide the template, we can grow graphene in exactly the sizes and shapes that we want,” he said. “Cutting steps of various depths allows us to create graphene structures that are interconnected in the way we want them to be.”

In nanometer-scale graphene ribbons, quantum confinement makes the material behave as a semiconductor suitable for creation of electronic devices. But in ribbons a micron or more wide, the material acts as a conductor. Controlling the depth of the silicon carbide template allows the researchers to create these different structures simultaneously, using the same growth process.

“The same material can be either a conductor or a semiconductor depending on its shape,” noted de Heer, who is also a faculty member in Georgia Tech’s National Science Foundation-supported Materials Research Science and Engineering Center (MRSEC). “One of the major advantages of graphene electronics is to make the device leads and the semiconducting ribbons from the same material. That’s important to avoid electrical resistance that builds up at junctions between different materials.”

After formation of the nanoribbons, which can be as narrow as 40 nanometers, the researchers apply a dielectric material and metal gate to construct field-effect transistors. While successful fabrication of high-quality transistors demonstrates graphene’s viability as an electronic material, de Heer sees them as only the first step in what could be done with the material.

“When we manage to make devices well on the nanoscale, we can then move on to make much smaller and finer structures that will go beyond conventional transistors to open up the possibility for more sophisticated devices that use electrons more like light than particles,” he said. “If we can factor quantum mechanical features into electronics, that is going to open up a lot of new possibilities.”

De Heer and his research team are now working to create smaller structures, and to integrate the graphene devices with silicon. The researchers are also working to improve the field-effect transistors with thinner dielectric materials.

Ultimately, graphene may be the basis for a generation of high-performance devices that will take advantage of the material’s unique properties in applications where the higher cost can be justified. Silicon will continue to be used in applications that don’t require such high performance, de Heer said.

“This is another step showing that our method of working with epitaxial graphene on silicon carbide is the right approach and the one that will probably be used for making graphene electronics,” he added. “This is a significant new step toward electronics manufacturing with graphene.”

On the Net:

Bricks Made With Wool

Spanish and Scottish researchers have added wool fibers to the clay material used to make bricks and combined these with an alginate, a natural polymer extracted from seaweed. The result is bricks that are stronger and more environmentally-friendly, according to the study published recently in the journal Construction and Building Materials.

“The objective was to produce bricks reinforced with wool and to obtain a composite that was more sustainable, non-toxic, using abundant local materials, and that would mechanically improve the bricks’ strength”, Carmen Galán and Carlos Rivera, authors of the study and researchers at the Schools of Architecture in the Universities of Seville (Spain) and Strathclyde (Glasgow, United Kingdom), tell SINC.

The wool fibers were added to the clay material used in the bricks, using alginate conglomerate, a natural polymer found in the cell walls of seaweed. The mechanical tests carried out showed the compound to be 37% stronger than other bricks made using unfired stabilized earth.

The study, recently published in the journal Construction and Building Materials, was the result of close collaboration between the British and Spanish universities. The clay-based soils were provided by brick manufacturers in Scotland, which was also the source of the wool, since the local textile industry cannot use everything it produces. “The aim was to produce a material suitable for adverse climatic conditions, such as those in the United Kingdom,” the authors explain.

Advantages of environmentally-friendly bricks

The researchers studied the effect of reinforcing various soil types with sheep’s wool and arrived at several conclusions. “These fibers improve the strength of compressed bricks, reduce the formation of fissures and deformities as a result of contraction, reduce drying time and increase the bricks’ resistance to flexion”.

This piece of research is one of the initiatives involved in efforts to promote the development of increasingly sustainable construction materials. These kinds of bricks can be manufactured without firing, which contributes to energy savings. According to the authors: “This is a more sustainable and healthy alternative to conventional building materials such as baked earth bricks and concrete blocks”.

Untreated clay was one of the earliest building materials used by humankind. The oldest examples can be found in houses in the Near East dating from between 11,000 and 12,000 years ago. Earthen material mixed with plants and pebbles to make it stronger has also been found in archaeological deposits from 1400 BCE in Sardinia (Italy).

References:

C. Galán-Marín, C. Rivera-Gómez and J. Petric. “Clay-based composite stabilized with natural polymer and fiber”. Construction and Building Materials 24(8): 1462, August 2010.

Galán-Marín, C.; Rivera-Gómez, C.; Petric-Gray, J. “Effect of Animal Fibers Reinforcement on Stabilized Earth Mechanical Properties”. Journal of Biobased Materials and Bioenergy 4(2): 121-128, June 2010.

Image 1: These are bricks made with wool. Credit: Galán-Marín et al.

Image 2: This machine is making bricks using wool. Credit: Galán-Marín et al.

On the Net:

Scientists Develop New Botox Treatment For Pain

British scientists have developed a new way of joining and rebuilding molecules to refine the anti-wrinkle treatment Botox and improve its use for Parkinson’s, cerebral palsy, and chronic migraine.

Researchers at the Medical Research Council’s Laboratory of Molecular Biology said their results open up new ways to help develop new types of Botox, which may be used as long-term painkillers.

“It will now be possible to produce Botox-based medicines in a safer and more economical way,” Bazbek Davletov, who led the study, said in a statement about his findings.

In a report of their work in the Proceedings of the National Academy of Sciences (PNAS), the researchers said that by breaking Botox molecules down into two separate building blocks, the team was able to produce them separately and safely, and then “clip” them back together again.

They said the new clipping method produced a refined Botox-like molecule that could eventually be put to practical use without the unwanted toxic effects.

Clostridium botulinum neurotoxin, commonly known as Botox, has been used increasingly as a medical treatment, in which doctors exploit its ability to relax muscles and nerves to try and still spasms and tremors like those in patients with Parkinson’s disease.

Britain became the first country to approve the drug as a treatment for migraine in July.

However, the substance is extremely toxic and can only be used in a diluted form, a factor that limits its development for other uses.

Davletov said the new refining technique could allow scientists to produce new forms of Botox with wider practical medicinal uses.

“This is the first time we have been able to treat protein molecules like Lego building blocks, mixing and matching them to create the basis for treatments that would not previously have been possible,” he said in a statement.

He said that the method could potentially allow researchers to develop a form of chronic pain relief, which could last as long as a single Botox injection.

On the Net:

Vaccines For Smallpox And Anthrax Combined

US government researchers announced Monday they have combined smallpox and anthrax vaccines into one vaccine, improving them and making them safer, faster-acting and more effective in case of a biological attack.

“Although licensed vaccines are available for both smallpox and anthrax, because of inadequacies associated with each of these vaccines, serious concerns remain as to the deployability of these vaccines, especially in the aftermath of a bioterror attack involving these pathogens,” wrote Liyanage Perera of the US National Cancer Institute and colleagues.

The researchers said the new dual vaccine can be freeze-dried, stockpiled and quickly delivered when it is needed.

Perera and colleagues at the US Food and Drug Administration, along with the small Maryland biotech firm JDM Technologies, had earlier reported improvements to Wyeth’s DryVax vaccine against smallpox.

Smallpox was wiped out in 1979 after a persistent global vaccination campaign. But some experts believe that the former Soviet Union had developed smallpox viruses into biological weapons and that they could have made their way into enemy hands.

Anthrax can also be developed into a deadly weapon.

In the weeks and months following the 9-11 attacks on the World Trade Center and the Pentagon, five people died and more than a dozen were injured when someone — or some group — mailed letters that had carried the deadly anthrax spores. A government researcher at Fort Detrick in Maryland was the chief suspect but he committed suicide before he could be charged.

The current vaccine for smallpox has potentially deadly side-effects. It is not used in the general population and people born after 1972 are very unlikely to have been vaccinated.

The current anthrax vaccine does not work well and several companies are working on alternative solutions, some with multimillion-dollar contracts from the US government. Both vaccines are given to the US military.

Perera and colleagues adjusted the current smallpox vaccine by adding an immune system compound called interleukin-15, and genetically altered the virus used to make it. They added one protein from the anthrax bacteria to make the dual vaccine.

The new dual vaccine was tested in rabbits and mice and the test results suggest the combined vaccine would protect better than either of the older ones alone, they said.

Antibiotics made for anthrax can protect people from the deadly germ, but only if they are taken quickly. The spores can live in the body for weeks and months before becoming active.

“We believe our dual vaccine, Wyeth/IL-15/PA, which is effective against two of the most deadly pathogens, will help consolidate and simplify our national bioterror counter efforts by streamlining the manufacture, stockpiling, and swift deployment of such vaccines should the need arise,” Perera and colleagues concluded.

The vaccine requires federal approval before it could be developed for use in people. The researchers were not immediately available for comment on when they would apply for approval.

The researchers wrote about their work in the Proceedings of the National Academy of Sciences.

On the Net:

Scientists Collaborate On Quantum Physics Research Center

Scientists from Germany and Canada announced on Monday a partnership to establish a research center for the study of quantum physics.

The new facility, dubbed the Max Planck-UBC Centre for Quantum Materials, will reside at the University of British Columbia in Vancouver, and will be funded by Germany’s Max Planck Society, a prestigious research institution and home to 32 Nobel prizes.

The funding agreement also commits both institutions to conducting joint research projects in Canada and Germany, and to increasing scholarly exchanges, according to a press release.  

The announcement also marks the beginning of the Max Planck Society-UBC “Summer School” on Quantum Materials involving five lecturers and 10 graduate students and post-doctoral fellows from UBC and a similar number of participants from Germany.

“Today’s agreement represents a joining of great strengths within both the Max Planck Society and UBC and will provide the underpinning for future research in advanced materials science,” said UBC President Stephen Toope.

“The knowledge and discoveries generated from these collaborations will profoundly change the lives of present and future generations.”

The new research center will be the third such institute that the Max Planck Society funds outside of Germany, joining existing centers in India and Spain, with a fourth under construction in Florida.

“The partnership with Max Planck is a testament to the caliber of research conducted here, and our researchers enjoy reputations as some of the most internationally collaborative in the world,” said John Hepburn, UBC Vice President Research and International.

“Our interdisciplinary research strengths are further complemented by state-of-the-art facilities such as UBC’s Advanced Materials and Process Engineering Laboratory, our vicinity to TRIUMF, Canada’s National Laboratory for Particle and Nuclear Physics, and priority access to the Canadian Light Source Synchrotron.”

Teens More Likely Than Adults To Use Condoms

Only four out of ten older teenage boys have reported having sex over the course of the previous year, and those that do are far more likely to use condoms than sexually-active adults, according to a new study by researchers at Indiana University (IU).

The findings come from the National Survey of Sexual Health and Behavior (NSSHB), which was completed by experts at the university’s School of Health, Physical Education, and Recreation (HPER). According to an IU press release, “It is one of the most comprehensive studies on these topics in almost two decades” and “documents the sexual experiences and condom-use behaviors of 5,865 adolescents and adults ages 14 to 94.”

The researchers discovered that 40 percent of 17-year-old males said they had participated in vaginal intercourse within the past year, and 27 percent had done so within the past 90 days. Fourteen percent of 14-year-old boys reported engaging in some form of sexual interaction with a partner during the previous three months. The full results of the study have been published in the Journal of Sexual Medicine.

Furthermore, according to what IU researcher Dr. Dennis Fortenberry told Reuters reporter Julie Steenhuysen, “In this study, somewhere between 70 and 80 percent of adolescents reported condom use at their most recent vaginal intercourse… This indicates we’ve had a real public health success that we need to acknowledge.”

“One in four acts of vaginal intercourse involve condom use,” Steenhuysen added. “And among people who are single, that figure is one in three… Condom use is higher among black and Hispanic Americans than among whites, and is lowest among people over 40.”

The researchers also found that most adults continue to report having sex lives that are both active and pleasurable–not to mention varied, as they discovered that those who participated engaged in a total of 40 different combinations of sexual activity during their most recent romantic encounter.

In addition, approximately 85 percent of men reported that their partner had reached orgasm during their most recent sexual event, but only 64 percent of women reported doing so, a difference the researchers say is too large to be accounted for by homosexual encounters. Only 7 percent of adult women and 8 percent of adult men identified themselves as gay, lesbian, or bisexual.

“This survey is one of the most expansive nationally representative studies of sexual behavior and condom use ever conducted, given the 80-year span of ages,” Michael Reece, director of the Center for Sexual Health Promotion, said in a statement.

“These data about sexual behaviors and condom use in contemporary America are critically needed by medical and public health professionals who are on the front lines addressing issues such as HIV, sexually transmissible infections and unintended pregnancy,” he added.

On the Net:

Mood-lifting South African Plant To Be Researched

A plant that has been used for hundreds of years by indigenous South Africans for reducing stress, relieving hunger, sedating and elevating moods, has now been approved for study and market, and it could soon be for sale over-the-counter worldwide.

The plant, Sceletium tortuosum, has great potential and could boost the local economy, researchers studying the plant and its effects told the Associated Press (AP).

The American pharmaceutical company working on the project said, however, that it does not know if the plant has been approved by the US government, or how soon it could be available to customers.

South Africa’s environmental minister journeyed to the country’s southwest on Friday, where the plant is found, to celebrate the issuing of the license of the indigenous plant to South African HGH Pharmaceuticals.

HGH has yet to register the product, which will be marketed as a dietary supplement, as the company is still compiling data. “We’re positioning (the product) for everyday people who are having a stressful time in the office, feeling a bit of social anxiety, tension or in a low mood,” HGH’s director of research, Nigel Gericke, told AP.

The plant, known as Kanna, Channa or Kougoed by native South Africans, has been used by the San people for hundreds of years to cut down hunger, thirst and reduce fatigue. The plant is said to also have sedative, hypnotic and mood-elevating effects. It is usually chewed, but can also be made into tea or be smoked.

The plant has been extensively researched and has been found to have no ill effects or evidence of dependency, according to Ben-Erik Van Wyk, a professor of botany and plant biotechnology at the University of Johannesburg, who studied the plant.

Van Wyk, who worked with a researcher at HGH but is not involved in the project, told AP that he hopes the plant will draw attention to the wisdom of the ancient San people and their culture.

The plant gives off a minor head rush when chewed, similar to the effect of smoking a cigarette, Van Wyk said, adding it is a product with much potential. “Anyone who has chewed it and has experienced the sensation of the plant definitely knows there’s something happening.”

Traditional remedies are often shunned as old-fashioned, outdated, or mere old wives’ tales. “If this product becomes a huge success, the culture will become more respected and better known,” said Van Wyk.

Gericke first discovered the plant in 1985 while browsing through a botanical book in a library in Australia. When he returned to South Africa, he and a psychiatrist searched for the plant to research its doses and side effects.

HGH has an agreement with New Jersey-based PL Thomas & Company, which plans to launch the product in 2011, according to a spokeswoman.

However, it could be some time before US consumers get a chance to try a pill containing the plant’s extracts, which HGH hopes to market over-the-counter as Zembrin. The HGH spokeswoman said she was not aware whether the product has been approved by the US Food and Drug Administration.

On the Net:

Growth Of Biofuel Industry Hurt By GMO Regulations

Faster development of the promising field of cellulosic biofuels, the renewable energy produced from grasses and trees, is being significantly hampered by a “deep and thorny regulatory thicket” that makes the use of advanced gene modification methods almost impossible, researchers say.

In a new study published today in the journal BioScience, scientists argue that major regulatory reforms and possibly new laws are needed to allow cellulosic bioenergy to reach its true potential as a form of renewable energy, and in some cases help reduce greenhouse gas emissions that cause global warming.

“It’s extraordinary that gene modification technology, which has been adopted more rapidly than any other technology in the history of agriculture and has had some profound environmental and economic benefits, has been regulated virtually out of existence for perennial cellulosic biofuel crops,” said Steve Strauss, a distinguished professor of forest biotechnology at Oregon State University and lead author of the paper.

In the report, the authors noted that exotic plant species pose a serious risk of spread and ecosystem impacts, but face much less stringent regulation or obstacles than genetically engineered crops, which are carefully designed to solve problems, not cause them.

A genetically modified plant in which one or a few genes have been changed is treated as more of a risk than an invasive species that has thousands of new genes and, as a result, is often resistant to multiple pests and has novel adaptive traits such as drought and heat tolerance, the researchers said. Companies that have the technical expertise to conduct advanced research have been forced to stay away from gene modification methods, rather than adopt them to speed breeding progress and insert novel traits important to the growing biofuels industry.

Traits that could be improved with gene modification include enhanced stress tolerance, reduced costs of conversion to liquid fuels, reduced use of water and fertilizer in cultivation, avoiding dispersal into the environment, and synthesis of new, renewable products such as industrial enzymes.

But virtually none of that potential is now being developed, they said.

The current environment poses enormous legal risks that can and have cost some companies millions of dollars in civil lawsuits, the scientists said, sometimes for damages that were more a matter of perception and market issues than of safety or environmental impact.

“Even research on traits expressly intended to reduce environmental impacts faces the same legal risks and regulatory barriers as other traits,” Strauss said. “Our own federally funded research on means to promote ecological containment of gene-modified and exotic biofuel crops has been brought to a standstill by regulations.”

The scientists said that the end result of a gene modification project (the trait produced, and whether it is safe and beneficial or not) should be the primary consideration for regulation, not the process used to produce it. Low-level risk and high benefit projects should be identified and allowed to move forward with much less stringent regulation or none at all. They also made several other suggestions for reform to make the overall system less slow, costly and uncertain.

“It is essential that we create an intelligent regulatory system that does not indiscriminately penalize the gene modification process and obstruct essential field research,” Strauss said. “The one-size-fits-all style system of today treats the process of genetic modification as inherently dangerous, although many high-level science panels have concluded that the process is at least as safe as conventional breeding methods.”

In some cases, the stringent regulations make it virtually impossible to do the very research needed to adequately understand issues of value and safety, the researchers said.

“The regulations in place, forthcoming, and those that have been imposed by legal actions result in the presumption that all forms of gene modified trees and grasses are ‘plant pests’ or ‘noxious weeds’ until extensive experimentation and associated documentation ‘prove’ otherwise,” the scientists wrote in their report.

Solving these problems will require new ways of thinking and strong scientific and political leadership to create a regulatory system that enables, rather than arbitrarily blocks, the use of gene modification as a tool to accelerate and diversify the breeding of perennial biofuel crops, the researchers concluded.

On the Net:

How Does Salmonella Spread In Humans?

New findings by National Institutes of Health scientists could explain how Salmonella bacteria, a common cause of food poisoning, efficiently spread in people. In a study published this week in the Proceedings of the National Academy of Sciences, researchers describe finding a reservoir of rapidly replicating Salmonella inside epithelial cells. These bacteria are primed to infect other cells and are pushed from the epithelial layer by a new mechanism that frees the Salmonella to infect other cells or be shed into the intestine.

The Centers for Disease Control and Prevention estimate that Salmonella infections sicken 40,000 people each year in the United States, though the actual number of infections is likely much higher because many cases are mild and not diagnosed or reported. Currently, Salmonella is the focus of an ongoing U.S. public health investigation into contaminated chicken eggs.

“Unfortunately, far too many people have experienced the debilitating effects of Salmonella, which cause disease via largely unexplained processes, including overactive inflammatory responses,” says Anthony S. Fauci, M.D., director of NIH’s National Institute of Allergy and Infectious Diseases (NIAID). “This elegant study provides new insight into the origins of that inflammatory disease process.”

While much is known about the human infectious cycle of Salmonella, scientists have yet to understand how the bacteria escape the gut to spread infection. Epithelial cells line the outer and inner surfaces of the body, such as the skin and gut, and form a continuous protective tissue against infection. But Salmonella have learned how to live inside epithelial cells and use them for their benefit. Salmonella protect themselves within special membrane-bound compartments, called vacuoles, inside gut epithelial cells.

Using special high-resolution microscopes to view laboratory-grown human intestinal epithelial cells and laboratory mice infected with Salmonella, an NIAID research group led by Olivia Steele-Mortimer, Ph.D., in collaboration with Bruce Vallance, Ph.D., of the University of British Columbia in Vancouver, discovered a secondary population of Salmonella not confined within a vacuole, but instead moving freely inside the epithelial cells. This reservoir of Salmonella is distinct from vacuolar Salmonella. The bacteria multiply much faster; they have long tail-like projections, called flagella, used to move; and they exhibit a needle complex they use to pierce cells and inject their proteins. With these attributes, this population of Salmonella is genetically programmed to invade new cells.

The scientists observed that epithelial cells containing the hyper-replicating, invasive Salmonella are eventually pushed out of the intestinal tissue into the gut cavity, setting the Salmonella free. The mechanism used to push these Salmonella-infected cells into the body cavity resembles the natural mechanism humans use to shed dying or dead epithelial cells from their gut. The scientists believe that Salmonella have hijacked this mechanism to facilitate their own escape.

The human immune system, however, also senses that these are not normal, dying cells in the gut and triggers a response that includes release of interleukin-18, a small protein that sets off an inflammation cascade. Interleukin-18 also is prominent in chronic intestinal inflammation associated with autoimmune disorders, such as inflammatory bowel disease. The effects of interleukin-18 release provide an explanation for the acute intestinal inflammation associated with Salmonella infections.

The scientists hope their research leads to a treatment that prevents the spread of infection. They are focusing on how this specialized population of Salmonella escapes from its membrane-bound compartment to multiply and swim freely in the cell.

Reference: L Knodler et al. Dissemination of invasive Salmonella via bacterial-induced extrusion of mucosal epithelia. Proceedings of the National Academy of Sciences DOI: 10.1073/pnas.1006098107 (2010).

On the Net:

Giant Penguin Fossil Unearthed In Peru

Paleontologists have unearthed the fossilized remains of a giant penguin in Peru that lived 36 million years ago.  The discovery is the first extinct penguin ever found with preserved evidence of scales and feathers, the researchers said.

The new species, dubbed Inkayacu paracasensis, or Water King, had feathers that were reddish brown and grey, distinct from the black tuxedoed look of modern day penguins. 

At nearly five feet tall, the Inkayacu was one of the largest penguins ever to have lived, and was twice the size of an Emperor penguin, the largest penguin living today.

“Before this fossil, we had no evidence about the feathers, colors and flipper shapes of ancient penguins. We had questions and this was our first chance to start answering them,” said Dr. Julia Clarke, paleontologist at The University of Texas and lead author of a report about the discovery.

The fossil indicates that the flipper and feather shapes that make penguins such strong swimmers evolved early, while the color patterning of living penguins is likely a more recent development.

Similar to living penguins and unlike any other bird, Inkayacu’s wing feathers were radically modified in shape, densely packed and stacked on top of each other, forming stiff, narrow flippers.   Its body feathers had broad shafts that streamline the penguin’s shape, something also seen in modern day penguins.

Bird feathers derive some of their color from the size, shape and arrangement of nanoscale structures known as melanosomes.  The researchers compared melanosomes recovered from the fossil to their extensive library of those from living birds to recreate the colors of the fossil penguin’s feathers.

They found that the melanosomes in Inkayacu were similar to those in birds other than living penguins, allowing the researchers to deduce the colors they produced.
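
For intuition only, the following is a minimal Python sketch of that comparison step, assuming a hypothetical reference library of melanosome length and width measurements from living birds with known feather colors; a fossil measurement is simply matched to its nearest neighbor. The numbers and color labels are invented for illustration and are not data from the study.

import math

# Hypothetical reference library: (length_um, width_um) -> known feather color.
# Values are illustrative inventions, not measurements from the study.
reference_library = [
    ((1.2, 0.30), "reddish-brown"),
    ((1.0, 0.25), "grey"),
    ((1.6, 0.45), "black"),
]

def predict_color(fossil_measurement):
    """Return the color of the closest reference melanosome (Euclidean distance)."""
    def distance(entry):
        (length, width), _ = entry
        return math.hypot(length - fossil_measurement[0], width - fossil_measurement[1])
    _, color = min(reference_library, key=distance)
    return color

# Example: a hypothetical fossil melanosome 1.15 micrometers long and 0.28 wide.
print(predict_color((1.15, 0.28)))  # -> "reddish-brown"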

When the researchers looked at living penguins, they were surprised to find their colors were created by giant melanosomes, broader than in the newly discovered fossil and in all other birds surveyed. The melanosomes were also packed into groups that resembled clusters of grapes. 

These differences led the researchers to ask why modern penguins apparently evolved their own special way of making black-brown feathers. The distinctive shape, size and arrangement of melanosomes in living penguins would alter the feather microstructure at the nano and micro scale, and melanin, the pigment contained within melanosomes, is known to protect feathers from wear and fracturing.

The researchers speculated that these shifts might have had more to do with hydrodynamic demands of an aquatic lifestyle than with coloration, and that penguin colors may have shifted for entirely different reasons related to the later origin of primary predators or other changes in the late Cenozoic seas.

“Insights into the color of extinct organisms can reveal clues to their ecology and behavior,” said Jakob Vinther at Yale University, a coauthor of the report.

“But most of all, I think it is simply just cool to get a look at the color of a remarkable extinct organism, such as a giant fossil penguin,” said Vinther, who first noted fossil preservation of melanosomes in bird feathers.

Inkayacu paracasensis was discovered by Peruvian student Ali Altamirano in Reserva Nacional de Paracas, Peru.

The latest findings augment previous work by Clarke and her colleagues in Peru that challenges the conventional wisdom of early penguin evolution. The Inkayacu and other discoveries indicate that there was a rich diversity of giant penguin species in low-latitude Peru during the late Eocene epoch, about 36 to 41 million years ago.

“This is an extraordinary site to preserve evidence of structures like scales and feathers,” Clarke said.

“So there’s incredible potential for new discoveries that can change our view of not only penguin evolution, but of other marine vertebrates.”

The findings appear in the Sept. 30 edition of the journal Science.

Image 1: Artist reconstruction of Inkayacu paracasensis. Credit: Katie Browne

Image 2: Fossil wing feathers of Inkayacu paracasensis. Credit: University of Texas at Austin

Image 3: Julia Clarke exposing wing features in the Inkayacu specimen. Credit: N. Adam Smith

On the Net:

Researchers At The University Of Granada Associate Trigger Points With Shoulder Injury

Researchers at the University of Granada, in collaboration with the Centre for Sensory-Motor Interaction of the University of Aalborg, Denmark, and the University Rey Juan Carlos, Madrid, conducted a study on chronic impingement syndrome. The study revealed that excessive activation of specific neck and shoulder muscles during daily life or while playing sports, such as swimming, is the cause of a high number of shoulder injuries.

A Common Pain

The pattern of pain originating in these muscles, sometimes in regions far from the shoulder, coincides with most of the symptoms reported by patients attending health care centers for this type of problem. Twenty-five out of every 1,000 visits to the family doctor are related to shoulder pain, and the problem has several possible causes. Chronic impingement syndrome is considered the main cause of shoulder pain and disability.

The study, conducted by Amparo Hidalgo-Lozano, a Ph.D. candidate in the Department of Physiotherapy of the University of Granada, under the supervision of professor Manuel Arroyo Morales, opens the door to future tests of the efficacy of physiotherapy as a non-invasive treatment for shoulder injury. Shoulder injury accounts for 13% of sick leave and carries a cost of 7 billion dollars in the USA.

This research was published in the latest issue of the prestigious journal Experimental Brain Research.

On the Net:

Study Finds Women With Triple Negative Breast Cancer And BRCA Mutations Have Lower Risk Of Recurrence

Patients with triple negative breast cancer who also have mutations in the BRCA gene appear to have a lower risk of recurrence, compared to those with the same disease but without the deleterious genetic mutation, according to researchers at The University of Texas MD Anderson Cancer Center.

The findings may offer a direction for the study of personalized therapy in this select group of triple negative breast cancer patients, as well as highlight the need for genetic testing in this patient population. Ana M. Gonzalez-Angulo, M.D., associate professor in MD Anderson’s Departments of Breast Medical Oncology and Systems Biology, presented the findings in advance of the 2010 Breast Cancer Symposium.

“There is data on the number of breast cancer patients with BRCA mutations, as well as those that have triple negative disease. However, there is no understanding of the incidence of BRCA1 and 2 mutations in unselected patients with triple negative breast cancer,” said Gonzalez-Angulo, the study’s first and corresponding author. “Now, there are new drugs that appear to be more effective in treating triple negative breast cancer and BRCA status may be an important way of selecting patients that may respond to these therapies.”

Triple negative disease – breast cancer that is estrogen, progesterone and HER2-neu receptor negative – accounts for about 15 percent of all breast cancers. Currently, it’s an area of much research focus in the breast cancer community because it lacks effective targets for anti-cancer therapies; chemotherapy is effective in only about 40 percent of patients; and in those who do relapse, the disease is highly resistant and patients die quickly.

PARP inhibitors, a class of drugs of growing interest in cancer research, have shown promise in both patients with BRCA and triple negative disease. PARPs appear to be more effective in patients with BRCA mutations, as both PARP enzymes and proteins produced by the BRCA genes are involved in the repair of DNA. Therefore, the MD Anderson finding may provide an early idea of how to select those triple negative breast cancer patients that may respond best to therapy.

For this study, part of Gonzalez-Angulo’s ongoing laboratory project, Molecular Characterization of Triple Negative Breast Cancer, the researchers sent both tumor and normal tissue from 77 women with triple negative disease to Myriad Genetics Inc. to identify germline (inherited) and somatic (tumor-only) BRCA mutations. Of those 77 patients, 15 (19.5 percent) were found to have mutations (14 germline, one somatic): 12 (15.6 percent) with BRCA1 and three (3.9 percent) with BRCA2.

The triple negative breast cancer patients were treated at MD Anderson between 1987 and 2006, and all but one received the same adjuvant chemotherapy regimen. The median follow-up was 43 months. Five-year relapse-free survival and five-year overall survival among patients with either BRCA mutation were 86.2 percent and 73.3 percent, respectively, compared to 51.7 percent and 52.8 percent, respectively, in patients lacking mutations.

The researchers were surprised by the findings; however, Gonzalez-Angulo notes that prior studies were case-controlled, looking at BRCA mutation carriers with all types of breast cancer. The MD Anderson study is the first to look exclusively at women with triple negative breast cancer, an unselected population.

Also surprising, the incidence of BRCA mutations in the triple negative breast cancer population was higher than expected, said Gonzalez-Angulo.

“It was interesting to find that a good portion of these women were not sent to genetic counseling – some didn’t meet the criteria to be sent for testing, however they still had BRCA mutations,” said Gonzalez-Angulo. “Perhaps we need to lower our threshold for patients with triple negative breast cancer for genetic counseling and to assess for mutation status – especially those under age 50 – despite not having the significant family history as others.”

As a follow-up, Gonzalez-Angulo plans to continue her ongoing laboratory research on signaling pathways, RNA, DNA, and other mutations of the disease.

On the Net:

Hepatitis C Virus Faces New Weapon From Florida State Scientists

In recent human trials for a promising new class of drug designed to target the hepatitis C virus (HCV) without shutting down the immune system, some of the HCV strains being treated exhibited signs of drug resistance.

In response, an interdisciplinary team of Florida State University biologists, chemists and biomedical researchers devised a novel genetic screening method that can identify the drug-resistant HCV strains and the molecular-level mechanisms that make them that way, helping drug developers to tailor specific therapies to circumvent them.

The potentially life-saving technology also works when screening other viruses with drug-resistance issues, notably human immunodeficiency virus (HIV) and influenza.

More than 170 million people worldwide are infected with HCV, which leads to both acute and chronic liver diseases.

“In collaboration with pharmaceutical firm Gilead Sciences and researchers from the University of Heidelberg (Germany), what our research team discovered was how the latest drug for HCV works and what changes in the virus make it resistant to this unique therapy,” said Hengli Tang, a Florida State University molecular biologist.

“This is knowledge that is essential to drug developers focused on HCV,” said Tang, “but equally important is that our method, which we call ‘CoFIM’ (Cofactor-independent mutant) screening, can also be applied to other drug targets and other viruses.

“And, since we now understand how this latest class of drug works and what causes resistance to it, we can better select other classes of drugs with distinct mechanisms (in other words, those that target other parts of the virus) in order to craft a combination therapy, which is the future of HCV therapy and the key to overcoming drug resistance.”

The groundbreaking research is described in a paper published online in the September 2010 issue of the journal PLoS Pathogens.

Florida State biology doctoral student Feng Yang led the research team. The award-winning scholar earned her Ph.D. in August 2010 and is now a postdoctoral associate at Yale University. Yang designed the CoFIM screening methodology with fellow FSU graduate students, postdoctoral associates and distinguished faculty colleagues, including Associate Professor Tang; chemistry/biochemistry Professor Timothy M. Logan, director of FSU’s Institute of Molecular Biophysics; and Research Assistant Professor Ewa A. Bienkiewicz of the FSU College of Medicine, where she directs the Biomedical Proteomics Laboratory.

Driving the team’s development of CoFIM screening was the need to identify key “cellular cofactors” and their mechanisms of action, a fundamental aspect of virus-host interaction research.

“‘Cellular cofactors’ are proteins that normally exist in host cells but have been hijacked by viruses to facilitate viral replication,” Tang said. “They became accomplices to the invading viruses.

“Our research team was the first to show that ‘cyclophilin A’ (CyPA) is an essential cellular cofactor for hepatitis C virus infection and the direct target of a new class of clinical anti-HCV compounds, which include cyclosporine A (CsA)-based drugs that are devoid of immunosuppressive function,” Tang said.

“In addition, we went a step further than other research teams by employing our newly developed CoFIM screening method, which we used to demonstrate not only HCV’s dependence on cellular cofactor cyclophilin A and susceptibility to cyclosporine A drugs but also to uncover the molecular-level regulators that determine those two traits in the virus.”

Those molecular-level regulators were identified with the help of “small interfering RNA libraries,” collections of molecules so named for their size and ability to suppress gene expression. The molecules act to individually suppress every gene in the cell, resulting in different consequences depending upon which gene is suppressed by a given member of the library.

The CoFIM screening method involves inducing or “coaxing” the HCV virus to mutate by itself, in vitro, absent the replication assistance it normally receives from a particular cellular cofactor. Then, CoFIM tracks the changes in the virus’s response both to CsA-based drugs and any other drug designed to inhibit the cofactor.

Funding for the research conducted at Florida State University came in largest part from a $1.4 million grant awarded by the National Institutes of Health (NIH). And, because chronic liver disease caused by HCV can lead to liver cancer, a grant from the American Cancer Society provided additional support.

On the Net:

Study Finds Gene Variants Governing Human Height

Researchers from more than 200 institutions in the U.S. and Europe have identified hundreds of genetic variants that in total account for roughly 10 percent of the inherited variation in human height.

The consortium of researchers, named GIANT for Genetic Investigation of ANthropometric Traits, combined data from more than 180,000 individuals, including millions of genetic results from each of 46 separate studies in the U.S., Canada, Europe and Australia. Using this data, they were able to identify hundreds of genetic variants associated with height located in at least 180 different spots in the genome.

The study revealed that these variants consistently cluster around genes from at least six different biological pathways, many of which are located near genes already known to be involved in skeletal growth syndromes.  However, others implicate previously unrecognized genetic growth regulators, suggesting new areas for biological research on human height.

“Height clearly has a lot to do with genetics: shorter parents tend to have shorter children, and taller parents tend to have taller children,” said Joel Hirschhorn of Children’s Hospital Boston, the Broad Institute and Harvard Medical School and a co-senior author of a report about the study.

“This paper is the biggest step forward to date in understanding which of the genetic variants that differ between people account for our differences in height,” he said.

To search for genes affecting height, Hirschhorn and colleagues used something called genome-wide association (GWA) studies.  Such studies sample millions of sites of genetic variation in large groups of people, and then analyze the data to search for consistent differences associated with any of the variants in the genome.

When studying large enough populations, these variants can point to genes that contribute to traits such as height, which vary widely among people for many different genetic reasons.
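
As a very rough illustration of the idea behind a single such test, and not the GIANT consortium's actual analysis pipeline, the short Python sketch below regresses height on genotype dosage (0, 1 or 2 copies of an allele) at one simulated variant and reports the per-allele effect and p-value. All data here are simulated.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulate one common variant (allele frequency 0.4) in 10,000 people.
n = 10_000
genotype = rng.binomial(2, 0.4, size=n)  # 0, 1 or 2 copies of the allele

# Simulate height: a small per-allele effect plus lots of unrelated variation.
height_cm = 170 + 0.2 * genotype + rng.normal(0, 7, size=n)

# The per-variant association test: simple linear regression of height on dosage.
result = stats.linregress(genotype, height_cm)
print(f"effect per allele: {result.slope:.3f} cm, p-value: {result.pvalue:.2e}")

In a real genome-wide association study, this kind of test is repeated at millions of variants, with corrections for multiple testing and for confounders such as ancestry.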

“A good bit of the genetic differences in height is going to be explained by common variants that individually have very small effects,” said Hirschhorn.

“We all carry many different variants that each make us slightly taller or shorter.”

The study found that at least 19 locations in the genome had multiple variants independently associated with height, suggesting that the nearby genes are significant in regulating childhood growth and may account for a considerable fraction of existing height-related variation.

The GIANT GWA study of height, which discovered more genetic variants influencing a trait than any prior genetic study, demonstrates the value of GWA studies with large populations.

“With enough statistical power, you can identify lots of loci, and clearly relevant biological pathways emerge that were not evident in smaller studies,” said Hirschhorn.

The study also has implications in the role of rare versus common genetic variants. Many scientists believe that genetic variants that alter gene function and influence diseases or traits are rare (occurring in 1 percent or less of the population), since natural selection would tend to eliminate them from the general population.

“What we show is that for at least some cases, it’s the common variant that has the effect,” says Hirschhorn.

“More importantly, they cluster in consistent areas near or within particular genes that highlight biological pathways; they’re not randomly strewn across the genome.”

Despite the large number of genetic regions identified by the study, the fraction of variation in height explained by these loci is only about 10 percent.

“Genome-wide association studies are very powerful tools, but even so, we are still some way short of understanding the full details of how differences in our genomes influence common human traits such as height,” said Timothy Frayling, Ph.D., of the University of Exeter (UK), co-senior author of the report.

“Complex traits such as height are proving even more complex than we had first thought. We will need even more powerful tools and different approaches if we are to understand fully the differences between individuals,” he said.

Height is a classic model for genetic research on complex traits because it is easily measured and varies greatly from person to person. And unlike a simple Mendelian trait, where a single inherited gene determines whether a pea is green or yellow, height has many gradations and is the combined product of multiple genes.

“If we can understand the genetics of height, it will help us understand how other polygenic traits are inherited,” said Hirschhorn.

The GIANT study was published in the September 29 advance online edition of the journal Nature.

On the Net:

Animal Study: Blueberries Help Fight Artery Hardening

Blueberries may help fight atherosclerosis, also known as hardening of the arteries, according to results of a preliminary U.S. Department of Agriculture (USDA)-funded study with laboratory mice. The research provides the first direct evidence that blueberries can help prevent harmful plaques or lesions, symptomatic of atherosclerosis, from increasing in size in arteries.

Principal investigator Xianli Wu, based in Little Rock, Ark., with the USDA Agricultural Research Service (ARS) Arkansas Children’s Nutrition Center and with the University of Arkansas for Medical Sciences, led the investigation. The findings are reported in the current issue of the Journal of Nutrition.

Atherosclerosis is the leading cause of two forms of cardiovascular disease–heart attacks and strokes. Cardiovascular disease is the number one killer of Americans.

The study compared the size, or area, of atherosclerotic lesions in 30 young laboratory mice. Half of the animals were fed diets spiked with freeze-dried blueberry powder for 20 weeks; the diet of the other mice did not contain the berry powder.

Lesion size, measured at two sites on the aorta (the main artery leading from the heart), was 39 and 58 percent smaller in the blueberry-fed mice than in mice whose diet did not contain blueberry powder.

Earlier studies, conducted elsewhere, have suggested that eating blueberries may help combat cardiovascular disease. But direct evidence of that effect has never been presented previously, according to Wu.

The blueberry-spiked diet contained 1 percent blueberry powder, the equivalent of about a half-cup of fresh blueberries.

All mice in the investigation were deficient in apolipoprotein-E, a trait which makes them highly susceptible to forming atherosclerotic lesions and thus an excellent model for biomedical and nutrition research.

Wu’s group wants to determine the mechanism or mechanisms by which blueberries helped control lesion size. For example, by boosting the activity of four antioxidant enzymes, blueberries may have reduced the oxidative stress that is a known risk factor for atherosclerosis.

In followup studies, Wu’s group wants to determine whether eating blueberries in infancy, childhood and young adulthood would help protect against onset and progression of atherosclerosis in later years. Early prevention may be especially important in light of the nation’s epidemic of childhood obesity. Overweight and obesity increase atherosclerosis risk.

On the Net:

Injunction Prohibiting Stem Cell Funding Lifted

Federal funding of embryonic stem cell research will be allowed to continue for the time being, the U.S. Court of Appeals in the Washington, D.C. Circuit ruled on Tuesday.

The decision lifts an injunction put into place by Chief Judge Royce Lamberth on August 23.

According to Reuters, the three-judge panel ruled in favor of those supporting the National Institutes of Health (NIH) guidelines, stating that they had “satisfied the standards required for a stay pending appeal.” That “unusually quick” decision, in the words of AP writer Nedra Pickler, came just one day after the judges heard arguments from both sides on Monday.

Lamberth had previously ruled that the Obama administration’s policy regarding the funding violated the Dickey-Wicker amendment, a federal law barring the use of federal tax funds to sponsor research that would cause human embryos to be destroyed.

“Congress remains perfectly free to amend or revise the statute,” Lamberth had written in a September 9 order upholding the injunction. “This court is not free to do so.”

During Monday’s testimony, Deputy Assistant Attorney General Beth Brinkmann told the appeals court judges that the injunction would stop funding to 24 research projects at the NIH–projects which, she said, had already received more than $60 million in taxpayer funds.

When challenged on her claims by Judge Thomas Griffith, who asked whether the work would be irreparably harmed by the injunction, she responded that it would “be a setback” and that biological materials would have to be destroyed as a result.

Joining Griffith on the panel were judges Judith Rodgers and Brett Kavanaugh. Rodgers was appointed by former President Bill Clinton, while both Griffith and Kavanaugh were appointees of former President George W. Bush.

Federal funding of embryonic stem cell research was banned by the Bush administration in 2001. President Barack Obama overturned that ban in March 2009; according to CNN, he said at the time that he was convinced that the government had been “forced” into “what I believe is a false choice between sound science and moral values.”

“President Obama made expansion of stem cell research and the pursuit of groundbreaking treatments and cures a top priority when he took office,” White House spokesman Robert Gibbs said in a statement following Tuesday’s ruling. “We’re heartened that the court will allow NIH and their grantees to continue moving forward while the appeal is resolved.”

However, as Jeremy Pelofsky of Reuters points out in a Wednesday morning article, “Even with funding allowed to continue, possibly only temporarily, the White House could turn to Congress in hopes lawmakers will rewrite the law to be clearer on the issue… NIH could also try to rewrite its guidelines to conform with the law, or the White House could appeal to the Supreme Court if the appeals court rules against it on the merits of the case.”

On the Net:

Potential Climate Change Side Effect – More Parasites On South American Birds

Study finds higher temperatures, precipitation levels mean greater harm by parasites to developing chicks

A Wildlife Conservation Society (WCS) study on nesting birds in Argentina finds that increasing temperatures and rainfall, both side effects of climate change in some parts of the world, could be bad for birds of South America, but great for some of their parasites, which thrive in warmer and wetter conditions.

The study, which looked at nesting forest birds in Santa Fe, Argentina, found that increases in temperature and precipitation produce a bumper crop of parasitic fly larvae of the species Philornis torquans, parasites that burrow into the skin of baby birds to feed. The researchers also found that these greater parasite burdens result in higher probability of mortality and impaired growth for the parasitized chicks.

The study now appears in the online edition of the Journal of Zoology, published by the Zoological Society of London. The authors of the study are: Pablo Beldomenico of the Wildlife Conservation Society’s Global Health Program; and Leandro Antoniazzi, Darío Manzoli, David Rohrmann, María José Saravia and Leonardo Silvestri of Universidad Nacional de Litoral in Argentina.

“Although ours is a short-term study looking at within-year variability, we clearly show that higher temperature and precipitation result in greater parasitic fly loads. This is a striking example of the kind of negative effects on wildlife that can arise as a result of climate change,” said Dr. Pablo Beldomenico of the Wildlife Conservation Society’s Global Health Program. “The greater precipitation and warmer weather predicted for some areas of South America could have a significant impact on native birds because of a large increase in parasites like these.”

Carried out by field veterinarians and biologists between September and March of 2006-7 and 2007-8, the study focused on both the prevalence and abundance of parasitic larvae in the study area’s bird community and the impact of parasites on the growth and survival of bird nestlings.

The researchers also examined the influence of environmental factors on parasite prevalence and abundance, noting a positive correlation between variations in climatic variables (temperature and precipitation levels) and parasite loads on nestlings. They found that increases in temperature and rainfall resulted in more parasites.

During the course of the study, researchers examined the nests of 41 bird species (715 chicks) within a 30-hectare area (74 acres) of forest, gathering data on nest height, brood size, body mass of chicks, and the number of parasites on each bird. The fly larvae, large in relation to the size of the chick, were easily identified by the bulges on the heads, bodies and wings of the baby birds. The parasites were found on half (20) of the bird species studied, with the majority found on only four passerine species: the great kiskadee, the greater thornbird, the little thornbird, and the freckle-breasted thornbird. These species were monitored every three days for data on the impact of parasites on survival and growth.

Predictably, researchers found that the more larvae the baby birds carried, the higher the chance of mortality; chicks with 10 larvae were twice as likely to die as chicks without parasites. One chick had as many as 47 larvae on its body. The fly larvae also impacted the growth rates of the baby birds; in five days, chicks that hosted 10 larvae grew 1.85 fewer grams than chicks that were parasite-free.

“Understanding how environmental factors influence the health of wildlife populations, and how this is changing in response to climate change, will help inform strategies to mitigate its deleterious effects,” said Dr. Paul Calle, Director of Zoological Health for the Wildlife Conservation Society’s Global Health Program.

Ongoing studies funded by Morris Animal Foundation will shed new light on the ecology of Philornis and their impact on chicks in the realm of climate change.

Image Caption: Veterinarians from the Wildlife Conservation Society and other organizations conducted a study examining the role of environmental factors on the abundance of parasites on South American birds. This great kiskadee chick is covered with parasitic fly larvae, the result of increased temperatures and precipitation levels that possibly stem from climate change. Credit: Pablo Beldomenico

On the Net:

Carbonation Sparks Pain Circuits

Fizzy beverages light up same pain sensors as mustard and horseradish, a new study shows — so why do we drink them?

You may not think of the fizz in soda as spicy, but your body does.

The carbon dioxide in fizzy drinks sets off the same pain sensors in the nasal cavity as mustard and horseradish, though at a lower intensity, according to new research from the University of Southern California.

“Carbonation evokes two distinct sensations. It makes things sour and it also makes them burn. We have all felt that noxious tingling sensation when soda goes down your throat too fast,” said Emily Liman, senior author of a study published online in the Journal of Neuroscience.

That burning sensation comes from a system of nerves that respond to sensations of pain, skin pressure and temperature in the nose and mouth.

“What we did not know was which cells and which molecules within those cells are responsible for the painful sensation we experience when we drink a carbonated soda,” said Liman, an associate professor of neurobiology in the USC College of Letters, Arts and Sciences.

By flowing carbonated saline onto a dish of nerve cells from the sensory circuits in the nose and mouth, the researchers found that the gas activated only a particular type of cell.

“The cells that responded to CO2 were the same cells that detect mustard,” Liman said.

These cells express a gene known as TRPA1 and serve as general pain sensors.

Mice missing the TRPA1 gene showed “a greatly reduced response” to carbon dioxide, Liman said, while adding the TRPA1 genetic code to CO2-insensitive cells made them responsive to the gas.

Now that carbonated beverages have been linked to pain circuits, some may wonder why we consume them. A new park in Paris even features drinking fountains that dispense free sparkling water.

Liman cited studies going back as far as 1885 that found carbonation dramatically reduced the growth of bacteria.

“Or it may be a macho thing,” she speculated.

If only a sip of San Pellegrino were all it took to prove one’s hardiness.

The pain-sensing TRPA1 provides only one aspect of carbonation’s sensory experience. In 2009, a group led by Charles Zuker of the University of California, San Diego and Nicholas Ryba of the National Institutes of Health showed that carbonation trips cells in the tongue that convey sourness.

Liman’s collaborators were lead author Yuanyuan Wang and second author Rui Chang, both graduate students in neurobiology at USC.

The National Institutes of Health funded the research.

On the Net:

New Biomarkers Discovered For Pancreatic Cancer And Mesothelioma

Using a novel aptamer-based proteomics array technology, researchers and collaborators have identified biomarkers and protein signatures that are hallmarks of cancer at an early stage for two of the most aggressive and deadly forms of cancer: pancreatic and mesothelioma.

This technology would enable better clinical diagnosis at an earlier stage and may provide insight into new therapeutic targets, said Rachel Ostroff, Ph.D., clinical research director of Somalogic Inc.

“Currently these cancers are detected at an advanced stage, where the possibility of cure is minimal,” said Ostroff. “Detection of these aggressive cancers at an earlier stage would identify patients for early treatment, which may improve their survival and quality of life.”

Ostroff presented results of this ongoing study at the Fourth AACR International Conference on Molecular Diagnostics in Cancer Therapeutic Development.

Discovered about 20 years ago, aptamers are nucleic acid molecules that bind to specific proteins. SomaLogic has developed the next generation of aptamers, SOMAmers (Slow Off-rate Modified Aptamers), which have superior affinity and specificity. SOMAmers enable a highly multiplexed proteomic platform used for simultaneous identification and quantification of target proteins in complex biological samples.

The goal of this study was to determine if this proteomics technology could identify blood-based biomarkers for pancreatic cancer or mesothelioma in people diagnosed, but not yet treated, for cancer.

Participants in the control group had conditions whose symptoms resembled these cancers but were benign (e.g., pancreatitis or lung fibrosis).

Ostroff and colleagues tested blood from participants to discover the biomarkers specific to those with cancer, which would then be used to identify these diseases at an early stage, where the potential for effective treatment is much higher than in disease that has progressed.

For both forms of cancer, the researchers discovered biomarkers and developed a signature with high accuracy for detecting each disease. Equally important, they found high specificity, meaning few people without disease would be incorrectly diagnosed, sparing them unnecessary tests or treatments.
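
To make the sensitivity/specificity trade-off concrete, here is a small, purely illustrative Python calculation with made-up counts; these are not numbers from the SomaLogic study.

# Hypothetical confusion-matrix counts for a diagnostic signature (illustrative only).
true_positives = 90    # cancer cases correctly flagged by the signature
false_negatives = 10   # cancer cases the signature missed
true_negatives = 95    # benign controls correctly cleared
false_positives = 5    # benign controls incorrectly flagged

sensitivity = true_positives / (true_positives + false_negatives)   # 0.90
specificity = true_negatives / (true_negatives + false_positives)   # 0.95

print(f"sensitivity: {sensitivity:.2f}, specificity: {specificity:.2f}")

In this toy example, the high specificity is what would keep people with benign conditions from being sent for unnecessary follow-up testing.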

“Validation studies are underway, which we hope will lead to the development of diagnostic tests that hold clinical benefits for patients,” Ostroff said.

Pancreatic cancer is the fourth leading cause of cancer-related death in the United States. Mesothelioma is an asbestos-related pulmonary cancer that causes an estimated 15,000 to 20,000 deaths per year worldwide.

On the Net:

China Makes Final Launch Preparations For Lunar Probe

State media reported Tuesday that China is making final preparations to launch its second lunar probe, possibly as soon as Friday, when the country marks 61 years of communist rule.

The official China Daily reported that a launch rocket carrying the Chang’e-2 has been set up in the southwestern province of Sichuan.

It said that chief program engineers have arrived at the satellite launch center in the city of Xichang to carry out final tests.  The launch could take place on October 1, assuming the staff finds no complications.

State media reported that the lunar probe will conduct various tests in preparation for the expected launch in 2013 of the Chang’e-3, which aims to achieve China’s first unmanned landing on the moon.

The Chang’e program is seen as an effort to put China’s space exploration program on par with those of the U.S. and Russia.

China launched Chang’e-1 in October 2007.  That lunar probe orbited the moon and took high-resolution pictures of the lunar surface.

According to state media, the country hopes to bring a moon rock sample back to earth in 2017, with a manned mission foreseen in around 2020.

China became the world’s third nation to put a man in space independently when Yang Liwei piloted the one-man Shenzhou-5 space mission in 2003.

Youth Still Overexposed To Smoking And Drinking Ads

A group of doctors said that despite severe restrictions on tobacco advertising, youths are still overexposed to media depicting smoking and drinking in a favorable light.

“We are 65,000 pediatricians who are vitally concerned with the health of children,” Dr. Victor Strasburger of the American Academy of Pediatrics told Reuters Health.

“With nearly half of kids at least trying smoking, and with more than 400,000 Americans dying every year from tobacco, the academy feels it is really time to ban all tobacco advertising.”

The academy published the policy statement in its journal Pediatrics on Monday.  The statement also recommended limiting alcohol advertising and exposure of children to PG-13 and R-rated movies.

The study found that over $25 billion is spent every year on advertising tobacco, alcohol and prescription drugs.

“Parents have gotten caught up with all the hard drugs — cocaine, steroids — but they fail to realize that tobacco and alcohol are still by far the leading drugs among teenagers,” Strasburger, also of the University of New Mexico in Albuquerque, told Reuters Health.

The U.S. Centers for Disease Control and Prevention said that about 46 million Americans smoke.  Relative to the population, that number is down by over half since 1965, although millions still succumb to smoking-related illnesses every year.

One in five high school students smoked cigarettes in 2007. 

Strasburger said that banning tobacco ads and promotions in all media worked and had decreased smoking rates in other countries like the U.K. and Australia.

“At its most fundamental level, we agree with the academy,” Maura Payne, a spokeswoman for Reynolds American Inc., the second largest U.S. tobacco company, told Reuters. “Kids shouldn’t smoke.”

She said that the bulk of industry marketing spending was not advertising but price discounts.

She said that advertising in a traditional sense is limited to password-protected websites, in-store ads, adult smokers who have requested to be on a mailing list and magazines with a readership of at least 85 percent adults.

Payne added that these ads are not meant to get more people hooked.

“Because the number of adults who smoke is declining every year, the name of the game is brand switching,” she told Reuters Health.

However, even if advertising is restricted, exposure to smoking is still common in the media.  The study authors said that three-quarters of G-, PG-, and PG-13-rated movies contain smoking scenes.

Strasburger told Reuters Health that several studies suggest smoking on television and in movies is a key factor in getting teens to pick up the habit.

“Parents need to understand that kids are spending seven hours a day with media and that the media have become one of the leading drug educators today,” he said.

The academy recommends removing televisions from children’s bedrooms and limiting access to channels with “excessive substance use depictions” like MTV, HBO, Showtime and Comedy Central.

That might also help cut down on exposure to alcohol ads.  The authors said, “A sample of 9- to 10-year-olds could identify the Budweiser frogs nearly as frequently as they could Bugs Bunny.”

They also say advertisements for prescription medicines like the erectile dysfunction drug Viagra are far too common on television compared to ads for condoms, which many networks do not air.

“Children and teenagers get the message that there is a pill to cure all ills and a drug for every occasion, including sexual intercourse,” they write.

On the Net:

Virgin Galactic Space Service To Launch In 18 Months

Speaking before a business conference in Kuala Lumpur, entrepreneur Richard Branson confirmed that his Virgin Galactic company will be ready to begin commercial space flights within the next 18 months, French news agency AFP reported on Monday.

“We just finished building SpaceShipTwo. We are 18 months away from taking people into space,” the British billionaire told those attending the conference. According to the AFP, he also noted that the cost of the service would begin at $200,000.

Furthermore, Branson added that his company was looking to create destinations for their travelers, telling AFP that they were “looking” at creating hotels in space, possibly on the moon. He also told reporters that Virgin Galactic was considering launching “small satellites into space” to help schools, colleges, and universities.

In July, SpaceShipTwo (SS2), also known as the VSS Enterprise, completed its first crewed flight–a six-plus hour flight that started at Mojave Air and Space Port in California. SpaceShipTwo, which was built by aviation engineer Burt Rutan, was launched with the assistance of WhiteKnightTwo (WK2), also known as VMS Eve. The crew consisted of Mark Stucky, Peter Kalogiannis, and Brian Maisler on board the VMS Eve, and Peter Siebold and Michael Alsbury onboard the VMS Enterprise.

According to the AFP report, Virgin Galactic had already collected $45 million in deposits from more than 300 individuals reserving seats on board the six-person spacecraft. The news agency notes that the craft will ascend to a height of 50,000 feet before shedding the WK2 and continuing on into suborbital space. There, passengers onboard SS2 will be allowed to see the Earth from space and experience floating in zero gravity conditions.

Construction on SS2, which has been dubbed the world’s first manned commercial spacecraft by its creators, began in 2007. The vehicle was officially unveiled in December 2009.

“This is truly a momentous day,” Branson said in a statement at the time. “The team has created not only a world first but also a work of art. The unveil of SS2 takes the Virgin Galactic vision to the next level and continues to provide tangible evidence that this ambitious project is not only moving rapidly, but also making tremendous progress towards our goal of safe commercial operation.”

Image Caption: Sir Richard Branson with model of SpaceShipTwo. Credit: Virgin Galactic

On the Net:

Saudi Arabia Earthquakes Caused By Volcano

Geologists reported on Sunday that a swarm of small earthquakes that struck western Saudi Arabia last year was actually the rumbling of a volcano.

Over 30,000 minor quakes took place between April and June 2009 within an ancient solidified lava field called Harrat Lunayyir, which damaged some buildings near Al Ays and prompted the authorities to evacuate the 40,000 residents from the region.

Most of the quakes measured less than magnitude 2, but several were hefty, delivering jolts of up to magnitude 5.4.

Using satellite radar, geologists concluded that the shockwaves’ seismic signature and depth all point to a volcanic cause.

They found that the ground ruptured along five miles and ripped open about one and a half feet as a tentacle of magma probed forward just beneath the surface.

However, the experts say that the hazard is low, given the remoteness of the site and the type of eruption they expect.

Saudi Arabia’s geology is known for the oil-drenched sedimentary rocks of the east that are the source of its bounty in hydrocarbons.

Volcanic eruptions in Saudi Arabia are rare and only occur every few hundred years.

According to contemporary accounts, the best known volcanic event in the region occurred in 1256, which sent flows of lava “like a red-blue boiling river” for 52 days into the holy city of Medina.

The lead researcher John Pallister of the Volcano Disaster Assistance Program at the U.S. Geological Survey (USGS) told AFP that on a geological scale, volcanism in Saudi Arabia is contemporary.

“Several of the lava fields have ‘young-looking’ features (to a geologist) and even have deposits that overlie Neolithic [Stone Age] sites,” Pallister said in an email exchange with AFP.

Last year’s event occurred about 120 miles from the main area of geological spreading, which is happening beneath the Red Sea.

Pallister told AFP that despite this distance, the intrusion of magma points to an “increased chance” of eruptions “in the next several decades.”

However, he cautioned against fear.

“An eruption, at Lunayyir, if it were to occur, would pose little hazard due to the type of volcanism expected at the site and the remoteness of the vent areas,” he told AFP.

“There (is) a low probability of large damaging earthquakes related to this type of activity.”

“However, urban development is encroaching on other areas in Saudi Arabia where an eruption would be more serious.”

He praised the Saudi Geological Survey for its rapid response.

“They quickly recognized the hazard and deployed a first-class seismic monitoring network and advised their government and citizens of the status of the unrest and the potential hazards,” Pallister told AFP.

“The Saudi Survey now has a very good network for seismic monitoring of Lunayyir. Consequently, I expect that they should be able to make a robust forecast in the case of renewed unrest.”

The study is published in the journal Nature Geoscience.

Image Caption: Lava flows radiate down desert valleys away from the center of Harrat Lunayyir, a basaltic volcanic field in NW Saudi Arabia, east of the Red Sea port of Umm Lajj. Harrat Lunayyir contains about 50 volcanic cones that were constructed along a N-S axis. Harrat Lunayyir is one of the smallest of the Holocene lava fields of Saudi Arabia, but individual flow lobes extend up to about 30 km from the center of the Harrat. One of the cones may have erupted around the 10th century AD or earlier. NASA Space Shuttle image STS26-41-61, 1988

On the Net:

Water Buffalo, Goats Can Distort Stone Age Sites

Taking a new look at old digs: Trampling animals can alter muddy Paleolithic sites

Archaeologists who interpret Stone Age culture from discoveries of ancient tools and artifacts may need to reanalyze some of their conclusions.

That’s the finding suggested by a new study that for the first time looked at the impact of water buffalo and goats trampling artifacts into mud.

In seeking to understand how much artifacts can be disturbed, the new study documented how animal trampling in a water-saturated area can result in an alarming amount of disturbance, says archaeologist Metin I. Eren, a graduate student at Southern Methodist University and one of eight researchers on the study.

In a startling finding, the animals’ hooves pushed artifacts as much as 21 centimeters into the ground, a variation that could equate to a difference of thousands of years for a scientist interpreting a site, said Eren.

The findings suggest archaeologists should reanalyze some previous discoveries, he said.

“Given that during the Lower and most of the Middle Pleistocene, hominids stayed close to water sources, we cannot help but wonder how prevalent saturated substrate trampling might be, and how it has affected the context, and resulting interpretation, of Paleolithic sites throughout the Old World,” conclude the authors in a scientific paper detailing their experiment and its findings.

“Experimental Examination of Animal Trampling Effects on Artifact Movement in Dry and Water Saturated Substrates: A Test Case from South India” has been published online by the Journal of Archaeological Science. The research was recognized as best student poster at the 2010 annual meeting of the Society for American Archaeology.

Animal trampling not new; current study adds new variable

The idea that animal trampling may reorient artifacts is not new.

“Believe it or not, there have been dozens of trampling experiments in archaeology to see how artifacts may be affected by animals walking over them. These have involved human trampling and the trampling of all sorts of animals, including elephants, in dry sediments,” Eren said. “Our trampling experiments in dry sediments, for the most part, mimicked the results of previous experiments.”

But this latest study added a new variable to the mix: the trampling of artifacts embedded in ground saturated with water, Eren said.

Researchers from the United States, Britain, Australia and India were inspired to perform the unique experiment while doing archaeological survey work in the Jurreru River Valley in Southern India.

They noticed that peppering the valley floor were hardened hoof prints left from the previous monsoon season, as well as fresh prints along the stream banks. Seeing that the tracks sank quite deeply into the ground, the researchers began to suspect that stone artifacts scattered on the edges of water bodies could be displaced significantly from their original locations by animal trampling.

Early humans drawn to water’s edge

“Prehistoric humans often camped near water sources or in areas that receive lots of seasonal rain. When we saw those deep footprints left over from the previous monsoon season, it occurred to us that animal trampling in muddy, saturated sediments might distort artifacts in a different way than dry sediments,” Eren said. “Given the importance of artifact context in the interpretation of archaeological sites and age, it seems like an obvious thing to test for, but to our surprise it never had been.”

Eren and seven other researchers tested their theory by scattering replicated stone tools over both dry and saturated areas of the valley. They then had water buffalo and goats trample the “sites.” Once sufficient trampling occurred, the archaeologists proceeded to excavate the tools, taking careful measurements of where the tools were located and their inclination in the ground.

The researchers found that tools salted on ground saturated with water and trampled by buffalo moved up to 21 centimeters vertically, or a little more than 8 inches. Tools trampled by goats moved up to 16 centimeters vertically, or just over 6 inches.

“A vertical displacement of 21 centimeters in some cases might equal thousands of years when we try to figure out the age of an artifact,” Eren said. “This amount of disturbance is more than any previously documented experiment, and certainly more than we anticipated.”

A new “diagnostic marker” for interpreting sites

Unfortunately for archaeologists who study the Stone Age, artifacts left behind by prehistoric humans do not stay put, said Eren. Over thousands or even millions of years, all sorts of geological or other processes can move artifacts out of place, he said.

The movement distorts the cultural and behavioral information that is contained in the original artifact patterning, what archaeologists call “context.” Archaeologists must discern whether artifacts are in their original context, and thus provide reliable information, or if they’ve been disturbed in some way that biases the interpretation, Eren said.

Given that artifacts embedded in the ground at vertical angles appear to be a diagnostic marker of trampling disturbance, the researchers concluded that sites with water-saturated sediments should be identified and reanalyzed.

Other researchers on the study include Adam Durant, University of Cambridge; Christina Neudorf, University of Wollongong; Michael Haslam, University of Oxford; Ceri Shipton, Monash University; Janardhana Bora, Karnatak University; Ravi Korisettar, Karnatak University; and Michael Petraglia, University of Oxford. Korisettar and Petraglia are the principal investigators of the archaeological field research in Kurnool, India.

The research was funded by Leverhulme Trust, British Academy, National Science Foundation, Australian Research Council, National Science Foundation Graduate Research Fellowship Program, and Lockheed Martin Corporation.

Image Caption: An example of water-buffalo tracks along the banks of the Jurreru River. Credit: SMU Research

On the Net:

Stethoscope

The stethoscope is an acoustic medical device used for listening to the internal sounds of the body. It can be used to listen to the lungs, heart, and intestines, as well as to blood flow in the arteries. Combined with a sphygmomanometer, it can be used to measure blood pressure. Beyond listening to bodies, a stethoscope may also be used to listen to automobiles to diagnose damage, or to check scientific vacuum chambers for leaks. A phonendoscope is a stethoscope that intensifies sounds.

In 1816, Rene Laennec of France invented the stethoscope. It was composed of a monaural wooden tube and was similar to an ear trumpet; in fact, it was almost indistinguishable from that historical hearing aid in structure and form. George Cammann perfected the binaural stethoscope design that Arthur Leared invented in 1851. Cammann’s design was the standard until the 1940s, when Rappaport and Sprague designed a new stethoscope that used one side for the respiratory system and the other for the cardiovascular system. Hewlett-Packard manufactured the Rappaport-Sprague design.

In the 1960s, Dr. David Littmann created a lighter stethoscope with improved acoustics.

Richard Deslauriers patented the first external noise-reducing stethoscope, the DRG Puretone, in 1999. It reduced noise with two steel coils that dissipated outside noise as inaudible heat energy. However, the added steel made the instrument heavy enough to cause neck strain. Marc Werblud created a lightweight noise-canceling stethoscope that improved sound quality and reduced neck strain. Improvements since then have produced even lighter stethoscopes, as well as diaphragms that can be changed hygienically for each patient.

The stethoscope works by transmitting sound from the chest piece through air-filled hollow tubes to the listener’s ears. The chest piece carries a bell and a diaphragm: the bell transmits low-frequency sounds and the diaphragm transmits high-frequency sounds. The two-sided chest piece, invented by Rappaport and Sprague, was introduced in the 20th century. The main problem with acoustic stethoscopes was that the transmitted sound was extremely quiet; the stratified continuous lumen, invented in 1999, addressed this. Acoustic stethoscopes are still the most commonly used.

Electronic stethoscopes overcome the problem of low sound levels by converting acoustic sound waves into electrical signals that can be amplified. They can also offer wireless operation, recording, noise reduction, signal enhancement, and both visual and audio output.
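
As a purely hypothetical sketch of that signal path, and not any manufacturer’s implementation, the Python code below takes a digitized chest-sound waveform, applies a low-pass “bell-like” filter or a band-pass “diaphragm-like” filter, and amplifies the result; the sampling rate, cutoff frequencies and gain are illustrative choices.

import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 4000  # assumed sampling rate in Hz

def bell_mode(audio_in, cutoff_hz=100.0, gain=20.0):
    """Emphasize low-frequency sounds (e.g., heart tones), roughly like a bell chest piece."""
    sos = butter(4, cutoff_hz, btype="lowpass", fs=FS, output="sos")
    return gain * sosfiltfilt(sos, audio_in)

def diaphragm_mode(audio_in, band_hz=(100.0, 1000.0), gain=20.0):
    """Emphasize higher-frequency sounds (e.g., breath sounds), roughly like a diaphragm."""
    sos = butter(4, band_hz, btype="bandpass", fs=FS, output="sos")
    return gain * sosfiltfilt(sos, audio_in)

if __name__ == "__main__":
    # One second of fake chest audio: a 40 Hz "heart" tone plus a 400 Hz "breath" tone.
    t = np.arange(FS) / FS
    audio = np.sin(2 * np.pi * 40 * t) + 0.3 * np.sin(2 * np.pi * 400 * t)
    print(bell_mode(audio).shape, diaphragm_mode(audio).shape)

In such a design, filtering before amplification is what would let an electronic unit mimic the bell and diaphragm modes of an acoustic chest piece in software.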