redOrbit Staff & Wire Reports – Your Universe Online
Researchers working to decipher the role of genes since the completion of the Human Genome Project may see their efforts accelerated by a new approach that lets them permanently and selectively delete genes from a cell’s DNA.
The new system, known as CRISPR, was developed by researchers at MIT, the Broad Institute, and the Whitehead Institute, and should allow scientists to study the entire genome at once. In two new papers appearing in this week’s online edition of the journal Science, the researchers describe how it is possible to study all the genes in the genome by deleting a different gene in each of a large population of cells, then observing which cells proliferate under different conditions.
“With this work, it is now possible to conduct systematic genetic screens in mammalian cells. This will greatly aid efforts to understand the function of both protein-coding genes as well as noncoding genetic elements,” said David Sabatini of the Whitehead Institute, MIT professor of biology, and senior author of one of the Science papers.
Using this approach, the researchers were able to identify genes that allow melanoma cells to proliferate, as well as genes that confer resistance to certain chemotherapy drugs. Such studies could help scientists develop targeted cancer treatments by revealing the genes that cancer cells depend on to survive, the researchers said.
Feng Zhang, an assistant professor of biomedical engineering at MIT and senior author of the other Science paper, developed the CRISPR system by exploiting a naturally occurring bacterial protein that recognizes and snips viral DNA. This protein, known as Cas9, is recruited by short RNA molecules called guides, which bind to the DNA to be cut. The DNA-editing complex offers very precise control over which genes are disrupted, simply by changing the sequence of the RNA guide.
“One of the things we’ve realized is that you can easily reprogram these enzymes with a short nucleic-acid chain. This paper takes advantage of that and shows that you can scale that to large numbers and really sample across the whole genome,” said Zhang.
Zhang and colleagues created a library of about 65,000 guide RNA strands that target nearly every known gene. They then delivered genes for these guides, along with genes for the CRISPR machinery, to human cells. Each cell took up one of the guides, and the gene targeted by that guide was deleted. If the gene lost was necessary for survival, the cell died.
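The logic of such a pooled screen can be sketched computationally: guides whose sequencing read counts collapse between the initial and surviving cell populations point to genes the cells cannot live without. The guide names and counts below are hypothetical, purely for illustration of the idea.

```python
# Sketch of pooled CRISPR screen analysis: guides that deplete between
# the initial and final populations mark candidate essential genes.
# All guide names and read counts here are invented for illustration.
import math

# sequencing read counts per guide, before and after selection
before = {"guideA": 1000, "guideB": 980, "guideC": 1020}
after = {"guideA": 15, "guideB": 990, "guideC": 40}

def log2_fold_change(guide, pseudocount=1):
    """log2 ratio of after/before counts; strongly negative = depleted."""
    return math.log2((after[guide] + pseudocount) /
                     (before[guide] + pseudocount))

# rank guides from most depleted to least
depleted = sorted(before, key=log2_fold_change)
for g in depleted:
    print(g, round(log2_fold_change(g), 2))
```

Guides like the hypothetical guideA, whose counts fall by orders of magnitude, would flag their target gene as essential for survival; a guide with an unchanged count suggests a dispensable gene.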
“This is the first work that really introduces so many mutations in a controlled fashion, which really opens a lot of possibilities in functional genomics,” said Ophir Shalem, a Broad Institute postdoc and one of the lead authors of the Zhang paper, along with Broad Institute postdoc Neville Sanjana.
This approach enabled the researchers to identify genes essential to the survival of two populations of cells – cancer cells and pluripotent stem cells. The researchers also identified genes necessary for melanoma cells to survive treatment with the chemotherapy drug vemurafenib.
Separately, Sabatini and his research team targeted a smaller set of about 7,000 genes, but they designed more RNA guide sequences for each gene. The researchers expected that each sequence would block its target gene equally well, but found instead that cells with different guides for the same gene had varying survival rates.
“That suggested that there were intrinsic differences between guide RNA sequences that led to differences in their efficiency at cleaving the genomic DNA,” said Tim Wang, an MIT graduate student in biology and lead author of the paper.
From that data, the researchers deduced some rules that appear to govern the efficiency of the CRISPR-Cas9 system, and used those rules to create an algorithm that could predict the most successful sequences to target a given gene.
“These papers together demonstrate the extraordinary power and versatility of the CRISPR-Cas9 system as a tool for genomewide discovery of the mechanisms underlying mammalian biology,” said Eric Lander, director of the Broad Institute and an MIT professor of biology, a study co-leader with Sabatini.
“And we are just at the beginning: We’re still uncovering the capabilities of this system and its many applications.”
The researchers say that the CRISPR approach could offer a more efficient and reliable alternative to RNA interference (RNAi), currently the most widely used method for studying gene function. RNAi works by delivering short hairpin RNA strands (shRNA) that destroy messenger RNA (mRNA), which carries DNA's instructions to the rest of the cell.
The downside to RNAi is that it targets mRNA and not DNA, so it is impossible to get 100 percent elimination of the gene.
“CRISPR can completely deplete a given protein in a cell, whereas shRNA will reduce the levels but it will never reach complete depletion,” Zhang said.
In future studies, the researchers plan to conduct genome-wide screens of cells that have become cancerous through the loss of tumor suppressor genes such as BRCA1.
If scientists can discover which genes are necessary for those cells to thrive, they may be able to develop drugs that are highly cancer-specific, Wang said.
The strategy could also be helpful in finding treatments that counterattack tumor cells that have developed resistance to existing chemotherapy drugs, by identifying genes that those cells rely on for survival.
The researchers also hope to use the CRISPR system to study the function of the vast majority of the genome that does not code for proteins.
“Only 2 percent of the genome is coding. That’s what these two studies have focused on, that 2 percent, but really there’s that other 98 percent which for a long time has been like dark matter,” Sanjana said.
Researchers Shocked To Discover Hidden DNA Code
redOrbit Staff & Wire Reports – Your Universe Online
A newly discovered ‘second code’ hiding within our DNA is casting new light on how changes to DNA impact health and disease, according to a study published Friday in the journal Science. The new code is changing the way scientists read and interpret genetic instructions and mutations.
Since the genetic code was deciphered in the 1960s, scientists have assumed it was used exclusively to write information about proteins. In the current study, researchers at the University of Washington were shocked to discover that genomes actually use the genetic code to write two separate languages – one describing how proteins are made, and the other instructing the cell on how genes are controlled.
One language is written on top of the other, which is why the second language remained hidden for so long, the researchers said.
“For over 40 years we have assumed that DNA changes affecting the genetic code solely impact how proteins are made,” said study leader Dr. John Stamatoyannopoulos, associate professor of genome sciences and of medicine at UW.
“Now we know that this basic assumption about reading the human genome missed half of the picture. These new findings highlight that DNA is an incredibly powerful information storage device, which nature has fully exploited in unexpected ways.”
The genetic code uses a 64-letter alphabet known as codons. The UW team discovered that some codons, which they called duons, can have two meanings – one related to the protein sequence, and one related to gene control. These two meanings seem to have evolved in parallel with each other, with the gene control instructions appearing to help stabilize certain beneficial features of proteins and how they are made.
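The 64-letter alphabet is easy to reconstruct: every three-base combination of A, C, G and T. The short sketch below enumerates it and shows the redundancy that leaves room for a second layer of meaning; the duon idea is that the particular spelling chosen among synonymous codons can also carry a gene-control signal.

```python
# The genetic code's 64-letter alphabet: every 3-base combination of A, C, G, T.
from itertools import product

bases = "ACGT"
codons = ["".join(triplet) for triplet in product(bases, repeat=3)]
print(len(codons))  # 64 possible codons

# Redundancy is what makes a second layer of meaning possible:
# leucine, for example, can be spelled six different ways. A "duon"
# is a codon whose particular spelling also encodes a regulatory signal.
leucine_codons = ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"]
print(len(leucine_codons))  # 6 synonymous spellings of one amino acid
```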
The discovery of duons has major implications for the way in which scientists and physicians interpret a patient’s genome, and will open new doors to the diagnosis and treatment of disease.
“The fact that the genetic code can simultaneously write two kinds of information means that many DNA changes that appear to alter protein sequences may actually cause disease by disrupting gene control programs or even both mechanisms simultaneously,” Stamatoyannopoulos said.
The work was part of the Encyclopedia of DNA Elements Project, also known as ENCODE, which aims to discover where and how the directions for biological functions are stored in the human genome.
Long Island Barrier Still Intact After Hurricane Sandy
Alan McStravick for redOrbit.com – Your Universe Online
On October 29 of last year, Hurricane Sandy, noted as the most powerful hurricane of the 2012 season, made landfall in southern New Jersey, wreaking havoc across much of the eastern seaboard. Residents of the area faced property destruction and crisis situations like flooding and massive, long-term power outages just in time for a particularly frigid autumn. Now more than a year out from the superstorm, a University of Texas-led research team has good news for the region.
In a presentation at this week’s autumn meeting of the American Geophysical Union in San Francisco, the research team reported that Sandy, destructive as she was, was unable to seriously damage the offshore barrier system that controls erosion on Long Island. The good news was tempered by ongoing caveats regarding potential future sea level rise, pollution of back-barrier estuaries and near-shore damage.
The team used pre-storm survey data compared to post-storm data to arrive at their conclusion. The post-Sandy data was collected as part of a rapid response coordinated among The University of Texas at Austin’s Institute for Geophysics, Adelphi University, Stony Brook University and other regionally-affected institutions.
The post-storm rapid response included marine geophysical surveys of the seafloor and shallow subsurface. These surveys mapped out the sedimentary impact of Hurricane Sandy on the beach and barrier systems of the selected bays, inlets and near-shore areas centered around the south shore of Long Island.
“The shape of the bedforms that make up the barrier system did not change a whole lot,” said co-Principal Investigator (PI) John Goff of the Institute for Geophysics in a statement. “Where we might have expected to see significant erosion based on long-term history, not a lot happened — nothing that ate into the shoreface.”
Stony Brook University, for their part, provided the team with a CHIRP (compressed high-intensity radar pulse) sonar system as well as a high frequency seafloor mapping system. The team, exploring an area approximately one mile offshore, mapped nearly six square miles in total.
“The sand largely took the blow,” added co-PI Jamie Austin of the Institute for Geophysics. “Like a good barricade, the barrier system absorbed the significant blow, but held.”
The Texas-based team has previous experience in this sort of mapping endeavor. In 2008, with Hurricane Ike, their results were not as rosy. That storm, they found, disrupted the natural sedimentary barrier outside the Galveston area, potentially accelerating future erosion of the Texas shore.
Long Island, it turns out, is not Galveston. This is because Long Island has a greater abundance of sand in its overall system. While Hurricane Sandy effectively churned up much of this sand and moved bedforms, the scientists believe the overall quantity of sand in the offshore region helped to maintain the natural barrier's shape and integrity.
As mentioned previously, however, the report was not all good news for Long Island. The team reports the storm brought new pollutants into the waters just off the Long Island shoreline, chief among them heavy metals detected in a layer of mud deposited offshore. Researcher Beth Christensen of Adelphi University traced the muds to the South Shore Estuary Reserve, an estuary system that has been subjected to pollution from both industry and human habitation over the years.
Less than a year after the storm, the heavy metal concentrations had dispersed naturally, leaving toxin levels low enough not to pose an immediate concern, said Christensen.
“But if we continue to see more events like Sandy, we’ll see the introduction of more and more muds from the estuary,” said Christensen, “adding additional toxins to an already stressed system.”
Also of concern is projected sea level rise, which is expected to place more pressure on the barrier system. As sea level rises, the onshore impact of another Sandy-esque storm will grow, states Goff.
“In the long-term, if sea level gets high enough, the barrier system has no choice but to retreat and move landwards,” said Goff, exposing the shoreline to increased erosion. “But at least for the present, there’s no evidence of that being imminent.”
This research represents the sixth rapid response science mission funded by the Jackson School of Geosciences at The University of Texas at Austin. These missions are key in getting the research teams to the affected areas as quickly as possible after a natural disaster. The speed of deployment allows them to quickly measure the vanishing traces of hurricanes, earthquakes, tsunamis and other disasters.
“The faster we get out into the field to measure Earth’s response to naturally destructive events, the better we can relate data to the disasters,” said Austin.
Men’s and Women’s Hearts React Differently to Diabetes Drug
Brett Smith for redOrbit.com – Your Universe Online
In a new study from researchers at Washington University School of Medicine, the type 2 diabetes drug metformin was found to have different effects on the hearts of men and women, despite managing blood sugar the same in both sexes.
According to the study, which was published in the December issue of the American Journal of Physiology – Heart and Circulatory Physiology, the drug had positive effects on women’s heart health, but male patients saw a change in heart function associated with increased risk for heart failure.
“We saw dramatic sex differences in how the heart responds to the different therapies,” said study author Robert J. Gropler, a professor of radiology at WUStL. “Our study suggests that we need to better define which therapies are optimal for women with diabetes and which ones are optimal for men.”
The pancreas continues to make insulin in patients with type 2 diabetes, but the body isn’t able to effectively use it to draw glucose out of the blood and into the tissues. Type 2 diabetes is also linked to an increased risk for heart failure.
“It is imperative that we gain understanding of diabetes medications and their impact on the heart in order to design optimal treatment regimens for patients,” said study author Dr. Janet B. McGill, a professor of medicine at WUStL. “This study is a step in that direction.”
In the study, researchers looked at commonly prescribed diabetes drugs in 78 patients. The study participants were divided into one of three groups: those receiving metformin alone, those receiving metformin and rosiglitazone (Avandia) and those receiving metformin plus Lovaza, a form of fish oil.
While metformin reduces glucose production by the liver, rosiglitazone is known to draw free fatty acids out of the blood. Both drugs boost the body’s sensitivity to insulin. Lovaza is prescribed to reduce levels of fatty triglycerides in the blood.
The three groups did not exhibit any major differences in heart metabolism. However, when the patients were divided by sex, the drugs were seen as having different and sometimes opposite effects on heart metabolism.
“The most dramatic difference between men and women is with metformin alone,” Gropler said. “Our data show it to have a favorable effect on cardiac metabolism in women and an unfavorable one in men.”
More specifically, metformin caused men’s hearts to burn less sugar and more fats. According to Gropler, chronic fat burn in the heart leads to negative changes in the heart muscle and potential heart failure.
“Instead of making heart metabolism more normal in men, metformin alone made it worse, looking even more like a diabetic heart,” Gropler said. “But in women, metformin had the desired effect – lowering fat metabolism and increasing glucose uptake by the heart.”
The findings could explain why some trials for diabetes drugs produce conflicting data, the researchers said. Gropler noted his previous research has found differences in how even healthy men’s and women’s hearts metabolize fuel.
“We now know there are sex differences at baseline, both in the metabolism of healthy hearts and in the hearts of patients with diabetes,” Gropler said. “We are adding the message that these sex differences persist in how patients respond to drugs. For patients with diabetes, we are going to have to be more attentive to sex differences when we design therapies.”
Furthermore, the differences in heart metabolism can’t be seen using conventional blood tests, the researchers said.
“This may mean we have to do more complex imaging of the heart to better understand which therapies are best for which patients,” Gropler said.
Brief Laser-Light Treatment May Significantly Improve Effectiveness Of Influenza Vaccines
Pretreatment with near-infrared laser also could improve response to additional intradermal vaccines
Pretreating the site of intradermal vaccination – vaccine delivered into the skin rather than to muscles beneath the skin – with a particular wavelength of laser light may substantially improve vaccine effectiveness without the adverse effects of chemical additives currently used to boost vaccine efficacy. In the open-access journal PLOS ONE, investigators from the Vaccine and Immunotherapy Center in the Massachusetts General Hospital (MGH) Division of Infectious Diseases report that a one-minute dose of near-infrared laser light significantly improved the effectiveness of intradermal influenza vaccination in a mouse model – increasing both immune system activity and the animals’ survival.
“We discovered that low-power near-infrared laser light effectively and reproducibly increases vaccine efficacy as well as currently approved adjuvants do, and is effective for influenza vaccination,” explains Mark Poznansky, MB ChB, PhD, director of the MGH Vaccine and Immunotherapy Center and senior author of the report. “Many of the adjuvants currently in use or in development cause significant side effects – including inflammation and tissue damage – and remarkably few adjuvants that would be likely to receive FDA approval are available for influenza vaccines. Our results indicate that laser treatment would be a safe and effective alternative.”
While current vaccines are designed to be safe for most patients, their ability to produce an immune response needs to be strengthened by the presence of adjuvants, which are chemical or biological additives that prime the immune system to respond to the vaccine antigen. Adjuvants also are responsible for many vaccine-associated adverse events, which is particularly problematic for influenza vaccines. As a result, most flu vaccines recently available in the U.S. – including the vaccine against the H1N1 strain responsible for the 2009 pandemic – contained no adjuvants, probably limiting their effectiveness. Although intradermal vaccines should produce stronger immune protection than conventional vaccines, chemical adjuvants produce strong inflammatory reactions when delivered intradermally, so currently available intradermal flu vaccines contain no adjuvants.
Previous research conducted at the St. Petersburg Military Medicine Academy in Russia and at the Wellman Center for Photomedicine at MGH found that visible laser light enhanced vaccine responses in both humans and mice but did not remove the need for a chemical adjuvant. In addition, visible laser light is absorbed by the skin pigment melanin, reducing its effectiveness in individuals with darkly pigmented skin. Near-infrared light – light with a wavelength just beyond the red end of the visible spectrum – is absorbed by water and not melanin, with little change in absorption across the range of human skin color. These features led the MGH team to investigate the potential of near-infrared laser light as an alternative to chemical vaccine adjuvants.
First the researchers determined the maximum near-infrared laser dose that would not cause inflammation or tissue damage in the skin of mice. They then tested that dosage level – about one-tenth the dosage used for FDA-approved applications like hair and tattoo removal – in darkly-pigmented human volunteers, none of whom reported any significant discomfort after up to two minutes’ exposure. Close examination of the treated areas of participants’ skin showed no tissue damage. Another experiment determined that one minute of near-infrared laser treatment was sufficient to increase the generation of antibodies to a protein used as a model vaccine and more than doubled concentrations of dendritic cells – immune cells activated by current adjuvants – in treated areas.
To evaluate the effectiveness of near-infrared laser as an influenza vaccine adjuvant, the researchers pretreated mice with near-infrared laser, visible green-light laser or the commonly used adjuvant alum before administering an intradermal influenza vaccine. The near-infrared laser induced a more complete antibody response to the vaccine than either the visible-light laser or alum without inducing an allergy-associated antibody. Four weeks after receiving vaccine adjuvanted with either near-infrared laser, visible light laser, alum or no adjuvant, mice were infected with a potentially lethal influenza virus. The animals treated with near-infrared laser just prior to vaccination had significantly less virus in their lungs four days after infection and their survival rate was almost as good as those receiving the alum adjuvant. The visible-light laser produced no significant improvement in survival.
“Depending on the particular assay used, near-infrared laser induced a nearly 100-fold increase in the efficacy of influenza vaccination in these animals,” says Satoshi Kashiwagi, MD PhD, of the MGH Vaccine and Immunotherapy Center, lead and co-corresponding author of the PLOS ONE report. “We believe the same approach could be used with other vaccines – such as tuberculosis, polio and malaria – for which intradermal administration is either approved or being evaluated. The effects of the laser adjuvant may last up to six hours, so in the case of mass vaccination programs, pretreatment could be done right before the vaccine is administered.”
Kashiwagi is an instructor in Medicine, and Poznansky is an associate professor of Medicine at Harvard Medical School. The MGH team is currently collaborating on the development of an inexpensive, handheld device to administer the appropriate near-infrared laser dose for vaccine pretreatment and planning a clinical trial of the near-infrared adjuvant to increase the efficacy of hepatitis B vaccination.
Additional co-authors of the PLOS ONE paper include Jeffrey Gelfand, MD, and Timothy Brauns of the MGH Vaccine and Immunotherapy Center and collaborators from Keio University in Japan. The study was supported by grants from the National Institutes of Health, the Defense Advanced Research Projects Agency, the Bill and Melinda Gates Foundation, and the Friends of VIC.
Researchers Uncover Mechanism Controlling Tourette Syndrome Tics
A mechanism in the brain which controls tics in children with Tourette Syndrome (TS) has been discovered by scientists at The University of Nottingham.
The study, which has been published in the British Psychological Society’s Journal of Neuropsychology, could herald new non-drug therapies to help young people with TS overcome the repetitive physical movements and vocal sounds which characterise their condition.
The work was funded with a £150,000 grant from the James Tudor Foundation and was carried out by PhD student Amelia Draper.
Professor Stephen Jackson, in the University’s School of Psychology, said: “This new study is very important as it indicates that motor and vocal tics in children may be controlled by brain changes that alter the excitability of brain cells ahead of voluntary movements. You can think of this as a bit like turning the volume down on an over-loud motor system. This is important as it suggests a mechanism that might lead to an effective non-pharmacological therapy for Tourette Syndrome.”
Brain re-structuring
The neurological condition TS affects around one child in every 100 and usually starts during early childhood. Scientists believe that the tics that affect children with TS are caused by faulty wiring in the brain that leads to hyper excitability in the brain regions controlling motor function.
In adolescence, there is a period of ‘pruning back’ in which redundant brain connections are removed and other structural and functional brain changes occur.
During this time, around one-third of children with TS will find that their tics disappear and another third are able to more effectively control their tics. Unfortunately, the remaining third of individuals will see little or no change in their tics and are likely to remain troubled by their TS symptoms into adulthood.
This clinical observation suggests that there are mechanisms in the brain that are involved in controlling tics and undergo development or re-organisation during the teenage years.
Amelia Draper added: “The research is based on the general hypothesis that an area in the brain called the striatum is overactive as a result of alterations in the early development of the brain. As a result, the signals that are relayed to the brain’s cortex region lead to hyper-excitability and cause tics to occur.
“We have looked at how that hyperactivity and the resultant tics might be controlled by finding a way to ‘turn down the volume’ on that ‘cortical excitability’. This is potentially extremely important as the parents of children with tics are desperate to find a safe and effective therapy that is an alternative to drug treatments.”
Unwanted movements
In the current study the team used a method called Transcranial Magnetic Stimulation (TMS) in which a magnetic field is passed over the brain to produce a weak electrical current which stimulates motor function to induce a twitch response.
By delivering TMS at different points in time as participants were about to undertake a hand movement, the researchers were able to measure alterations in brain excitability ahead of the movement and chart the differences between each person.
The study showed that subjects with TS, unlike similarly aged participants without the condition, were less able to modulate this hyperactivity in the brain.
Professor Jackson said: “If there is a relationship between this cortical excitability or hyperactivity and tics then this is really important as it means that there may be something that we might be able to do to help children with TS to better control these unwanted movements.”
Further research by the team has involved the use of a similar type of brain stimulation called transcranial direct current stimulation (TDCS) to study the brains of children with TS. Early results suggest that TDCS can be applied to decrease neuronal excitability, which may be effective in suppressing tics for extended periods. In addition, another form of TDCS that increases neuronal excitability may act to improve learning and memory function, particularly in the context of behavioural therapies. Both forms appear to have lasting effects on the brain.
Effective and longer lasting
If proven to be effective, the technology could be adapted into a TENS machine-style device that would offer a cheap, portable and individualised therapy for children with TS.
Professor Jackson added: “For the one-third of people who aren’t going to get better this could offer them a much needed assistance with controlling their tics, while relying less on other conventional pharmaceutical therapies which can have associated side effects such as weight gain or tiredness.
“It can be applied at home while the child is watching TV or eating their cornflakes so it would reduce the amount of school they would miss and potentially we can use the TDCS to both control the tics and make that control more effective and longer lasting.”
As part of her work Amelia Draper is also using MRI scanning technology to examine the potential relationship between cortical excitability and a brain chemical that appears to be strongly linked to neuronal excitability in TS.
Rod Shaw, Chief Executive of the James Tudor Foundation, said: “We’re glad to see that the funding we have given to this project is producing some interesting and potentially useful results.”
Professor Jackson’s research is a key project within the University’s appeal, Impact: The Nottingham Campaign, which is delivering the University’s vision to change lives, tackle global issues and shape the future. Find out more about our research and how you can support us at http://tiny.cc/UoNImpact.
Clay-Like Substance Found On Jupiter’s Moon Europa
Lee Rannals for redOrbit.com – Your Universe Online
Jupiter’s icy moon Europa appears to have clay-type minerals on its surface, according to a new analysis of data from NASA’s Galileo mission. Scientists at the American Geophysical Union meeting in San Francisco say they have identified clay-like minerals on Europa for the first time. This finding could also imply that Europa carries organic materials.
“Organic materials, which are important building blocks for life, are often found in comets and primitive asteroids,” said Jim Shirley, a research scientist at NASA’s Jet Propulsion Laboratory, Pasadena, California. “Finding the rocky residues of this comet crash on Europa’s surface may open up a new chapter in the story of the search for life on Europa.”
Scientists believe that Europa may be one of the best locations in our solar system to find extraterrestrial life. The moon has a subsurface ocean in contact with rock, an icy surface that mixes with the ocean below, salts on the surface that create an energy gradient, and a source of heat.
Researchers analyzing Galileo data spotted the clay-type minerals known as phyllosilicates in near-infrared images from Galileo taken in 1998. These images are low resolution by today’s standards, so the team is applying a new technique for pulling a stronger signal for these materials out of the noisy picture.
The phyllosilicates appear in a ring about 25 miles wide, about 75 miles away from the center of a crater site. The leading explanation for this pattern is the splash back of material ejected when a comet or asteroid hit the surface of Europa at an angle of 45 degrees or more from the vertical direction.
A shallow angle would allow some of the space object’s original material to fall back to the surface. A more head-on collision would have vaporized the material or driven the clay below the moon’s icy surface. The other explanation for the phyllosilicates is that the material made its way up from Europa’s interior. However, NASA said that this scenario is unlikely because scientists believe the phyllosilicates would have had to travel up 60 miles in some areas.
The scientists said that if the body that brought the phyllosilicates to Europa was an asteroid, then it was about 3,600 feet in diameter. If the culprit was a comet, then it was likely about 5,600 feet in diameter, roughly the same size as comet ISON before it lost its battle with the sun a few weeks ago.
“Understanding Europa’s composition is key to deciphering its history and its potential habitability,” Bob Pappalardo of JPL, the pre-project scientist for a proposed mission to Europa, said in a statement. “It will take a future spacecraft mission to Europa to pin down the specifics of its chemistry and the implications for this moon hosting life.”
Computer Algorithm May Soon Be Picking Hipsters Out Of The Crowd
Lee Rannals for redOrbit.com – Your Universe Online
Computer scientists are developing an algorithm to help end the debate on whether or not someone is a hipster.
Calling someone a hipster can be either an insult or a compliment depending on the person who is on the receiving end. The term has had a very broad definition and has been the subject of debate, with some claiming it to be a clothing style and others a lifestyle. However, computer scientists at the University of California, San Diego are seeking to end the debate with an algorithm that determines whether someone is a hipster, a surfer or a biker.
So far, the algorithm has been able to correctly identify a social group 48 percent of the time, which the researchers say is actually a very good number.
“This is a first step,” Serge Belongie, a computer science professor at the Jacobs School of Engineering at UCSD and co-author of the study, said in a statement. “We are scratching the surface to figure out what the signals are.”
The researchers hope their algorithm will make it easier to pick up on social cues, such as clothing and hairstyles, to determine people’s urban tribes from images featuring more than one person. The technology could be used in applications such as generating more relevant search results and ads.
The algorithm segments each person into six sections: face, head, top of head, neck, torso and arms. This method is an example of what is known as a “parts and attributes” approach: the team designed the algorithm to analyze a picture as the sum of its parts and attributes.
Researchers fed the algorithm pictures labeled with the urban tribes they represent, then fed it pictures without labels. The computer program was able to predict a social group more accurately than random chance. The next step for the team is to show the same set of pictures to human users, see how they perform, and compare the two results.
The computer scientists said they referred to Wikipedia to help define the urban tribes. They selected the eight most popular categories in the online encyclopedia’s list of subcultures, which also included country, Goth, heavy metal, hip hop and raver. The team gathered photographs from three common categories for social venues, such as formal events, dance clubs and casual pubs.
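To put the 48 percent figure in context, here is a quick back-of-the-envelope comparison against uniform random guessing over the study’s eight categories. This is an illustrative sketch, not the researchers’ own analysis; the uniform-guessing baseline is our assumption.

```python
# Illustrative only: how much better than chance is the algorithm's
# reported accuracy, assuming a uniform guess over eight urban tribes?
num_categories = 8
random_baseline = 1 / num_categories   # 12.5% by uniform guessing
algorithm_accuracy = 0.48              # figure reported in the article

improvement = algorithm_accuracy / random_baseline
print(f"Random baseline: {random_baseline:.1%}")
print(f"Algorithm:       {algorithm_accuracy:.1%} ({improvement:.2f}x chance)")
```

By this crude yardstick, the algorithm does nearly four times better than guessing, which is why the researchers call 48 percent “actually a very good number” for a first attempt.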
The team said they plan to make the collection of photographs gathered available to other research groups who are interested in studying urban tribes.
FDA Aims To Curb Antibiotic Use In Animal Food Production
Brett Smith for redOrbit.com – Your Universe Online
The Food and Drug Administration (FDA) has issued guidelines calling on pharmaceutical companies to phase out the use of medically important antibiotics as growth promoters in animal production facilities. These drugs are often added to chicken or cattle feed not to cure sick animals, but to help them gain weight faster – a side effect of the medications.
In a statement, the FDA cited growing concern over antibiotic-resistant bacteria as the main reason behind the decision.
“We need to be selective about the drugs we use in animals and when we use them,” said William Flynn, the deputy director for science policy at FDA’s Center for Veterinary Medicine (CVM). “Antimicrobial resistance may not be completely preventable, but we need to do what we can to slow it down.”
In the guideline released on Wednesday, the federal agency asked drug and animal health companies to voluntarily remove production uses, such as growth promotion, from the labels of medically important antibiotics. According to the FDA, once the change is made, the products in question can only be used in food-producing animals to treat, prevent or control disease under the order of a licensed veterinarian.
“This action promotes the judicious use of important antimicrobials, which protects public health and, at the same time, ensures that sick and at-risk animals receive the therapy they need,” said Bernadette Dunham, a director at CVM. “We realize that these steps represent changes for veterinarians and animal producers, and we have been working to make this transition as seamless as possible.”
Rep. Louise Slaughter, D-NY, called the FDA’s recommendation on antibiotics inadequate, “with no mechanism for enforcement and no metric for success.”
Steven Roach, senior analyst with advocacy group Keep Antibiotics Working, agreed.
“Our fear … is that there will be no reduction in antibiotic use as companies will either ignore the plan altogether or simply switch from using antibiotics for routine growth promotion to using the same antibiotics for routine disease prevention,” he told Reuters.
According to Reuters, the FDA has already received support from drugmakers Zoetis and Elanco, both of which sell a large percentage of the products being targeted by the guidelines. Elanco released a statement saying it would voluntarily restrict the use of antibiotics used to treat both humans and animals “only to therapeutic purposes of treating, controlling and preventing diseases in animals under the supervision of a veterinarian.”
The first steps toward the new recommendations came in 2010, when the FDA drafted a proposal that called for phasing out the use of medically important antibiotics in food production. The new guidelines lay out a strategy for achieving that stated goal and mark the start of the formal implementation period.
As part of the new guidelines, the agency is asking animal drug companies to notify the FDA within the next three months of their intent to comply with the voluntary regulations. Companies would then have three years to completely implement these changes.
“Based on our outreach, we have every reason to believe that animal pharmaceutical companies will support us in this effort,” said Michael R. Taylor, FDA’s deputy commissioner for foods and veterinary medicine.
Use Of CPAP For Sleep Apnea Reduces Blood Pressure For Patients With Difficult To Treat Hypertension
Among patients with obstructive sleep apnea and hypertension that requires 3 or more medications to control, continuous positive airway pressure (CPAP) treatment for 12 weeks resulted in a decrease in 24-hour average and diastolic blood pressure and an improvement in the nocturnal blood pressure pattern, compared to patients who did not receive CPAP, according to a study appearing in the December 11 issue of JAMA.
“Systemic hypertension is one of the most treatable cardiovascular risk factors. Between 12 percent and 27 percent of all hypertensive patients require at least 3 antihypertensive drugs for adequate blood pressure control and are considered patients with resistant hypertension. Patients with resistant hypertension are almost 50 percent more likely to experience a cardiovascular event than hypertensive patients without resistant hypertension, and the incidence of resistant hypertension is expected to increase,” according to background information in the article. Recent studies have shown that obstructive sleep apnea [OSA] may contribute to poor control of blood pressure and that a very high percentage (>70 percent) of resistant hypertension patients have OSA. Continuous positive airway pressure is the treatment of choice for severe or symptomatic OSA. “A meta-analysis suggests that CPAP treatment reduces blood pressure levels to a clinically meaningful degree, but whether this positive effect is more pronounced in patients with resistant hypertension is unclear because studies on this issue are scarce and based on single-center approaches.”
Miguel-Angel Martinez-Garcia, M.D., Ph.D., of the Hospital Universitario y Politecnico La Fe, Valencia, Spain, and colleagues assessed the effect of CPAP treatment on blood pressure levels and nocturnal blood pressure patterns of 194 patients with resistant hypertension and OSA. The trial was conducted in 24 teaching hospitals in Spain; data were collected from June 2009 to October 2011. The patients were randomly assigned to receive CPAP (n = 98) or no CPAP (control; n = 96) while maintaining usual blood pressure control medication.
When the changes in blood pressure during the study period were compared between study groups by intention-to-treat, the CPAP group achieved a 3.1 mm Hg greater decrease in 24-hour average blood pressure and 3.2 mm Hg greater decrease in 24-hour diastolic blood pressure, but the difference in change in 24-hour systolic blood pressure was not statistically significant compared to the control group. In addition, the percentage of patients displaying a nocturnal blood pressure dipper pattern (a decrease of at least 10 percent in the average night-time blood pressure compared with the average daytime blood pressure) at the 12-week follow-up was greater in the CPAP group than in the control group (35.9 percent vs. 21.6 percent). There was a positive correlation between hours of CPAP use and the decrease in 24-hour average blood pressure.
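The nocturnal “dipper” criterion used in the trial is simple enough to express directly. A minimal sketch of that definition follows; the blood pressure readings are hypothetical, for illustration only.

```python
def is_dipper(day_avg_bp: float, night_avg_bp: float) -> bool:
    """Dipper pattern as defined in the study: average night-time blood
    pressure at least 10 percent below the average daytime value."""
    return night_avg_bp <= 0.90 * day_avg_bp

# Hypothetical 24-hour monitoring averages (mm Hg):
print(is_dipper(day_avg_bp=130, night_avg_bp=115))  # 115 <= 117 -> True
print(is_dipper(day_avg_bp=130, night_avg_bp=122))  # 122 >  117 -> False
```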
“Further research is warranted to assess longer-term health outcomes,” the authors conclude.
Regular Milk Can’t Beat Organic When It Comes To Heart-Healthy Fats
[ Watch the Video: Researchers Tout The Effects Of Organic Milk ]
Lawrence LeBlond for redOrbit.com – Your Universe Online
A new study led by Washington State University has found that organic milk contains significantly more of the heart-healthy omega-3 fatty acids than regular milk from cows on conventional dairy farms.
While all milk offers heart-healthy omega-3 fatty acids, the new research concludes that organic whole milk is the best source of this essential nutrient. Additionally, organic milk may be a better choice due to the fact that conventional milk has an average omega-6 to omega-3 fatty acid ratio of 5.8, which is more than double that of organic milk’s 2.3.
According to a National Institutes of Health factsheet on omega-3 fatty acids, “Most American diets provide more than 10 times as much omega-6 than omega-3 fatty acids. There is general agreement that individuals should consume more omega-3 and less omega-6 fatty acids to promote good health.”
While omega-6 fatty acids are important in the diet to some extent, in larger amounts they are associated with a variety of health problems, including cardiovascular disease, cancer, inflammation and autoimmune diseases. The higher the ratio of omega-6 to omega-3, the greater the likelihood of developing disease.
The WSU study is the first large-scale, US-wide comparison of organic and conventional milk. The team tested about 400 samples of both types of dairy product over an 18-month period to come to their conclusions.
[ Watch the Video: Added Nutritional Benefits of Organic Milk ]
Much research has been published showing that foraging on grass and legumes promotes cow health and improves the fatty acid profile in organic dairy products.
Charles Benbrook, the study’s lead author from WSU, said that, even with all the research proof, he and his team were “surprised by the magnitude of the nutritional quality differences we documented in this study.”
Most western diets have an omega-6 to omega-3 ratio of between 10-to-1 and 15-to-1. The 2.3-to-1 ratio found in organic milk is believed to maximize heart health. Benbrook and colleagues modeled a hypothetical diet for adult women with a baseline ratio of 11.3-to-1 and looked at how far three interventions could go in reducing that ratio to 2.3.
They found that by switching from three daily servings of conventional dairy products to 4.5 servings of full-fat organic dairy products, adult women could achieve nearly 40 percent of the needed nine-point ratio drop in their daily diets. Additionally, by avoiding a few foods each day that are high in omega-6 fatty acids, women could lower their fatty acid ratio to about 4-to-1, about 80 percent of the way to the 2.3-to-1 goal.
“Surprisingly simple food choices can lead to much better levels of the healthier fats we see in organic milk,” said Benbrook.
In a blog coinciding with the research, which was published in the Dec. 9 online issue of PLoS ONE, Benbrook noted that there is “no magic number or universal agreement on the optimal omega-6/omega-3 ratio in the human diet” so the team went with the often-noted target of 2.3-to-1.
“Our findings would not have differed much had we chosen the less ambitious goal of 5. We quantified the progress made as a result of the dietary interventions in terms of percent progress from the baseline ratio of 11.3 to the heart-health target of 2.3 (i.e., a total drop of 9 points would be desirable in the value of the omega-6/omega-3 ratio),” he added.
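Benbrook’s progress metric can be reproduced in a few lines. The baseline and target ratios below come from the article; the 4-to-1 example matches the roughly 80 percent figure quoted above.

```python
BASELINE = 11.3  # modeled omega-6/omega-3 ratio for adult women
TARGET = 2.3     # heart-health target used in the study

def percent_progress(achieved_ratio: float) -> float:
    """Benbrook's metric: share of the nine-point drop from baseline
    to target achieved by a dietary intervention."""
    return (BASELINE - achieved_ratio) / (BASELINE - TARGET) * 100

# A ratio of 4-to-1, reached by also avoiding omega-6-rich foods,
# works out to about 80 percent of the needed drop:
print(f"{percent_progress(4.0):.0f}% of the way to the 2.3 target")
```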
Benbrook said it is likely that his team’s research will trigger some discussion and debate. Among the likely topics: “The current balance of fatty acid intakes in the American diet, the roll [sic] of full-fat milk and dairy products in health promotion and the development of infants and children, and steps consumers can take to progress toward a healthier mix of fats in their diet.”
Organic milk analysis for Benbrook’s study came from cows managed by farmer-owners of the Cooperative Regions of Organic Producer Pools (CROPP), which markets through the Organic Valley brand. These two organizations also helped fund the study but had no role in its design or analysis. The study also received funding from the Measure to Manage program in the Center for Sustaining Agriculture and Natural Resources at Washington State University.
NASA Builds GPS-Based System For Detecting Natural Disasters
April Flowers for redOrbit.com – Your Universe Online
Scientists at NASA’s Jet Propulsion Laboratory (JPL) and the Scripps Institution of Oceanography at UC San Diego have enhanced existing GPS technologies to develop new systems, for California and elsewhere, that provide warning of hazards such as earthquakes, tsunamis and extreme weather events.
Forecasters at NOAA National Weather Service offices in Oxnard and San Diego, California demonstrated the new technology in July, using it to track a summer monsoon rain affecting Southern California and issue more accurate and timely flash flood warnings. The new technology uses real-time information from GPS stations that have been upgraded with small, inexpensive seismic and meteorological sensors.
Other real-world systems are integrating the new technology as well. For example, it is being used to make damage assessments for hospitals, bridges and other critical infrastructure that can be used in real time by emergency personnel, decision makers and first responders to help mitigate threats to public safety.
The primary goal for hospitals is to shut down elevators automatically and send alerts to operating room personnel in the event of, for example, an earthquake early warning. The earthquake early warning system is particularly effective during large events. The system could be used to detect changes in the structure of bridges due to earthquakes, wind shear and traffic loads, as well.
The implications and possible applications of the new technology were discussed by scientists from JPL and Scripps at the American Geophysical Union meeting this past week.
“These advancements in monitoring are being applied to public safety threats, from tall buildings and bridges to hospitals in regions of risk for natural hazards,” said Yehuda Bock of Scripps Institution of Oceanography. “Meaningful warnings can save lives when issued within one to two minutes of a destructive earthquake, several tens of minutes for tsunamis, possibly an hour or more for flash floods, and several days or more for extreme winter storms.”
An optimal combination of GPS, accelerometer, pressure and temperature data is the basis for the new technology. This data is collected in real time at many locations throughout Southern California and on large engineered structures—like tall buildings, hospitals and bridges—for focused studies of health and damage. The technology returns data products such as accurate measurements of permanent motions (displacements) of ground stations and instruments deployed on structures, which form the basis for early detection of sustained damage; and measurements of precipitable water in the lower atmosphere, a determining factor in short-term weather forecasting. The combination of sensors significantly improves current seismic and meteorological practices.
NOAA’s Earth System Research Laboratory used a regional collaborative network of GPS stations—newly expanded to provide dense coverage in Southern California—to provide atmospheric moisture measurements to forecasters in the case of the first successful Southern California monsoon forecast and more accurate flash flood warnings in July.
Hundreds of scientific-grade GPS stations throughout Southern California are constantly receiving signals from GPS satellites to determine their precise positions. GPS ground stations are simultaneously measuring water vapor as well as position because water vapor in the atmosphere distorts GPS satellite signals.
“These water vapor measurements are currently being used to help forecasters better monitor developing weather during periods between satellite overpasses and weather balloon launches,” said research scientist Angelyn Moore of JPL. “Our project is upgrading GPS ground stations to get these data to forecasters in minutes to seconds to help them better understand whether summer monsoonal moisture is likely to cause harmful flash flooding.”
“This GPS network provides forecasters with timely and critical information on the availability of atmospheric moisture, allowing us to more accurately forecast and warn for potentially deadly flash flooding and wintertime heavy precipitation events in Southern California,” said Mark Jackson, meteorologist in charge at NOAA’s National Weather Service office in Oxnard.
“Having such detailed and timely information on how much moisture is available helps us better understand and forecast our extreme winter storms fueled by what are known as atmospheric rivers. It can also help us better pinpoint and anticipate thunderstorms capable of producing flash flooding.” Weather forecasters in Southern California are moving from periodic updates of moisture content once every 30 minutes to continuous updates. Balloon launches, from four locations, occur only twice a day.
According to Bock, the technology improves earthquake early warning by analyzing the very first moments of an earthquake in real time to characterize the more violent shaking that will follow. By detecting the initial arrival of the fast-moving seismic “P” (primary) waves at the upgraded GPS stations, it is possible to predict the arrival of the slower-traveling “S” (secondary) waves that cause the most intense shaking.
Depending on distance from the earthquake’s epicenter, the warning time can range between several seconds to as long as two minutes. Critical fault parameters, such as earthquake magnitude, can be rapidly and accurately determined to generate ground intensity maps throughout the affected region, and form the basis of tsunami warnings.
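As a rough illustration of where those warning times come from: the available warning is the gap between P-wave and S-wave arrival, which grows with distance from the epicenter. The wave speeds below are typical textbook crustal values, not figures from the article.

```python
# Assumed typical crustal wave speeds (illustrative, not from the article):
VP_KM_S = 6.0  # P-wave (primary) speed
VS_KM_S = 3.5  # S-wave (secondary) speed

def warning_time_s(epicenter_distance_km: float) -> float:
    """Seconds between P-wave arrival (detection) and S-wave arrival
    (strongest shaking) at a station this far from the epicenter."""
    return epicenter_distance_km / VS_KM_S - epicenter_distance_km / VP_KM_S

for d in (20, 100, 400):
    print(f"{d:>4} km from epicenter: ~{warning_time_s(d):.0f} s of warning")
```

Under these assumptions a nearby station gets only a few seconds, while a station hundreds of miles away gets closer to a minute, consistent with the range quoted above.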
The scientists are planning to integrate the technology into earthquake and tsunami early warnings and structural monitoring for the San Diego County Office of Emergency Service. Other institutions are examining the applications of the technology as well, such as hospital monitoring and early warnings for UC San Diego Medical Center in Hillcrest; monitoring of the Vincent Thomas Bridge in Long Beach for Caltrans; and forecasts of storms and flooding for NOAA’s weather forecasting offices in San Diego and Los Angeles.
Volcanic Eruption’s Green Lightning Explained By Atmospheric Scientist
redOrbit Staff & Wire Reports – Your Universe Online
Mysterious green lightning, seen emerging from an ash cloud in images of a May 2008 volcanic eruption in Chile, is likely more common than we realize, according to research presented Monday during the annual fall meeting of the American Geophysical Union in San Francisco.
Green lightning had not really been observed before photographer Carlos Gutierrez snapped photographs of the Chaiten volcano eruption, according to Olive Heffernan of National Geographic. The origin of the unusual phenomenon was unknown until Arthur Few, an atmospheric scientist and professor emeritus at Rice University, began investigating the unusual climatic event.
“I thought, ‘That’s funny; why don’t we see this in lightning storms?’” Few said during the conference. Few believes that the phenomenon is not all that unusual, said Larry O’Hanlon of Discovery News. He believes that it is more common than people realize, but is most likely hidden inside of regular thunderstorms.
“The concealment results from the structure of storm clouds,” Heffernan said. “On the inside, the clouds contain ice crystals that are either positively or negatively charged. Surges of electricity occur between positively and negatively charged regions within the cloud – lightning – but they remain inside, unseen by even the most committed storm chasers. In contrast, volcanic ash clouds carry their electrical charges on the outside, where they are sparked by fragments of rock forcefully ejected into the air during an eruption.”
What Few doesn’t understand, and is currently attempting to find out, is why the ash column from the Chilean eruption produced visible green lightning. For answers, he looked at other atmospheric phenomena that appear green, such as the northern lights. Auroras glow green, red and white when their oxygen atoms are excited by electrons originating from space; above 100 km, the northern lights appear green.
“My working hypothesis is that the green ones are actually streamers,” Few explained. Streamers are lightning bolts which are effectively “a positive channel being pulled to a negative charge” occurring higher up in the ash cloud. Essentially, the white lightning visible in the Chaiten pictures is a negative charge generating throughout the cloud and curving into its bottom, while the green lightning is the positive streamer that reverses the flow, O’Hanlon said.
Mars Surface Radiation Levels Almost Suitable For Manned Mission
Brett Smith for redOrbit.com – Your Universe Online
Newly released data from the Curiosity rover indicate that Martian radiation levels are very close to being acceptable for future manned missions to the surface of the Red Planet.
While the Curiosity rover spent its first 300 days cruising around the planet’s Gale Crater gathering soil samples and examining rock structures, the onboard Radiation Assessment Detector (RAD) was making detailed recordings of the radiation found on the Martian surface.
“Our measurements provide crucial information for human missions to Mars,” said Don Hassler, a Southwest Research Institute (SwRI) program director and RAD principal investigator. “We’re continuing to monitor the radiation environment, and seeing the effects of major solar storms on the surface and at different times in the solar cycle will give additional important data. Our measurements also tie into Curiosity’s investigations about habitability.”
“The radiation sources that are of concern for human health also affect microbial survival as well as the preservation of organic chemicals,” Hassler added.
The Curiosity data indicate a 5 percent increase in fatal cancer risk for a round-trip mission. NASA has established a 3 percent increased risk of fatal cancer as a suitable career limit for astronauts working in low Earth orbit. However, the space agency does not currently have a maximum value for deep space missions, and it is working with the National Academies Institute of Medicine to calculate appropriate limits.
Two types of radiation potentially threaten astronauts on Mars: a constant low dose of galactic cosmic rays (GCRs) and the potential short-term exposures to solar energetic particles (SEPs) from solar flares and coronal mass ejections. The lack of a global magnetic field on Mars and the thin Martian atmosphere offer very little protection from this radiation, compared to Earth.
Both GCRs and SEPs react within the atmosphere and, if strong enough, pierce down into the Martian soil, or regolith. There they produce secondary particles that add to the multifaceted radiation environment on the Martian surface.
“The RAD surface radiation data show an average GCR dose equivalent rate of 0.67 millisieverts (mSv) per day from August 2012 to June 2013 on the Martian surface,” Hassler said. “In comparison, RAD data show an average GCR dose equivalent rate of 1.8 millisieverts per day on the journey to Mars, when RAD measured the radiation inside the spaceship.”
This means that the highest exposure level for a manned Mars mission will be during travel between planets, when the astronauts will only be shielded by the spacecraft. During just the interplanetary travel phases of a Mars mission, the total radiation dose would be about 660 mSv for a round trip, based on calculations using current propulsion technology and average solar activity, NASA said.
A 500-day stay on the surface would bring the total dose to around 1000 mSv, or 1 Sv, which is associated with a five percent increase in fatal cancer risk. In comparison, the average CT abdominal scan has a radiation exposure level of less than 10 mSv.
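The article’s dose arithmetic checks out. Using RAD’s measured surface rate and the quoted transit dose:

```python
# Figures quoted in the article:
TRANSIT_DOSE_MSV = 660           # both interplanetary legs, round trip
SURFACE_RATE_MSV_PER_DAY = 0.67  # RAD's measured GCR dose-equivalent rate
STAY_DAYS = 500

surface_dose = SURFACE_RATE_MSV_PER_DAY * STAY_DAYS  # 335 mSv
total_dose = TRANSIT_DOSE_MSV + surface_dose         # ~1000 mSv (~1 Sv)

print(f"Surface stay: {surface_dose:.0f} mSv")
print(f"Mission total: {total_dose:.0f} mSv")
```

The total lands at roughly 1 sievert, the level NASA associates with a five percent increase in fatal cancer risk.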
NASA scientists published the results of their analysis on the radiation data this week in the journal Science.
Drug-Releasing Contact Lens Effectively Treats Glaucoma
Ranjini Raghunath for redOrbit.com – Your Universe Online
A contact lens that slowly releases drugs into the eye to treat glaucoma has been developed by researchers at MIT, Boston Children’s Hospital and Harvard Medical School. The lenses provide a sort of ‘hands free’ alternative to medicated eye drops or more expensive treatments such as laser surgery for treating glaucoma.
Glaucomas are eye disorders arising from a build-up of pressure on the optic nerve, which acts as a bridge between the eye and the brain. Glaucomas affect more than 60 million people worldwide and, unless treated, can lead to permanent blindness. There is currently no cure, but early diagnosis and treatment can control or prevent blindness in patients.
Taking eye drops containing drugs to reduce fluid build-up is the most common and efficient treatment method, but it has some drawbacks. Only a fraction of the drug actually gets absorbed; the rest enters the nasal passage or spills over, causing irritation to the skin. Sometimes patients simply forget to take their eye drops regularly. Overall, less than 50 percent of patients who use eye drops stick to it, the researchers wrote in their paper, which appeared in the journal Biomaterials.
For fifty years now, contact lenses that release drugs periodically into the eyes have been explored as an alternative treatment option. However, those developed so far can only release the drug for a few hours after first use.
The newly developed contact lens, on the other hand, can deliver large quantities of the drug constantly for at least four weeks – the longest any lens has been able to do so far, the researchers wrote. The lens eliminates the need to keep track of and take eye drops daily, and needs to be changed only once a month.
The lens consists of a thin film of FDA-approved polymer and latanoprost – the drug used in eye drops – entrapped along the sides of a regular contact lens, enabling controlled release.
For at least a month, the lenses were able to release the drug in amounts similar to those delivered by daily eye drops. The researchers also tested for any toxic or allergic reactions arising from breakdown of the lens material or the drug and found none. Neither the lenses nor the drug harmed cells grown in the lab or animals, they wrote.
The lenses can be made to custom specifications commonly used for correcting short or long sight. They can also be tailor-made to release antibiotics or drugs used for other eye infections, the researchers believe.
“A non-invasive method of sustained ocular drug delivery could help patients adhere to the therapy necessary to maintain vision in diseases like glaucoma, saving millions from preventable blindness,” Joseph Ciolino, Massachusetts eye specialist and first author, said in a statement.
Scientists Discover Gene That May Predict Response To Antidepressants
redOrbit Staff & Wire Reports – Your Universe Online
Treating depression using selective serotonin reuptake inhibitors (SSRIs) is often a trial-and-error process, but new research appearing in the journal Translational Psychiatry lays the foundation for a new genetic test that could allow doctors to provide patients with a personalized treatment program.
Researchers at Tel Aviv University (TAU) explained that SSRIs do not work for everyone, even though they are the most commonly prescribed form of antidepressants. Patients are often asked to start using one type of SSRI, with its own unique set of side effects, for three to four weeks to see if it is effective.
However, the study authors report that they have discovered a gene which could indicate whether or not one type of SSRI medication will be effective for a certain patient. Provided this biomarker is validated through clinical trials, it could be used to create a genetic test that could lead to an individualized treatment approach, they said.
“SSRIs only work for about 60 percent of people with depression,” Dr. David Gurwitz of the Department of Molecular Genetics and Biochemistry at TAU’s Sackler Faculty of Medicine, said in a statement Monday. “A drug from other families of antidepressants could be effective for some of the others. We are working to move the treatment of depression from a trial-and-error approach to a best-fit, personalized regimen.”
According to the university, over 20 million Americans annually are diagnosed with debilitating depression which requires clinical intervention. SSRIs (which include medications such as Zoloft and Prozac) are among the newest and most popular forms of treatment for the condition, and are believed to work by blocking the reabsorption of the neurotransmitter serotonin in the brain and helping to boost a person’s overall mood.
In order to locate the genes potentially responsible for the brain’s responsiveness to these drugs, Dr. Gurwitz and his colleagues applied the SSRI Paxil to 80 cell lines from the National Laboratory for the Genetics of Israeli Populations (NLGIP), a genetic data biobank located at the Sackler Faculty of Medicine.
“The TAU researchers then analyzed and compared the RNA profiles of the most and least responsive cell lines,” the university said. “A gene called CHL1 was produced at lower levels in the most responsive cell lines and at higher levels in the least responsive cell lines. Using a simple genetic test, doctors could one day use CHL1 as a biomarker to determine whether or not to prescribe SSRIs.”
“We want to end up with a blood test that will allow us to tell a patient which drug is best for him. We are at the early stages, working on the cellular level. Next comes testing on animals and people,” added TAU doctoral student Keren Oved, who led the research along with fellow student Ayelet Morag.
They also set out to determine why CHL1 levels could predict whether or not a person would respond to SSRIs, so they applied Paxil to human cell lines for three weeks – the time required to achieve a clinical response. They discovered that the drug increased production of ITGB3, a gene which is believed to interact with CHL1 in order to encourage the development of new neurons and synapses, according to the researchers.
“The result is the repair of dysfunctional signaling in brain regions controlling mood, which may explain the action of SSRI antidepressants,” the university said. “This explanation differs from the conventional theory that SSRIs directly relieve depression by inhibiting the reabsorption of the neurotransmitter serotonin in the brain… The TAU researchers are working to confirm their findings on the molecular level and with animal models.”
WWII-Era Pacific Munitions Dump Site Found To Be Chemical-Free
April Flowers for redOrbit.com – Your Universe Online
Between San Francisco and the Mexican border, US nautical charts have shown seven “chemical munitions dumping areas” along the Pacific Coast since World War II. Little to no information is available, however, about the amount, location or nature of the materials dumped at most of these sites.
Researchers from the Monterey Bay Aquarium Research Institute (MBARI) described a survey of one supposed deep-water dump site off the coast of Southern California at the American Geophysical Union (AGU) Conference earlier this week. The survey found 55-gallon drums and trash, but no chemical munitions. The findings suggest that not all marked sites contain chemical munitions, and also demonstrate the usefulness of underwater robots in surveying such sites to identify areas of concern.
Nautical charts of US waters mark a total of 32 chemical munitions dumping areas, seven of which lie along the California coastline between San Francisco and the Mexican border. Some of the marked areas off the California coast are huge, encompassing more than 1,500 square miles of seafloor. Only the area off the coast of San Francisco has been studied in any detail.
MBARI chemical oceanographer Peter Brewer is concerned by the lack of available information. Chemical munitions dumped at these sites could pose hazards to fishers and researchers studying the seafloor. Over the last 50 years, hundreds of fishermen in Japan, the Baltic Sea, and off the east coast of the United States have been injured by chemical munitions caught in their nets. Brewer suspects, however, that some of the sites off the coast of California may not contain such munitions, and that even at sites that do, the affected areas are likely much smaller than the marked-off areas on the charts.
To perform a preliminary survey of a marked dump site in the Santa Cruz Basin, about 70 miles southwest of Los Angeles, Brewer and his team used two different types of underwater robots. The site Brewer and colleagues studied is approximately 6,300 feet deep.
MBARI’s seafloor‐mapping autonomous underwater vehicle (AUV) spent 18 hours in March 2013 surveying a portion of the Santa Cruz Basin using side‐scan sonar. The AUV followed a preprogrammed zig-zag path approximately 82 feet above the ocean bottom, surveying almost 10 square miles of seafloor. The survey site included areas both inside and outside the marked dump site. The researchers counted 754 “targets,” or objects sticking up from the seafloor, within the surveyed areas.
The AUV sonar surveys allowed the research team to locate hard objects on the bottom of the ocean; however, they did not provide enough detail to positively identify those objects. In May 2013, Brewer and his team returned to the Santa Cruz Basin to videotape the seafloor using one of MBARI’s remotely operated vehicles, the ROV Doc Ricketts.
The ROV captured video showing 55-gallon drums in and on the muddy seafloor. The majority of these rusting barrels were covered with anemones, sponges, crabs and other animals. Other targets discovered by the AUV turned out to be garbage such as canned goods and cases of bottled water. Two small, unarmed drones used by the military for target practice, along with a 98-foot long steel mast from a ship, were also found. The ROV survey found no chemical weapons.
The findings of this partial survey suggest that not all sites marked as chemical munitions dumps may actually have been used for this purpose. The work does demonstrate the usefulness of modern undersea robots, such as MBARI’s seafloor‐mapping AUV, for surveying such marked dump areas relatively quickly. Cartographers will be able to redraw the lines around these areas, based on such surveys, to more accurately reflect what’s on the seafloor.
Mapping The Dinosaur-Killing Yucatan Peninsula Asteroid Impact Site
April Flowers for redOrbit.com – Your Universe Online
An asteroid or comet crashed into a shallow sea near what is now the Yucatan Peninsula of Mexico approximately 65 million years ago. A firestorm and global dust cloud resulted, causing the extinction of many land plants and large animals, including most of the dinosaurs.
Researchers from the Monterey Bay Aquarium Research Institute (MBARI) presented evidence this week at a meeting of the American Geophysical Union (AGU) that remnants from this devastating impact are exposed along the Campeche Escarpment—an immense underwater cliff in the southern Gulf of Mexico.
The impact from this ancient meteorite created a crater over 99 miles across. However, the crater is buried beneath hundreds of feet of debris and at least half a mile of marine sediments, making it nearly invisible to modern geologists. The fallout from the impact has been found in rocks around the globe, yet surprisingly little research has been done on the rocks close to the impact site, partly because they are so deeply buried. The only existing samples of impact deposits close to the crater come from deep boreholes drilled on the Yucatan Peninsula.
Led by MBARI scientist Charlie Paull, an international team of scientists created the first detailed map of the Campeche Escarpment in March 2013 using multi-beam sonars on the research vessel Falkor, operated by the Schmidt Ocean Institute. Google Maps and Google Earth have recently incorporated the new maps for viewing by researchers and the general public.
Just northwest of the Yucatan Peninsula is the Campeche Escarpment, a 372-mile long underwater cliff. Paull has long suspected that rocks associated with the impact crater might be exposed along this nearly 13,000-foot tall cliff, one of the steepest and tallest underwater features on Earth. Were it not thousands of feet below the sea, the escarpment would be comparable to one wall of the Grand Canyon.
Sedimentary rock layers are exposed on the face of the Campeche Escarpment, much like on the walls of the Grand Canyon. These layers provide a sequential record of the events that have occurred over millions of years. The new maps suggest that rocks formed before, during and after the impact are all exposed along different areas of the cliff face.
Paull hopes that one day geologists will be able to perform geologic fieldwork along the escarpment, such as collecting samples, just as they can map layers of rock while walking through the Grand Canyon. Performing large-scale geological surveys thousands of feet below the ocean surface would have seemed a distant fantasy just a few decades ago, but with the use of underwater robots, such mapping has become almost routine for MBARI geologists over the last eight years.
The new maps, and the data they represent, could open a new chapter in research on one of the largest extinction events in the history of our planet. Researchers from MBARI and other institutions are already using them to plan further studies of this little-known area. Detailed analysis of the bathymetric data, along with eventual fieldwork on the escarpment, should reveal fascinating new clues about the massive impact event that ended the reign of the dinosaurs – clues that have been hidden beneath the ocean for 65 million years.
Viagra Found To Alleviate Menstrual Pain Without Side Effects
redOrbit Staff & Wire Reports – Your Universe Online
While primarily known as a treatment for erectile dysfunction, new research appearing in the journal Human Reproduction has found a new use for Viagra that could benefit women.
Richard S. Legro of the Penn State Hershey Obstetrics and Gynecology department and his colleagues report that the drug, also known as sildenafil citrate, could help relieve moderate to severe menstrual cramps.
According to Anna Hodgekiss of the Daily Mail, the drug works because, when administered vaginally, it increases blood flow to the pelvic region by elevating levels of the chemical that causes tissues in the body to relax.
“The effects on erectile function were discovered accidentally – it was originally developed to improve blood supply to the heart in angina sufferers,” Hodgekiss said, noting that previous studies that had women take Viagra tablets for period pain relief found that the drug caused headaches. Those side effects “didn’t occur” when sildenafil was “administered as a vaginal pessary,” she added.
Legro’s team collaborated with researchers from the Nova Gradiska General Hospital in Croatia, recruiting women between the ages of 18 and 35 who suffered from moderate to severe primary dysmenorrhea (PD) – a common cause of pelvic pain in women.
Of the 29 women screened, all but four were randomized to receive either Viagra or a placebo. Each was asked to rate her pain level over a period of four consecutive hours. The investigators found that the erectile dysfunction drug, when administered vaginally, could alleviate acute menstrual pain without side effects.
Since uterine blood flow increased as a result of both Viagra and the placebo, the reason that Viagra “alleviates pain is not yet known,” the university said. “Larger studies must be completed to validate the small sample of this study, and additional research is needed to see whether sildenafil changes the menstrual bleeding pattern.”
“If future studies confirm these findings, sildenafil may become a treatment option for patients with PD. Since PD is a condition that most women suffer from and seek treatment for at some point in their lives, the quest for new medication is justified,” added Legro, who is also a professor at the Penn State College of Medicine.
Alan R. Kunselman, also of the Penn State College of Medicine, and R. Dmitrovic of the BetaPlus Center for Reproductive Medicine in Croatia were also involved in the research. The study was funded by the US National Institutes of Health (NIH).
NASA Tech Helps California Meet Water Needs During Drought
[ Watch the Video: Airborne Snow Observatory: Measuring Snowpack from the Sky ]
redOrbit Staff & Wire Reports – Your Universe Online
It might have been the driest year ever recorded in California, but thanks to NASA, the millions of people who call the San Francisco Bay Area home had plenty of water this summer, the US space agency said Monday during a media briefing at the 2013 American Geophysical Union (AGU) Fall Meeting.
NASA’s prototype Airborne Snow Observatory mission helped water managers in the region achieve near-perfect water operations over the summer, the agency said in a statement. The high-resolution snow maps produced showing the Tuolumne River Basin in the Sierra Nevada helped officials optimize reservoir filling and hydroelectric power production at the Hetch Hetchy reservoir and the O’Shaughnessy Dam.
As a result, the reservoir was full at the end of the snowmelt season, there was no water spillage, and nearly $4 million worth of hydropower was generated, scientists from NASA, the University of Washington and McGurk Hydrologic Associates said during the media briefing. The collaboration is a three-year demonstration project between NASA’s Jet Propulsion Laboratory (JPL) and the California Department of Water Resources.
“For the first time, Airborne Snow Observatory data are telling us the total water in the snowpack in the watershed and the absorption of sunlight that control its melt speed, enabling us to estimate how much water will flow out of a basin when the snow melts,” explained Tom Painter, the observatory principal investigator at JPL and an adjunct professor of geography at UCLA.
“By combining near-real-time information on the total amount of water in the snowpack with observations of water inflow to Hetch Hetchy reservoir between April and July, we were able to greatly improve the model we developed to predict inflow into the reservoir,” he noted, adding that this allowed reservoir managers to more efficiently distribute water inflow among power generation, water supply and ecological purposes.
Painter said that the higher-quality snowpack measurements and the more efficient reservoir operations will prove to be valuable assets as the community (and the rest of the world) has to deal with global warming, uncertain weather, continuing drought conditions throughout the state and an ever-growing demand for water.
The Airborne Snow Observatory is stationed on board a Twin Otter aircraft, and it measures snow depth and snow reflectivity – the two properties most essential to understanding snowmelt runoff, NASA said. By combining snow depth and estimated density, snow water equivalent (or the amount of water contained in the snow) can be derived and then used in order to determine how much water will run off.
“Snow reflectivity, or albedo, is the fraction of the incoming amount of sunlight reflected by snow. Subtracting reflected sunlight from incoming sunlight gives the absorbed sunlight, which largely controls the speed of snowmelt and timing of its runoff,” the agency explained.
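The two relationships NASA describes reduce to simple arithmetic: snow water equivalent from depth and density, and absorbed sunlight from incoming sunlight and albedo. The sketch below is purely illustrative; the density, albedo and sunlight values are hypothetical placeholders, not Airborne Snow Observatory data.

```python
# Illustrative sketch of the two quantities described above; input values
# are hypothetical placeholders, not Airborne Snow Observatory measurements.

WATER_DENSITY = 1000.0  # kg/m^3

def snow_water_equivalent(depth_m: float, density_kg_m3: float) -> float:
    """Depth of water (m) stored in the snowpack: snow depth scaled by density."""
    return depth_m * density_kg_m3 / WATER_DENSITY

def absorbed_sunlight(incoming_w_m2: float, albedo: float) -> float:
    """Absorbed sunlight (W/m^2): incoming minus the fraction reflected by snow."""
    return incoming_w_m2 * (1.0 - albedo)

# Two meters of snow at an assumed density of 350 kg/m^3 hold 0.7 m of water;
# bright snow with an albedo of 0.75 absorbs only a quarter of 800 W/m^2.
print(snow_water_equivalent(2.0, 350.0))  # 0.7
print(absorbed_sunlight(800.0, 0.75))     # 200.0
```

The second quantity is what Painter describes as controlling melt speed: the darker the snow surface, the more sunlight it absorbs and the faster the runoff.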
Previously, Tuolumne River Basin runoff forecasts were made using monthly ground snow surveys and daily automated measurements at lower or middle elevations. However, when snow melted in those locations, an unknown amount of snow remained at higher elevations. The observatory was able to measure snow in an area 46 million times larger than that covered by the survey sites, according to the researchers.
“Snow controls high-elevation streamflow and ecosystems, but we’ve historically had to guess how much snow fell and where it was stored,” said Jessica Lundquist, an associate professor at the University of Washington. “With these data, we can improve how we model mountain systems and predictions of how those systems will change in time… To me, the Airborne Snow Observatory snow maps are cooler than pictures from Mars.”
“The Airborne Snow Observatory is an innovative use of NASA advanced sensor research applied to one of the top challenges our nation and our planet face: freshwater management and practical water management information needs,” added Brad Doorn, program manager in Applied Sciences at NASA Headquarters. “The observatory is also advancing our scientific understanding of Earth processes and how we can better monitor them in the future, from both air and space.”
China Suffers Huge Setback With Satellite Launch Failure
Lee Rannals for redOrbit.com – Your Universe Online
China’s space program was dealt another setback on Monday when it lost a $250 million Earth observation satellite just days after successfully launching its first lunar rover.
The satellite, developed by China and Brazil, was lost when a Long March 4B rocket failed to put the spacecraft into its proper orbit.
“The rocket malfunctioned during the flight, and the satellite failed to enter orbit,” sources told the state-owned Xinhua news agency.
Brazil’s National Institute for Space Research (INPE) initially issued a statement overnight that heralded a successful launch. However, news reports later surfaced from Chinese and Brazilian media reporting the rocket failed to enter orbit.
INPE said in a statement that preliminary evaluations suggest the CBERS 3 satellite returned to Earth.
“Chinese engineers responsible for the construction of the launch vehicle are evaluating the causes of the problem,” the statement said. “The data obtained show that the subsystems of CBERS 3 functioned normally during the [launch].”
CBERS 3 was the fourth China-Brazil Earth Resources Satellite launched since 1999. The satellite featured two imaging instruments and was designed for a three-year lifespan. Its cameras were expected to collect black-and-white imagery with a resolution of about 16 feet. Sensors equipped on CBERS 3 included thermal and infrared imagers capable of distinguishing different types of vegetation and locations where water is stored and consumed.
“Brazil and China have achieved fruitful results in the past 25 years of cooperation in the (sic) space, and are confident in continuing this success,” INPE said in a statement.
China launched a six-wheeled lunar rover mission, known as Yutu, just last week. The vehicle was launched by a Long March 3B rocket. The Chinese space agency expects Yutu to land in the Moon’s northern hemisphere in mid-December.
“This will be the third robotic rover mission to land on the lunar surface, but the Chinese vehicle carries a more sophisticated payload, including ground-penetrating radar which will gather measurements of the lunar soil and crust,” BBC News Science Editor Paul Rincon said. “The 120kg (260lb) Jade Rabbit rover can climb slopes of up to 30 degrees and travel at 200m (660 ft) per hour, according to its designer the Shanghai Aerospace Systems Engineering Research Institute.”
Yutu will land in a flat volcanic plain that is part of a larger feature known as Mare Imbrium, which forms the right eye of the “Man in the Moon.” China will join the US and the former Soviet Union as the only countries to complete a lunar rover mission.
Eventually, China said it hopes to launch a manned mission to the moon, as well as establish a permanent space station within the next seven years.
To Improve Foster Care, Add A Psychiatric Nurse To Treatment Team
Nurses Bring Fresh Perspective to Caring for Troubled Teens, SLU Researcher Finds
Psychiatric nurses offer a missing and critical point of view in treating adolescents in foster care who have mental health issues, an instructor at Saint Louis University School of Nursing found.
“Adding a mental health nurse to the treatment team would be ideal. He or she could bring a much-needed medical perspective to caring for teens in foster care who have psychiatric disorders. Child welfare workers and social workers don’t have the specific training they need to track health problems,” said Julie Bertram, MSN, the instructor and lead author of the article.
“In addition, there is a national shortage of child psychiatrists, and nurses could be a bridge to quality care because they’re able to help social workers, case managers, foster families and patients navigate the system. If we invest the time and effort into education and holistically managing the health of troubled teens, it likely would be worth it in the long-run.”
Bertram’s work, which appeared in the December issue of the Archives of Psychiatric Nursing, described her role on a treatment team that studied foster youth and chronicled reactions to her involvement from case workers and teens. J. Curtis McMillen, Ph.D., professor at the University of Chicago School of Social Service Administration, was the principal investigator of the study and hired Bertram as a nurse consultant.
Mental illness is a major problem for children in foster care. Three-quarters have suffered serious traumas such as sexual abuse or mistreatment. Typically one or both parents have histories of mental illness and substance abuse.
Not surprisingly, teens in foster care receive mental health services at a very high rate. They use psychiatric medications at up to five times the rate of young people who are not in foster care, and many take multiple drugs. They frequently change psychiatrists as they move between homes, some accumulating conflicting diagnoses and medicines. And they leave foster care ill-equipped to survive in the adult world, without fully understanding their mental health issues and treatment options, Bertram says.
“High rates of psychotropic medication use, polypharmacy and problems in continuity of care have raised alarms about whether youth in the foster care system may be receiving inappropriate treatments for the symptoms they present,” Bertram says. “The strikingly high rate of medication use may be just the tip of the iceberg when it comes to concerns about quality of care within the child welfare system.”
For the qualitative research study, Bertram served on the care team of eight adolescents with histories of hospitalization for psychiatric problems. When the study began, these high-needs teens took multiple psychotropic medications and lived and attended school in residential foster care facilities, which are essentially locked group homes. They were later transitioned out of the residential settings to live with foster families who had received special training.
Bertram served as the nurse consultant on the treatment team – synthesizing medical information, sharing pertinent details at weekly team meetings and intervening in times of crisis and when routine medical questions arose. She also met with teens and their foster parents, establishing a rapport and trust and teaching them how to have a voice in decisions about their medical care.
Bertram began her work by completing a comprehensive psychiatric assessment. She reviewed past medical records and charts and interviewed case managers, foster parents, psychiatrists and teens to clarify mental health issues, such as diagnoses and what medications were prescribed. She then organized the medical profile to create a comprehensive mental health summary.
Bertram said the teens had each taken an average of 13 psychotropic medications – some as many as 21 – and had an average of eight different diagnoses for psychiatric problems.
“I was able to purge old, outdated or inaccurate diagnoses across cases, which typically reduced the number of diagnoses to an average of two problems,” Bertram said. “I also reviewed the medication profiles and recommended changes for two teens – one based on side effects and another because the teen was taking greater than maximum recommended doses of medications and multiple medicines without clear benefits.”
Case workers described the management of medical information as “ridiculously difficult” and the newly reorganized report as very helpful. One noted that when her patient was hospitalized, she shared the comprehensive document with hospital nurses and doctors so they could deliver better care.
The teens said that the reports answered questions they had about their diagnoses and gave them ammunition to advocate for their own health needs.
“Psychiatric nurses bring a medical and holistic perspective to the treatment team, could navigate medical and co-occurring illnesses, and could support case managers and social workers in the system,” Bertram says.
“Psychiatric nurses don’t replace the role of social workers or case managers, but complement the treatment team by contributing knowledge of mental health nursing. The needs for the knowledge, skills and understanding of a psychiatric nurse were substantial, and our findings suggest employing psychiatric and mental health nurses is one way to improve fragmented care within the foster care system.”
The research was funded by the National Institute of Mental Health.
Founded in 1928, Saint Louis University School of Nursing has achieved a national reputation for its innovative and pioneering programs. Offering bachelor’s, master’s, and doctoral nursing programs, its faculty members are nationally recognized for their teaching, research and clinical expertise.
CU Researchers May Have Discovered A Plan To Disable Meniere’s Disease
Researchers at the University of Colorado School of Medicine may have figured out what causes Meniere’s disease and how to attack it. According to Carol Foster, MD, of the department of otolaryngology, and Robert Breeze, MD, a neurosurgeon, there is a strong association between Meniere’s disease and conditions involving temporary low blood flow in the brain, such as migraine headaches.
Meniere’s affects approximately 3 to 5 million people in the United States. It is a disabling disorder resulting in repeated violent attacks of dizziness, ringing in the ear and hearing loss that can last for hours and can ultimately cause permanent deafness in the affected ear. Up until now, the cause of the attacks has been unknown, with no theory fully explaining the many symptoms and signs of the disorder.
“If our hypothesis is confirmed, treatment of vascular risk factors may allow control of symptoms and result in a decreased need for surgeries that destroy the balance function in order to control the spell,” said Foster. “If attacks are controlled, the previously inevitable progression to severe hearing loss may be preventable in some cases.”
Foster explains that these attacks can be caused by a combination of two factors: 1) a malformation of the inner ear, endolymphatic hydrops (the inner ear dilated with fluid) and 2) risk factors for vascular disease in the brain, such as migraine, sleep apnea, smoking and atherosclerosis.
The researchers propose that a fluid buildup in part of the inner ear, which is strongly associated with Meniere attacks, indicates the presence of a pressure-regulation problem that acts to cause mild, intermittent decreases of blood flow within the ear. When this is combined with vascular diseases that also lower blood flow to the brain and ear, sudden loss of blood flow similar to transient ischemic attacks (or mini strokes) in the brain can be generated in the inner ear sensory tissues. In young people who have hydrops without vascular disorders, no attacks occur because blood flow continues in spite of these fluctuations. However, in people with vascular diseases, these fluctuations are sufficient to rob the ear of blood flow and the nutrients the blood provides. When the tissues that sense hearing and motion are starved of blood, they stop sending signals to the brain, which sets off the vertigo, tinnitus and hearing loss in the disorder.
Restoration of blood flow does not resolve the problem. Scientists believe it triggers a damaging after-effect, called the ischemia-reperfusion pathway, in the excitable tissues of the ear that silences the ear for several hours, resulting in the prolonged severe vertigo and hearing loss characteristic of the disorder. Although most of the tissues recover, each spell causes small areas of damage that, over time, result in permanent loss of both hearing and balance function in the ear.
Since the first linkage of endolymphatic hydrops and Meniere’s disease in 1938, a variety of mechanisms have been proposed to explain the attacks and the progressive deafness, but no answer has explained all aspects of the disorder, and no treatment based on these theories has proven capable of controlling the progression of the disease. This new theory, if proven, would provide many new avenues of treatment for this previously poorly-controlled disorder.
Genetic Link Associated With One Percent Of All Cancerous Tumors Discovered
[ Watch the Video: CUX1: Cancer’s Common Ground ]
redOrbit Staff & Wire Reports – Your Universe Online
Scientists have located a single gene that is at least partially responsible for the development of one percent of all cancerous tumors, according to new research appearing in the advanced online edition of the journal Nature Genetics.
The gene in question is CUX1, and according to the study authors, this is the first time it has been broadly associated with the onset of cancer. The researchers found that when CUX1 is deactivated, a biological pathway that increases tumor growth becomes activated. Drugs inhibiting this biological pathway are currently being developed and could offer a new treatment for patients that have this particular cancer-causing mutation.
“Our research is a prime example of how understanding the genetic code of cancers can drive the search for targeted cancer therapies that work more effectively and efficiently,” lead author Dr. David Adams of the Wellcome Trust Sanger Institute said in a statement Sunday. “This could improve the lives of thousands of people suffering from cancer.”
Dr. Adams and his colleagues used genetic information from more than 7,600 patients whose DNA was collected and sequenced by groups such as the International Cancer Genome Consortium (ICGC). They discovered that mutations deactivated CUX1 in approximately one percent of the cancer genomes they analyzed.
These types of mutations are associated with tumor growth, and while the study authors said that they occur at relatively low frequency, they have been observed across several different types of cancer. Previous research focusing on genetic mutations looked at those occurring at high frequency unique to specific types of cancer, thus missing the rarer but more widespread CUX1 as a potential cancer catalyst.
“Our work harnesses the power of combining large-scale cancer genomics with experimental genetics,” explained first author Dr. Chi Wong, also of the Wellcome Trust Sanger Institute and a hematologist at Addenbrooke’s Hospital in Cambridge. “CUX1 defects are particularly common in myeloid blood cancers, either through mutation or acquired loss of chromosome 7q. As these patients have a dismal prognosis currently, novel targeted therapies are urgently needed.”
“Data collected from large consortia such as the ICGC provides us with a new and broader way to identify genes that can underlie the development of cancers,” added Professor David Tuveson from Cold Spring Harbor Laboratory. “We can now look at cancers as groups of diseases according to their tissues of origin and collectively examine and compare their genomes.”
To determine how deactivating CUX1 could lead to tumor development, the researchers inhibited the gene in cultured cells and found that doing so had a “knock-on effect” on the biological inhibitor PIK3IP1. With PIK3IP1 hampered, phosphoinositide 3-kinase (PI3K) – an enzyme responsible for cell growth – became active and increased the rate of tumor progression, the investigators said.
“Drugs that inhibit PI3K signaling are currently undergoing clinical trial,” said Professor Paul Workman, Deputy Chief Executive and Head of Cancer Therapeutics at The Institute of Cancer Research. “This discovery will help us to target these drugs to a new group of patients who will benefit from them and could have a dramatic effect on the lives of many cancer sufferers.”
FDA Approves New Treatment For Chronic Hepatitis C
redOrbit Staff & Wire Reports – Your Universe Online
Sovaldi (sofosbuvir), a new drug that can treat hepatitis C infections, has been approved for sale by the US Food and Drug Administration (FDA), agency officials confirmed on Friday.
According to Kim Painter of USA Today, the medication is manufactured by Gilead Sciences and “can be paired with other drugs to make treatment of the liver-damaging disease faster, easier and more effective.”
Dr. Edward Cox, director of the Office of Antimicrobial Products in the FDA’s Center for Drug Evaluation and Research, called the approval “a significant shift in the treatment paradigm for some patients with chronic hepatitis C.” The drug is the second to get the go-ahead from the FDA to treat the condition in less than a month, following the November 22 approval of Olysio (simeprevir).
The drug, which will be available in pill form, will allow some of the more than three million Americans infected with hepatitis C to be treated without co-administration of interferon, which can have flu-like side effects, the Wall Street Journal added. In clinical trials, Sovaldi cured hepatitis C in 90 percent of patients when used in tandem with one or more other drugs, which varied based on the specific type of the disease those individuals had contracted.
However, “the greater convenience and effectiveness comes at a price,” New York Times reporter Andrew Pollack wrote. “Gilead said the wholesale cost of Sovaldi… would be $28,000 for four weeks – or $1,000 per daily pill. That translates to $84,000 for the 12 weeks of treatment recommended for most patients, and $168,000 for the 24 weeks needed for a hard-to-treat strain of the virus.”
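The quoted prices are internally consistent, as a quick check shows; this throwaway sketch simply multiplies the reported $1,000-per-pill price by the length of each once-daily course.

```python
# Check of the Sovaldi pricing reported above: one pill per day at $1,000.
PRICE_PER_PILL_USD = 1000

def course_cost(days: int) -> int:
    """Total wholesale cost (USD) of a once-daily course lasting `days` days."""
    return days * PRICE_PER_PILL_USD

print(course_cost(28))      # 28000  (four weeks, as quoted by Gilead)
print(course_cost(12 * 7))  # 84000  (the 12-week course for most patients)
print(course_cost(24 * 7))  # 168000 (24 weeks for a hard-to-treat strain)
```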
Michael Weinstein, president of the AIDS Healthcare Foundation, told Pollack the cost was “completely unjustified,” but Gilead countered that the price was fair in light of the medication’s high cure rate. Furthermore, the pharmaceutical company said the overall cost for the three-month regimen was “consistent with, and in some cases lower than” the cost of other hepatitis C treatments, and that financial assistance would be available to some patients.
Sovaldi is described by the FDA as a nucleotide analog inhibitor, which blocks a protein required for the hepatitis C virus to replicate. It is to be used as one part of a combination treatment for chronic infection, and is to be taken in conjunction with peginterferon-alfa and/or ribavirin. The six clinical trials used to evaluate it involved nearly 2,000 patients who had either not received treatment or had not responded to previous treatment attempts.
Dr. John Ward, director of the Division of Viral Hepatitis at the US Centers for Disease Control and Prevention (CDC), told CNN that this is a “landmark advance in the treatment of hepatitis C, opening up new opportunities to stop the spread of this virus and the ravages of this disease.” However, he also emphasized these treatments are only effective if they can get “more people screened and into care. Right now, most Americans with hepatitis C don’t access treatment because they have no idea they’re infected.”
Mindfulness Meditation Found To Result In Gene Expression Changes
redOrbit Staff & Wire Reports – Your Universe Online
While there have been multiple scientific studies providing evidence that meditation can have a positive influence on a person’s health, new research appearing in the journal Psychoneuroendocrinology suggests there may be an actual biological trigger for these therapeutic effects.
Researchers from the University of Wisconsin-Madison, along with colleagues from France and Spain, report they have discovered the first evidence of specific molecular changes that occur in a person’s body following a session of mindfulness meditation.
They compared the effects that an eight-hour mindfulness session had on a group of people experienced with meditation to a control group who participated in quiet, non-meditative activities. They found an array of “genetic and molecular differences, including altered levels of gene-regulating machinery and reduced levels of pro-inflammatory genes, which in turn correlated with faster physical recovery from a stressful situation” in the meditators.
“To the best of our knowledge, this is the first paper that shows rapid alterations in gene expression within subjects associated with mindfulness meditation practice,” study author Richard J. Davidson, founder of the Center for Investigating Healthy Minds and a professor of psychology and psychiatry at the university, said in a statement Wednesday.
Perla Kaliman, first author of the article and a researcher at the Institute of Biomedical Research of Barcelona, added that the specific changes they observed are actually targets of anti-inflammatory and analgesic drugs. Mindfulness-related activities are endorsed by the American Heart Association and have previously been proven effective against inflammatory disorders.
“The results show a down-regulation of genes that have been implicated in inflammation. The affected genes include the pro-inflammatory genes RIPK2 and COX2 as well as several histone deacetylase (HDAC) genes, which regulate the activity of other genes epigenetically by removing a type of chemical tag,” the university said. “What’s more, the extent to which some of those genes were downregulated was associated with faster cortisol recovery to a social stress test… performed in front of an audience and video camera.”
The study authors said there was no observed difference in tested genes between the two groups at the beginning of the study – the effects were only noticed after mindfulness practice. Furthermore, there were no differences in multiple other genes that modify DNA, leading the investigators to believe meditation specifically affected certain regulatory pathways.
The university emphasized the research “was not designed to distinguish any effects of long-term meditation training from those of a single day of practice. Instead, the key result is that meditators experienced genetic changes following mindfulness practice that were not seen in the non-meditating group after other quiet activities – an outcome providing proof of principle that mindfulness practice can lead to epigenetic alterations of the genome.”
New Database Sheds Light On Gene Patenting Process
redOrbit Staff & Wire Reports – Your Universe Online
Researchers from Cambia, a non-profit biotech research organization, and the Queensland University of Technology (QUT) have developed a new online database revealing which organizations have applied for patents related to genes and proteins in living organisms.
The open-source PatSeq registry, which is the subject of a paper published in last week’s edition of Nature Biotechnology, currently holds over 120 million DNA sequences and 10 million protein sequences extracted from patent documents worldwide.
The free service, which was unveiled last week, allows anyone to explore who has sought patents for this type of genetic information, the institutions behind the Internet resource said in a statement Friday. It also includes a graphical tool for visualizing the scope of patents overlaying the human genome, and a search tool that lets people with gene or protein sequences find matches in the PatSeq database.
“Apparently, many patent offices have no way of tracking genetic sequences disclosed in patents and currently do not provide them in machine-searchable format,” explained principal author Professor Osmat Jefferson of the QUT Science and Engineering Faculty. “This likely means patents are being granted for genes that are not ‘newly discovered’ at all, because the patent offices have no way of really knowing.
“Gene patenting is an area where almost everyone has an opinion – passions run high but until now the evidence has been lacking,” she added. “What is happening? Who’s doing the patenting? Why are they doing it? How much are they doing it? What rights are being granted? And how much is our society benefiting from these biological patent teachings? No one really knows because the whole system is opaque.”
According to Jefferson, the scrutiny surrounding genes and protein patenting dates back to June, when the US Supreme Court ruled naturally occurring genetic material could not be patented. That stemmed from a 2009 lawsuit by the American Civil Liberties Union (ACLU) and the Public Patent Foundation involving patent rights for a pair of genes linked to breast and ovarian cancer, BRCA1 and BRCA2.
Those patents were held by Myriad Genetics, and were among the approximately 4,000 human genes patented to companies, universities and institutions that discovered and/or decoded them. However, the ACLU and the Public Patent Foundation argued since DNA is a natural product, it could not be patented under the US Patent Act. The Supreme Court unanimously agreed with them, declaring biotech companies “should not have exclusive control over genetic information found inside the human body,” the Associated Press (AP) reported.
“A patent is a government grant of a limited exclusive right to try to stop others’ use of an invention,” Jefferson said. “When that invention is a gene or protein sequence present in a living organism, it raises serious social questions that deserve serious consideration, especially when that sequence is used for an important genetic test or diagnostic.”
With the new database, “all findings can be embedded and shared with anyone, anywhere at no cost, allowing researchers, policy makers and concerned citizens to explore the evidence underlying this practice,” she added. “The public – and indeed enterprise and policy makers – need to know the answers if we’re to have a transparent, fair and economically productive society.”
Corn Oil Reduces Cholesterol Better Than Extra Virgin Olive Oil
April Flowers for redOrbit.com – Your Universe Online
A new study reveals that corn oil reduces cholesterol more effectively than extra virgin olive oil, producing more favorable changes in total cholesterol (TC) and LDL cholesterol (LDL-C).
Dr. Kevin C. Maki, of Biofortis, the clinical research arm of Mérieux NutriSciences, presented the study findings at the American Society for Nutrition’s Advances & Controversies in Clinical Nutrition Conference earlier this week.
The study included 54 healthy men and women. Among the participants, consumption of foods made with corn oil resulted in significantly lower levels of LDL (“bad”) cholesterol and total cholesterol than the same foods made with extra virgin olive oil: corn oil lowered LDL cholesterol by 10.9 percent, versus a 3.5 percent reduction for extra virgin olive oil, and reduced total cholesterol by 8.2 percent, versus 1.8 percent.
Each day, the study participants received four tablespoons of either corn oil or extra virgin olive oil in the foods provided as part of a weight maintenance diet—consistent with the Dietary Guidelines for Americans recommendations.
The clinical trial was a randomized, double-blind, controlled crossover assessment of the effects of dietary oils on fasting lipoprotein lipids. It compared the effects of corn oil and extra virgin olive oil on LDL cholesterol (the primary outcome variable), total cholesterol, HDL (“good”) cholesterol, non-HDL cholesterol, triglycerides, and the ratio of total to HDL cholesterol.
The participants all had fasting LDL cholesterol ≥130 mg/dL and <200 mg/dL. Other measurements, such as fasting blood sugar, were taken from all participants during visits to the clinical study center before and after each treatment phase of the study.
“The study results suggest corn oil has significantly greater effects on blood cholesterol levels than extra virgin olive oil, due, in part, to the natural cholesterol-blocking ability of plant sterols,” said Dr. Maki. “These findings add to those from prior research supporting corn oil’s positive heart health benefits.”
In the US, cardiovascular disease remains the leading cause of death. Previous studies support the idea that diets containing at least 5-10 percent of calories from polyunsaturated fatty acids (PUFAs) from vegetable oils are associated with a lower risk of heart disease.
Research suggests that the unique combination of healthy fatty acids and plant sterols in corn oil helps lower cholesterol. Corn oil has roughly four times the plant sterols of olive oil and 40 percent more than canola oil: a 2013 analysis comparing corn oil with other cooking oils, based on USDA data, found a plant sterol content of 135.6 mg per serving for corn oil versus 30.0 mg per serving for olive oil. Plant sterols are naturally occurring substances in fruits, vegetables, nuts, seeds, cereals, legumes and vegetable oils such as corn oil, and they could play an important role in a heart-healthy diet.
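The per-serving sterol figures quoted in the article can be checked with a couple of lines of Python (the numbers are the article’s; the script only does the division):

```python
# Plant sterol content per serving, as quoted from the USDA-based
# comparison in the article (mg per serving).
sterols_mg = {"corn oil": 135.6, "olive oil": 30.0}

ratio = sterols_mg["corn oil"] / sterols_mg["olive oil"]
# ~4.5x, consistent with the article's "four times" claim
print(f"Corn oil has {ratio:.1f}x the plant sterols of olive oil")
```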
Study: Cell Phone Use Leads To Lower GPA, More Stress And Anxiety
Brett Smith for redOrbit.com – Your Universe Online
Today’s college students are almost literally attached at the hip to their cell phones, and researchers at Kent State University wanted to see what effect this desire to stay connected to friends, family and the Internet has on students’ academic performance, stress levels and overall happiness.
Using a survey of 500 students and other methods, the researchers found that cell phone use appeared to have a negative effect on students’ overall wellbeing, according to their report in the journal Computers in Human Behavior.
“The students in our study who used the cell phone more had lower GPA, higher anxiety, and lower satisfaction with life relative to their peers who used the cell phone less,” lead researcher Andrew Lepp told the Milwaukee-Wisconsin Journal Sentinel via email.
The researchers also used a clinical measure of anxiety and each student’s level of satisfaction with their own life in the analysis. Participants allowed the study team to access their cumulative college grade point average (GPA) from university records. All participants were undergraduate students and were equally distributed by expected year of graduation. In all, 82 different, self-reported majors were represented.
According to the study team, many cell phone users said their experiences with their devices are stressful.
“The social network sometimes just makes me feel a little bit tied to my phone. It makes me feel like I have another obligation in my life that I have to stick to,” one survey participant told researchers. “Sometimes the cell phone just makes me feel like it is a whole new world of obligation that I have because anybody can get a hold of me any time by just thinking about me. If my mom wanted to give me a call right now and just talk for a second, she could. And if I did not call her back by the end of the day, she would get worried. It creates a bit of anxiety and it is kind of annoying sometimes.”
Some respondents said their phone made them feel obligated to stay connected to a network of peers or family.
“That obligation was perceived as stressful by many students (especially those getting 100s of texts a day),” Lepp said.
“There is no ‘me time’ or solitude left in some of these students’ lives and I think mental health requires a bit of personal alone time to reflect, look inward, process life’s events, and just recover from daily stressors,” Lepp told the Sentinel Journal. “Also, a few of the students we interviewed reported sending texts constantly throughout the day from morning to night – that in itself might be stressful.”
“Furthermore, interviews with some students suggested that communicating primarily by text message can create tension because meaning or intent is not always perfectly clear in brief, rapidly composed texts,” he added.
The new Kent State study follows a similar one published by the same group earlier this year that looked at the connection between cell phone use and cardiorespiratory fitness.
Taken together, the two studies suggest students should be educated about the potential holistic impact of their cell phone use and asked to reflect on how they might curb excessive use, the researchers said.
NASA Fellowships, Scholarships Bring Diversity To Future STEM Workforce
NASA’s Minority University Research and Education Project (MUREP) has awarded fellowships and scholarships for the 2013-2014 academic year to 40 graduate and undergraduate students from across the United States to increase diversity in science, technology, engineering and math (STEM) disciplines.
Thirty graduate students from 16 states and the District of Columbia were selected to receive the competitive Harriett G. Jenkins Graduate Fellowship, which provides as much as $45,000 annually for as many as three years, and includes tuition offset, student stipend, and a research experience at a NASA center. It addresses NASA’s mission-specific workforce needs and supports the development of the future STEM workforce through the increased number of master’s and doctorate degrees awarded to women, ethnic minorities and disabled people in STEM disciplines.
Ten undergraduate students from nine states and Puerto Rico were selected to receive the competitive MUREP scholarship, which provides an academic stipend worth as much as $9,000, plus $6,000 more for a 10-week internship at a NASA center. These scholarships support women, ethnic minorities and disabled students pursuing STEM degrees and enable them to augment their academic learning with technical collaborations and professional development.
NASA’s Office of Education is strengthening involvement with higher education institutions to ensure that NASA can meet future workforce needs. Currently, minorities make up a disproportionately small percentage of graduates entering STEM fields. MUREP strives to ensure that underrepresented and underserved students participate in NASA education and research projects, which stimulates increasing numbers of them to continue their higher education and earn advanced degrees.
NASA’s Ames Research Center in Moffett Field, Calif., manages MUREP activities for the Office of Education.
For more information about the Harriett G. Jenkins Graduate Fellowship and the MUREP Scholarship, and to see a complete list of the 2013-2014 awardees, visit: http://www.nasa.gov/education/MUREP_2013_Awardees
For more information about NASA’s education programs, visit: http://www.nasa.gov/education
Plastic Surgeon Claims Smoking Marijuana Can Cause Man Boobs
Lee Rannals for redOrbit.com – Your Universe Online
The pros and cons of marijuana use have been flying around the medical community for decades. While the debate on whether the good outweighs the bad is far from over, one plastic surgeon’s claim may help some pot smokers put down that joint once and for all.
Dr. Anthony Youn, a Detroit-based plastic surgeon, has written a special report for CNN on how smoking pot regularly could be causing men to grow boobs, a condition known as gynecomastia.
Youn, who is known for his memoir “In Stitches,” says animal studies have shown the active ingredient in marijuana can result in a decrease in testosterone levels, a reduction in testicular size, and abnormalities in the form and function of sperm. In humans, Youn says, these effects could also lead to man boobs.
“So can smoking pot really give you man boobs?” Youn asked in the CNN report. “Probably. Although the association between marijuana and gynecomastia hasn’t been conclusively proven, it appears very plausible.”
Gynecomastia results from a hormone imbalance between testosterone and estrogen: when the ratio between the two tips in favor of estrogen, the male body responds by creating breast tissue. Youn said plastic surgeons routinely ask their gynecomastia patients about marijuana use and recommend they stop smoking pot to help correct the condition.
According to the American Society for Aesthetic Plastic Surgery (ASAPS), gynecomastia is the fifth-most common cosmetic surgery in men. The condition affects about 33 to 41 percent of men between the ages of 25 and 45, and 60 percent of 14-year-old boys.
Dr. Jeffrey Donaldson, a board certified plastic surgeon in Columbus, Ohio, said many men do not even realize plastic surgery is an option to get rid of gynecomastia.
“A relatively simple procedure with hidden incisions can establish a more masculine physique. This helps men regain confidence on the beach, at the gym and in intimate situations,” Donaldson said in a statement in March 2011.
During the surgery, a tiny incision is made within the hair of each armpit, where the doctor removes excess fat and skin with advanced liposuction techniques. Donaldson said downtime and recovery are minimal, and there is seldom much pain.
Youn pointed out that with laws on cannabis use changing in states like Colorado and Washington, man boobs could become a bigger problem.
“The legalization of marijuana in some states could make it easier for researchers to determine the exact effects of cannabis use on hormone levels, gynecomastia and other bodily functions,” Youn wrote in the report. “If a true link between smoking pot and gynecomastia does exist, then we should expect to see a spike in gynecomastia treatments in those states which have legalized marijuana.”
The plastic surgeon pointed out the number of men undergoing surgery for gynecomastia nationwide rose nearly 30 percent from 2011 to 2012.
Ancient Headless Remains Offer Clues To Dietary Structure Of Vikings
Lawrence LeBlond for redOrbit.com – Your Universe Online
It has long been known that ancient Vikings buried dead slaves with their masters, but new isotopic research on ancient skeletal remains is providing at least one researcher with more evidence of how these people lived their lives – most notably, what their diets were like.
Elise Naumann, a PhD candidate in archeology at the University of Oslo in Norway, has made several remarkable discoveries using the skeletons that were exhumed at Flakstad in Lofoten. Her research is based on a total of ten individuals, of which at least three were found in double and triple graves and were headless. Her findings are published in the January 2014 issue of the Journal of Archaeological Science.
The isotope analyses, combined with analyses of ancient DNA, gave suggestive evidence that the headless skeletons were slaves who were decapitated before being buried with their masters. This discovery says a lot about the great differences between people in the society of the time. “Life was undoubtedly difficult and brutal for the majority of people. Only a very few were privileged,” wrote Mari Kildahl, a journalist with the University of Oslo.
Naumann noted, however, that there is nothing new about the fact that slaves during the Viking era were buried with their masters, often bound hand and foot and beheaded before burial.
What is new, Naumann explains, is how the analytical methods used and their results have offered fresh insights into the society and people of the past. The isotopic analyses have given researchers new information about the diet and health of people who lived more than a thousand years ago. Analyses of the ancient DNA also yield knowledge about genealogy and genetics.
DIETARY DIFFERENCES
Along with the 10 individuals in the latest discovery, Naumann has also investigated the skeletons of at least 46 other individuals buried in single, double and triple graves in the region. Most of these remains were discovered in the three northernmost Norwegian counties and date from between 400 and 1050 AD. Much of the skeletal material for Naumann’s research was borrowed from the Schreiner Collection at the University of Oslo.
Naumann noted that her work on the diet and social structure of these ancient people is fairly novel, noting that hardly any isotopic research has been conducted on skeletal material from this time period.
Through the isotopic analyses, she found that the double and triple burials included people who did not share a direct bloodline, and that the individuals’ diets differed sharply. The headless remains showed diets consisting mostly of fish, typical of the poorer individuals of the period. The other remains showed evidence of diets rich in meat and other land-based foods – foods generally associated with the elite of the time.
The findings suggest that people of rank ate more meat and other animal products than the poor did; the differing diets reflected differences in social status and different lives. Even in a small place like Flakstad, large variations in diet were seen during the Viking era – even within the same household, with marked differences between men and women, and between adults and children.
Naumann explains that this is typical, in part because men traveled more than women. And although little is known about the scope of the practice, experts suggest it was common for parents to place their children in foster care, often with people of lower social status such as slaves or servants. This would likely affect a child’s diet, and may explain why the isotopic analyses show that many people ate differently as adults than they had as children.
Naumann’s research suggests that distribution of food was a significant structuring factor for society during the Iron Age in Norway. She plans to continue her work to learn more about an individual’s life cycle during the Viking era.
“It is possible to use isotope analyses to learn about an individual’s life cycle – about their journey through life. This is something I want to study further. I see a great potential in more advanced use of isotope analysis,” Naumann concluded.
Fibromyalgia Might Be Harder on Younger Patients
Every year, fibromyalgia affects millions of people all over the world.
This mysterious syndrome produces a very wide variety of symptoms, most of which are common to other medical conditions as well.
This is precisely what makes fibromyalgia so mysterious, so frequently misdiagnosed and even more frequently misunderstood.
The main symptom of fibromyalgia is chronic pain, but since chronic pain can appear in other conditions too, the medical professional examining a patient has to take multiple factors into consideration. The first red flag is the development of so-called “painful tender points” around the body.
More precisely, if at least 11 of the 18 tender points on the patient’s body are painful, the patient is likely to have developed fibromyalgia. Furthermore, if the chronic pain has lasted for more than three months, the chances that the syndrome has developed are higher still.
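As a rough illustration only (a sketch, not a diagnostic tool), the screening rule described above can be written as a simple predicate; the thresholds used are the commonly cited ones of at least 11 of 18 painful tender points plus more than three months of pain:

```python
# Illustrative sketch of the tender-point screening rule described in
# the article. Thresholds are the commonly cited criteria; this is not
# medical advice and real diagnosis involves many more factors.
def likely_fibromyalgia(painful_tender_points: int,
                        pain_duration_months: float) -> bool:
    """True if >= 11 of the 18 tender points are painful AND
    chronic pain has persisted for more than 3 months."""
    return painful_tender_points >= 11 and pain_duration_months > 3

print(likely_fibromyalgia(12, 6))  # True
print(likely_fibromyalgia(9, 6))   # False: too few tender points
print(likely_fibromyalgia(12, 2))  # False: pain too recent
```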
Beyond pain, there can be many other symptoms: those typical of chronic depression or of irritable bowel syndrome, restless sleep and insomnia, dizziness, headaches, impaired cognitive function, and so on. Every patient is different, and two patients may develop completely different sets of symptoms.
Women are affected most often, but this is not a gender-exclusive condition; men can develop it as well. Furthermore, people of all age groups can be affected, from children to the elderly.
Recent studies have shown that younger people are harder hit by fibromyalgia than other age groups: even when the same symptoms appear in older patients, younger ones perceive them as more intense.
The same studies show that the younger patients surveyed were more often smokers, but had a lower body mass index (BMI). This comes as a surprise, especially since older patients tend to be in poorer physical condition, which might have been expected to make the symptoms affect them more.
Physical condition and fibromyalgia are strongly connected, since the syndrome is often linked to poor dietary quality and lack of exercise.
The new discovery is all the more surprising when you consider that those most affected by fibromyalgia are also those who, at least on the surface, show the fewest nutritional problems (BMI being an important indicator here).
Furthermore, the same study links fibromyalgia to a history of abuse, something that has long been suspected. Because the syndrome is also linked with chronic depression, this finding reinforces the idea that fibromyalgia and chronic depression occur together in the large majority of cases.
On the other hand, the same research shows that the youngest group (patients up to 39 years of age) had suffered from fibromyalgia symptoms for a shorter time than the older groups.
The fact that new discoveries are made regularly in fibromyalgia research should be good news, especially since the syndrome is still so poorly understood, even in medical circles. So far, researchers and medical professionals have not reached a consensus on the causes that lead to its development.
On the one hand, some maintain that the syndrome is caused by the way the nerves in the human body perceive pain. On the other, there are multiple competing hypotheses in this area.
Knowing exactly what causes fibromyalgia matters, because only then will scientists be able to create a fully adequate treatment.
Currently, patients with the syndrome are treated with a combination of drugs (painkillers, and sometimes sleeping pills and antidepressants) and alternative remedies: they are advised to eat healthier, take certain supplements, exercise regularly at as low an impact as they need, and try practices such as Tai Chi, yoga and acupuncture.
If you believe you are showing symptoms of fibromyalgia, it is very important to consult a specialist. Only a medical professional can run tests and ask the right questions to determine what your real diagnosis may be.
50 Years After Measles Vaccination Was Developed, US Threat Still High
Lawrence LeBlond for redOrbit.com – Your Universe Online
In the 1950s and early 1960s, Dr. John Franklin Enders, known as “The Father of Modern Vaccines,” worked with the aid of Dr. Thomas C. Peebles, then of Children’s Hospital Boston, and Samuel L. Katz, MD, professor emeritus of Duke University, to develop a new vaccine that would rescue America from one of the most contagious diseases in the world: measles.
Now, fifty years after the approval of the extremely effective measles vaccine, the disease still poses a threat to domestic and global health security. On average, 430 children die each day due to measles infection around the world. In 2011, there were about 158,000 deaths attributed to measles.
In the US, measles was largely eliminated in 2000, and a new study published in the journal JAMA Pediatrics shows that elimination was sustained through at least 2011; elimination is defined as the absence of continuous disease transmission for longer than 12 months.
Mark J. Papania, MD, MPH, of the US Centers for Disease Control and Prevention (CDC), and colleagues, warn, however, that importation continues on an international scale and that American doctors should suspect measles in children with high fever and rash, especially after international travel or contact with foreign visitors, and should report suspected cases immediately to local health departments.
Before the US had an effective measles vaccine, the highly infectious disease was a year-round threat. Prior to 1963, nearly every American child became infected with measles; each year, as many as 48,000 people were hospitalized, 7,000 had seizures, 1,000 suffered permanent brain damage or deafness, and about 500 died.
Today, people infected with measles continue to cause outbreaks in areas with unvaccinated people, which often include young children. Though largely eliminated in the US, measles is still a serious illness worldwide, with one in five infected children being hospitalized. US healthcare professionals report about 60 cases per year; however, the 2013 outbreak has been the worst in years – 175 cases at last count – nearly all linked to foreign travel.
“A measles outbreak anywhere is a risk everywhere,” said CDC Director Tom Frieden, MD, MPH. “The steady arrival of measles in the United States is a constant reminder that deadly diseases are testing our health security every day. Someday, it won’t be only measles at the international arrival gate; so, detecting diseases before they arrive is a wise investment in U.S. health security.”
Health experts note that eliminating measles on a global scale has benefits that reach far beyond the number of lives saved each year. Stopping measles in its tracks can also help researchers stop other diseases from running rampant through societies. The CDC and its partners have been building a global health security network that can easily be scaled up to deal with multiple emerging threats.
Thanks to global outreach, one in five countries can now rapidly detect, respond to, or prevent global health threats caused by emerging diseases. By improving disease response overseas, which includes strengthening surveillance and lab systems, training healthcare workers and building facilities to investigate outbreaks, we can make the world – and the United States – a safer, more secure place.
“There may be a misconception that infectious diseases are over in the industrialized world. But in fact, infectious diseases continue to be, and will always be, with us. Global health and protecting our country go hand in hand,” Dr. Frieden said.
There are at least five sources that are a threat to today’s health security. These include emergence and spread of new microbes; globalization of travel and food supply; the rise of drug-resistant pathogens; acceleration of biological science capabilities and the risk that these capabilities may lead to inadvertent or intentional release of pathogens; and concerns of terrorist acquisition, development and use of biological agents.
“With patterns of global travel and trade, disease can spread nearly anywhere within 24 hours,” Dr. Frieden said in a statement. “That’s why the ability to detect, fight, and prevent these diseases must be developed and strengthened overseas, and not just here in the United States.”
Dr. Samuel Katz, who played a major role in the development of the measles vaccine in 1963, is being honored by the CDC 50 years after that historic achievement. At a ceremony celebrating Katz’s work in the fight against measles, global health leaders are taking the occasion to highlight the domestic importance of global health security, how far the world has come in reducing the burden of measles, and the prospects for eradicating the disease worldwide.
While measles can be eliminated, the fact that it is so contagious means that very high vaccination coverage is required to stop it in its tracks and prevent sustained outbreaks. Such efforts are already underway: the CDC has been part of a global vaccination campaign, and since 2001 more than a billion children have been vaccinated against measles. Over the past decade, these vaccinations have averted some 10 million deaths – a fifth of all deaths prevented by modern medicine.
“The challenge is not whether we shall see a world without measles, but when,” Dr. Katz said.
“No vaccine is the work of a single person, but no single person had more to do with the creation of the measles vaccine than Dr. Katz,” said Alan Hinman, MD, MPH, Director for Programs, Center for Vaccine Equity, Task Force for Global Health. “Although the measles virus had been isolated by others, it was Dr. Katz’s painstaking work passing the virus from one culture to another that finally resulted in a safe form of the virus that could be used as a vaccine.”
While the work to eradicate measles on a global scale falls in the hands of health experts, such as those with the CDC, anyone and everyone can join the fight to end measles transmission. People can start by visiting their healthcare provider and making sure they are up to date on all their vaccinations, including for measles.
Possible Link Between Autistic Behavior, Gut Bacteria Discovered
redOrbit Staff & Wire Reports – Your Universe Online
Scientists have discovered a possible link between the symptoms of autism spectrum disorders (ASD) and changes occurring in the gut bacteria of mice, according to new research appearing in Thursday’s edition of the journal Cell.
While autism is a neurodevelopmental condition typically diagnosed when people demonstrate specific behaviors such as decreased social interaction and impaired communication skills, those who are diagnosed with these disorders also often suffer from abdominal cramps and other gastrointestinal issues, the researchers said.
Using the apparent link between the gut and brain issues in ASD patients as their guide, the study authors discovered that changes in gut bacteria could influence autism-like behaviors in mice. Furthermore, after the rodents were treated with bacteria from a healthy gut, many of their behavioral abnormalities (including anxiety-like behaviors) went away. Their findings suggest that probiotics could be used to treat at least some ASD symptoms.
“Several studies have shown that the microbiota can influence a variety of behaviors, from anxiety and pain to social and emotional behavior,” Elaine Hsiao of the California Institute of Technology (Caltech) explained in a statement. “Our work is the first to demonstrate that modulating the microbiota can influence autism-related behaviors in the context of a disease model.”
“Traditional research has studied autism as a genetic disorder and a disorder of the brain, but our work shows that gut bacteria may contribute to ASD-like symptoms in ways that were previously unappreciated,” added Caltech biology professor Sarkis K. Mazmanian. “Gut physiology appears to have effects on what are currently presumed to be brain functions.”
Hsiao, Mazmanian and their colleagues studied the possible connection between gut bacteria and the brain by using a mouse model of autism, which simulated a severe viral infection known to increase the risk that a pregnant woman will give birth to an autistic child. They said that the “autistic” offspring of similarly infected pregnant mice also possessed abnormalities in their gastrointestinal systems.
Specifically, the GI tracts of these pseudo-autistic mice were “leaky,” meaning that material could pass through the intestinal wall and into the bloodstream, the researchers said. In order to determine whether or not the autism-like behaviors were influenced by the GI symptoms, the investigators treated the rodents with an experimental form of probiotic therapy involving Bacteroides fragilis.
The treatment helped fix the “leaky” gut, and after observing the treated mice, the study authors also found that their behavior had changed. For example, they were more likely to interact with other mice, tended to be less anxious, and were also less likely to engage in a repetitive digging behavior.
“The B. fragilis treatment alleviates GI problems in the mouse model and also improves some of the main behavioral symptoms. This suggests that GI problems could contribute to particular symptoms in neurodevelopmental disorders,” Hsiao said. The research team now hopes to test the probiotic treatment on the behavioral symptoms of human autism within the next two years.
“This probiotic treatment is postnatal, which means that the mother has already experienced the immune challenge, and, as a result, the growing fetuses have already started down a different developmental path,” added Caltech biology professor Paul Patterson, a co-author of the study. “In this study, we can provide a treatment after the offspring have been born that can help improve certain behaviors. I think that’s a powerful part of the story.”
Breakthrough Study Shows How Mosquitoes Smell Us, May Lead To Better Repellants
Brett Smith for redOrbit.com – Your Universe Online
Many a pleasant summer’s evening has been ruined by the onset of swarms of mosquitoes, and studies have shown that the flying pests are drawn to both the carbon dioxide we exhale and the scent of our skin.
A new study, published in the journal Cell, has found that mosquitoes actually use the same olfactory mechanism to detect both carbon dioxide and skin odors.
“It was a real surprise when we found that the mosquito’s CO2 receptor neuron, designated cpA, is an extremely sensitive detector of several skin odorants as well, and is, in fact, far more sensitive to some of these odor molecules as compared to CO2,” said Anandasankar Ray, an associate professor of entomology at The University of California, Riverside. “For many years we had primarily focused on the complex antennae of mosquitoes for our search for human-skin odor receptors, and ignored the simpler maxillary palp organs.”
To confirm that cpA plays a role in detecting human odor, the researchers chemically shut down the activity of the receptor neuron in Aedes aegypti, a species of mosquito known to spread dengue fever. When the researchers then exposed the mosquitoes to human foot odor, they found the insects’ attraction was greatly reduced compared with a control group.
The study team also screened nearly half a million compounds to identify several that block or trigger cpA neurons. The researchers noted two chemicals in particular: ethyl pyruvate, a fruity-scented cpA blocker used as a flavor agent in food, and cyclopentanone, a minty-smelling cpA trigger used as a flavor and fragrance agent. The cpA-inhibitor ethyl pyruvate was found to substantially lessen the mosquitoes’ attraction to a human arm, while the cpA-trigger cyclopentanone readily attracted mosquitoes to a trap set up by the study team.
“Such compounds can play a significant role in the control of mosquito-borne diseases and open up very realistic possibilities of developing ways to use simple, natural, affordable and pleasant odors to prevent mosquitoes from finding humans,” Ray said. “Odors that block this dual-receptor for CO2 and skin odor can be used as a way to mask us from mosquitoes. On the other hand, odors that can act as attractants can be used to lure mosquitoes away from us into traps.”
“These potentially affordable ‘mask’ and ‘pull’ strategies could be used in a complementary manner, offering an ideal solution and much needed relief to people in Africa, Asia and South America – indeed wherever mosquito-borne diseases are endemic,” Ray added. “Further, these compounds could be developed into products that protect not just one individual at a time but larger areas, and need not have to be directly applied on the skin.”
Conventional mosquito traps use carbon dioxide to attract mosquitoes, but generating the gas is expensive, cumbersome, and impractical in developing countries.
“The powerful experimental approaches we have developed will help us find potential solutions that we could use not only here in the United States but also in Africa, Asia, and South America, where affordability is key in the war against these diseases,” Ray concluded.
Crocodilians Use Lures To Catch Nesting Birds
Brett Smith for redOrbit.com – Your Universe Online
Alternately lethargic and viciously violent, crocodilians are not often thought of as cunning, duplicitous predators. However, a new study in the journal Ethology, Ecology and Evolution has found that these massive reptiles sometimes use sticks and twigs to lure in unsuspecting birds, particularly during nest-building season.
“Our research provides a surprising insight into previously unrecognized complexity of extinct reptile behavior,” suggested study author Vladimir Dinets, a research assistant professor in the Department of Psychology at the University of Tennessee. “These discoveries are interesting not just because they show how easy it is to underestimate the intelligence of even relatively familiar animals, but also because crocodilians are a sister taxon of dinosaurs and flying reptiles.”
The study focused on two crocodilian species—marsh crocodiles, or muggers, and American alligators. The research team said their study is the first to report tool use by any reptiles and the first known case of predators timing their use of lures to the seasonal activities of their prey.
Dinets said he first observed the behavior in 2007, when he saw crocodiles in shallow water near the edge of a pond in India with small twigs lying across their snouts. The sticks fooled nest-building birds into wading into the water. After lying still for hours, the crocodiles would lunge when a bird neared the stick.
Dinets told The Telegraph that the muggers living in the marshes of India were able to fool some larger birds.
“On one occasion, an intermediate egret approached one of the crocodiles and stretched its neck towards the stick,” he said. “The crocodile lunged at the bird.”
To further investigate this behavior, Dinets and his colleagues observed the reptiles for one year at four sites in Louisiana, which included two bird breeding ground sites and two non-breeding sites.
The researchers saw a significant increase in alligators prominently displaying sticks on their snouts from March to May, nest-building season. The team also saw that the reptiles in the breeding, or rookery, sites used the lures during and after the nest-building season. At non-rookery locations, the reptiles used lures only during the nest-building season.
Dinets added that juvenile crocodilians did not exhibit this same behavior. He said the narrowness of their snout could be a limiting factor.
“This study changes the way crocodiles have historically been viewed,” said Dinets. “They are typically seen as lethargic, stupid and boring but now they are known to exhibit flexible multimodal signaling, advanced parental care and highly coordinated group hunting tactics.”
While American alligators are fairly common, muggers are considered ‘vulnerable’ by the International Union for Conservation of Nature (IUCN). The muggers’ range stretches from the Middle East almost to China.
A freshwater species, the mugger prefers shallow lakes, marshes and slow-moving rivers. Muggers are considered more mobile on land than most crocodilians and will travel sizable distances overland in search of better habitat. They will also chase prey on land for short distances, and they have been known to dig burrows for shelter during the dry season in southern Asia.
Robotics Engineers Study Cockroaches For Sensor Design
Brett Smith for redOrbit.com – Your Universe Online
Robotics engineers have been increasingly turning to nature for inspiration recently, and a team from the University of California, Berkeley and Johns Hopkins University has studied the antennae of cockroaches to develop a system that uses locomotion as a means of controlling the position of a passive sensor, according to their report in The Journal of Experimental Biology.
The researchers said they were inspired by observations of cockroach antennae bending as the insects run, which they do to prevent crashing into walls. According to study author Jean-Michel Mongeau, the positioning of the antennae may be more of a result of passive forces than the cockroaches’ nervous system.
“When animals move slowly they rely mostly on their nervous system for accomplishing tasks, but as the animals are pushed to more extreme performances they face potential constraints in their nervous system, for example sensory conduction delays,” explained Mongeau, a researcher at UC Berkeley.
In the study, the researchers placed blind cockroaches into an arena filmed by two high-speed cameras. A gentle prod from a researcher sent the cockroaches scurrying along a wall with a 30 to 60 degree bend in the middle. The cockroaches’ antennae often projected out straight as they tracked along a smooth acrylic wall. However, the antennae behaved differently along a wooden wall.
“When the wall becomes rougher, which you could think of as more ecologically relevant for this animal, the antenna would bend backwards almost all of the time, in a sort of inverted J-shape,” Mongeau said.
After measuring the body-to-wall distance, the researchers realized that the backward bending of the antenna positioned the cockroaches farther from the wall and prevented them from crashing into it, a protection that cockroaches with straight antennae lacked.
“After these findings, we became interested in understanding the mechanism behind the antenna changing shape,” Mongeau said. “We hypothesized that very tiny tactile hairs on the antenna would potentially be able to engage with, and stick to, (a rough) surface, and when that is coupled with forward motion this would be sufficient to make the antenna flip.”
To test their theory, the team chose to remove these little tactile hairs – a process that turned out to be more difficult than it would have seemed.
“The first thing I tried to do was use tiny forceps to pluck the hairs out, but that turned out to be impossible because these hairs are very robust and they’re embedded within the exoskeleton,” Mongeau said. “After going through several rounds of trial and error, or mostly error, I decided to try a laser system that burns these little hairs at the tip.”
After performing the laser hair removal, the hairless antennae rarely bent backwards, even as they were dragged along a wall.
The team then applied their findings to a robotics design and found they were able to get their artificial antenna to bend in a similar fashion. The researchers also noted that a slight change to the orientation of the hairs can cause the antenna to fully curl over into an inverted C, rendering it useless.
Scientist In ‘Drowning Polar Bear’ Controversy Clears Name And Reaches A Settlement
Brett Smith for redOrbit.com – Your Universe Online
Alaska climate scientist Charles Monnett has settled a lawsuit with the US Department of the Interior over the now-famous ‘drowning polar bear’ controversy. Monnett alleged that the agency had tried to silence him to protect its agenda.
Monnett was temporarily suspended in 2011 during an inspector general’s inquiry into a polar bear research contract he oversaw while working with what is now the Bureau of Ocean Energy Management (BOEM). An employee within the Department of the Interior asserted that the scientist had wrongfully leaked government records and that he and a collaborator had intentionally omitted or supplied false data in a paper documenting the drowning of polar bears.
In the paper, Monnett and a colleague reported seeing four dead polar bears while conducting an aerial survey in 2004. The bears were seen floating in the water after a storm and were presumed drowned while trying to swim long distances between ice packs. The Monnett paper concluded that drowning-related polar bear deaths may become more prevalent in the future if the regression of Arctic ice continues. The paper is credited with galvanizing the climate change movement around the plight of drowning polar bears.
The federal inquiry ultimately found no evidence of scientific misconduct. However, Monnett was chastised for the improper release of emails that would eventually be used by an appeals court to stop an Arctic oil and gas exploration plan approved by BOEM.
The scientist was eventually allowed to return to work, but his work focusing on the Arctic had been reassigned, according to Jeff Ruch, executive director of the advocacy group Public Employees for Environmental Responsibility (PEER), which helped Monnett file a complaint last year.
According to details of the settlement released by PEER, Monnett will receive $100,000 but cannot work for the Interior Department for five years. He also agreed to retire, effective Nov. 15. For its part, the federal agency agreed to vacate the letter of reprimand and give Monnett a certificate for his work on the contested project.
“This agency attempted to silence me, discredit me and our work and send a chilling message to other scientists at a key time when permits for oil and gas exploration in the Arctic were being considered,” Monnett said in a statement released by PEER. “They failed on the first two goals, but I believe that what they did to me did make others afraid to speak up, even internally.”
“Following over two years of hell for me and my family, my name has been cleared and the accusations against the scientific findings in our paper have been shown to be groundless,” Monnett added. “However, I can no longer in good conscience work for an agency that promotes dishonesty, punishes those who actually stand up for scientific integrity, and that cannot tolerate scientific work not pre-shaped to serve its agenda.”
“Dr. Monnett made it clear that he wanted to return to meaningful scientific work again but could not foresee that being possible anymore inside Interior,” Ruch said in the same statement. “If there was any doubt, the five-year employment ban on such a well-qualified, award-winning scientist makes it unmistakably clear that independent scientific views are not welcome in any corner of the Department of Interior.”
Missing Brain ‘Brake’ Could Be Source Of Phobias And Anxiety Disorders
redOrbit Staff & Wire Reports – Your Universe Online
A team led by researchers from the Medical University of Vienna has discovered one possible source of anxiety disorders and severe phobias – a missing inhibitory connection or “brake” in the brain.
When experienced at a manageable level, fear can make people alert and help protect them against danger, but when it becomes disproportionate it can disrupt an individual’s sensory perception and reduce happiness. Now the investigative team has found a possible trigger located in the amygdala and the orbitofrontal cortex in the frontal lobe, which together serve as a control center of sorts for emotional regulation.
In healthy subjects, the researchers found that this circuit provides negative feedback that keeps fear responses in check and produces calmness. Functional magnetic resonance imaging (fMRI) scans of individuals with social phobias reportedly showed the opposite: those men and women possessed differences in an essential inhibitory connection, which could help explain why they have such difficulty keeping their fears and anxieties in check.
Lead researcher Christian Windischberger and colleagues were also able to discover how the parts of the brain that are involved in processing emotions can influence one another. The study participants were shown a collection of “emotional faces” such as laughing, crying, happiness and anger while undergoing fMRI scans. As those expressions were being viewed, neuronal activity was triggered in the brain, the researchers explained.
While the test subjects looked no different from one another, the healthy ones were able to maintain their calmness despite the emotional nature of the images thanks to their mental “brake.” On the other hand, the brains of those who suffered from social phobias were deeply influenced by the images, as very strong neuronal activity was observed by Windischberger’s team.
“We have the opportunity not only to localize brain activity and compare it between groups, but we can now also make statements regarding functional connections within the brain,” said primary author Ronald Sladky. “In psychiatric conditions especially, we can assume that there are not complete failures of these connections going on, but rather imbalances in complex regulatory processes.”
The study, which appears in the latest edition of the journal Cerebral Cortex, should lead to an improved understanding of these neuronal mechanisms and might help medical experts develop new methods of treating anxiety disorders and phobias. The goal, the researchers said, is to better understand what impact drugs and psychotherapy have on the networks involved so that people can get a better grasp on their fears.
Harvard-Smithsonian Study Says iPads Can Help Students Learn
Peter Suciu for redOrbit.com – Your Universe Online
There’s not exactly an app for it, but according to researchers, the iPad can actually help some students learn science. The tablet computer apparently helps students grasp some nuances of science better than traditional classroom instruction does.
This is according to a new study by researchers at the Harvard-Smithsonian Center for Astrophysics.
One example of how the iPad could be used is in judging the scale of the universe. While traditional classrooms may use a basketball to represent the Earth and a tennis ball for the moon, most students find it hard to believe that the scale would put the balls nearly 30 feet apart! With an iPad’s 3D simulation, students are actually able to grasp the unimaginable emptiness of space much better.
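The scale-model arithmetic behind that classroom demonstration is easy to check. A minimal back-of-the-envelope sketch follows, using round, approximate figures for the Earth, the moon, and a standard basketball (roughly 24 cm across); all values are assumptions for illustration, not figures from the study:

```python
# Scale-model arithmetic: if the Earth is a basketball, how big is the
# moon and how far away does it sit? All real-world figures are rounded.

EARTH_DIAMETER_KM = 12_742
MOON_DIAMETER_KM = 3_475
EARTH_MOON_DISTANCE_KM = 384_400

BASKETBALL_DIAMETER_M = 0.24  # stands in for the Earth (assumed size)

# A single scale factor maps every real distance into the model.
scale = BASKETBALL_DIAMETER_M / EARTH_DIAMETER_KM  # metres per kilometre

moon_model_diameter_m = MOON_DIAMETER_KM * scale      # tennis-ball sized
separation_m = EARTH_MOON_DISTANCE_KM * scale         # model separation
separation_ft = separation_m / 0.3048                 # convert to feet

print(f"Model moon diameter: {moon_model_diameter_m * 100:.1f} cm")
print(f"Separation: {separation_m:.1f} m ({separation_ft:.0f} ft)")
```

With these round numbers the model moon comes out tennis-ball sized (about 6.5 cm) and the balls sit roughly 7 meters, or about 24 feet, apart – the same order of magnitude as the figure quoted above, and far enough to surprise most students.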
This study was conducted as educators increasingly face the question of whether tablet computers have a place in the classroom. The findings suggest that the iPad – as well as other tablet computers – could improve students’ understanding of challenging scientific concepts such as astronomical scale.
“These devices offer students opportunities to do things that are otherwise impossible in traditional classroom environments,” said the study’s leader Matthew H. Schneps in a statement. “These devices let students manipulate virtual objects using natural hand gestures, and this appears to stimulate experiences that lead to stronger learning.”
The study, which will be published in the January 2014 issue of Computers and Education, had Schneps and his colleagues consider gains in learning amongst 152 high-school students who used iPads to explore simulated space, and compared the findings to 1,184 students who relied on the more traditional classroom approaches. The researchers further focused on questions that were dominated by strong misconceptions including the understanding of scale in space.
The researchers reported that most traditional approaches produced less gain in understanding, whereas the iPad classrooms reported strong gains in understanding. Students also reportedly struggled with concepts of scale when learning ideas in biology, chemistry, physics and geology. This suggested that the iPad-based simulations could be beneficial for teaching concepts in many other scientific fields that go beyond astronomy.
The study further highlighted that student understanding improved with as little as 20 minutes of iPad use, and that, combined with guided instruction, the devices could produce even more dramatic gains in overall student comprehension.
“While it may seem obvious that hands-on use of computer simulations that accurately portray scale would lead to better understanding,” added Philip Sadler, a co-author of the study, “we don’t generally teach that way.”
Instead, he said, instruction all too often makes use of models and drawings that distort the scale of the universe, and this “leads to misconceptions.”
However, further research may be required to answer questions that this study itself raises.
“The results of the Smithsonian/Harvard research are certainly interesting, but the study also sparks some questions yet unanswered, including how computer simulations that accurately portray scale might be applied to other areas, how much experience with touch enabled tablets and phones the participants had prior to the experiments, and whether the results were particular to the device – tablet – or manufacturer – Apple – involved,” Charles King, principal analyst at Pund-IT, told redOrbit. “Overall, I’d say the project offers hope for new, potentially beneficial avenues to explore in the study of certain sciences. However, I also believe we’ll see additional, related studies that deliver more detailed, finely-grained results.”
The students in the iPad study attend Bedford High School in Bedford, Mass., currently one of only a few school systems around the country that equip all students with iPad tablets. The school reports that it has seen improvements among students beyond those measured in the Smithsonian-Harvard study.
“Since we began using iPads, we have seen substantial gains in learning, especially in subjects like math and science,” said Henry Turner, principal at Bedford High School.
Fishing Has Reduced Vital Seaweed Eaters By More Than Fifty Percent
Scripps Institution of Oceanography, UC San Diego
In the first global assessment of its kind, a science team led by researchers at Scripps Institution of Oceanography at UC San Diego has produced a landmark report on the impact of fishing on a group of fish known to protect the health of coral reefs. The report, published in the journal Proceedings of the Royal Society B (Biological Sciences), offers key data for setting management and conservation targets to protect and preserve fragile coral reefs.
Beyond their natural beauty and tourist-attraction qualities, coral reefs offer economic value estimated at billions of dollars for societies around the world. Scripps Master’s student Clinton Edwards, his advisor Jennifer Smith, and their colleagues at the Center for Marine Biodiversity and Conservation at Scripps, along with scientists from several international institutions, have pieced together the first global synthesis on the state of plant-eating fish at coral reef sites around the world. These herbivorous fish populations are vital to coral reef health due to their role in consuming seaweed, making them known informally as the “lawnmowers” of the reef. Without the lawnmowers, seaweeds can overgrow and out-compete corals, drastically affecting the reef ecosystem.
Among their findings, the researchers found that populations of plant-eating fish declined by more than half in areas that were fished compared with unfished sites.
“One of the most significant findings from this study is that we show compelling evidence that fishing is impacting some of the most important species on coral reefs,” said Smith. “We generally tend to think of fishing impacting larger pelagic fishes such as tuna but here we see big impacts on smaller reef fish as well and particularly the herbivores. This is particularly important because corals and algae are always actively competing against one another for space and the herbivores actively remove algae and allow the corals to be competitively dominant. Without herbivores, weedy algae can take over the reef landscape. We need to focus more on protecting this key group of fishes around the globe if we hope to have healthy and productive reefs in the future.”
“While these reef fish are not generally commercial fisheries targets,” said Edwards, “there is clear evidence from this study that fishing is impacting their populations globally.” Edwards, a UC San Diego graduate, recently completed his master’s thesis at Scripps, where he said his experience and the opportunities he was given to conduct research were unparalleled.
The researchers also found that fishing alters the entire structure of the herbivore fish community, reducing the numbers of large-bodied feeding groups such as “grazers” and “excavators” while boosting numbers of smaller species such as algae-farming territorial damselfishes that enhance damaging algae growth.
“These results show that fished reefs may be lacking the ability to provide specific functions needed to sustain reef health,” said Edwards.
“We are shifting the herbivore community from one that’s dominated by large-bodied individuals to one that’s dominated by many small fish,” said Smith. “The biomass is dramatically altered. If you dive in Jamaica you are going to see lots of tiny herbivores because fishers remove them before they reach adulthood. In contrast, if you go to an unfished location in the central Pacific the herbivore community is dominated by large roving parrotfishes and macroalgal grazers that perform many important ecosystem services for reefs.”
The authors argue that such evidence from their assessment should be used in coral reef management and conservation, offering regional managers data to show whether key herbivores are fished down too low and when they’ve successfully recovered in marine protected areas.
“This assessment allows us to set management goals in different regions across the globe,” said Smith. “Regional managers can use these data as a baseline to set targets to develop herbivore-specific fisheries management areas. We should be using these important fish as a tool for reef restoration. On reefs where seaweed is actively growing over reefs, what better way to remove that seaweed than to bring back those consumers, those lawnmowers?”
In addition to Edwards and Smith, coauthors include Brian Zgliczynski and Stuart Sandin from Scripps; Alan Friedlander of the U.S. Geological Survey; Allison Green of the Nature Conservancy; Marah Hardt of OceanInk; Enric Sala of the National Geographic Society; Hugh Sweatman of the Australian Institute of Marine Science; and Ivor Williams of the Pacific Islands Fisheries Science Center.
The research was supported by the National Science Foundation and NOAA through the Comparative Analysis of Marine Ecosystem Organization (CAMEO) program.
Android Builder Andy Rubin Now In Charge Of Google Robotics Project
Bryan P. Carpender for redOrbit.com – Your Universe Online
Google has been up to something.
First, Andy Rubin, the man behind Google’s Android smartphone revolution, unexpectedly stepped down from his post as Senior Vice President of Mobile and Digital Content this past March to take on a nebulous new role at the company, generating whispers and speculation.
Then it was recently revealed that, over the past six months, Google had been stealthily acquiring seven different technology firms in the US and Japan – firms specializing in robotics.
Now, we’re beginning to get a glimpse at what Google has been up to; it is launching a new robotics effort. And handpicking the man who built the Android software to spearhead Google’s new “moonshot” attempt to create a new generation of robots is a virtual no-brainer.
Andy Rubin has built his career on his passion for robotics. He was a robotics engineer for German manufacturer Carl Zeiss before spending several years at Apple Computer in the 1990s, where he started as a manufacturing engineer and went on to develop interfaces and operating systems for mobile devices. He also co-founded the firm behind the T-Mobile Sidekick before co-founding Android Inc. in 2003; the startup received financial backing from Google, which went on to purchase it in 2005.
His passion for building intelligent machines is no secret.
“I have a history of making my hobbies into a career,” Mr. Rubin told the New York Times. “This is the world’s greatest job. Being an engineer and a tinkerer, you start thinking about what you would want to build for yourself.”
In short: he's the right man for the job. He also holds seventeen patents for his inventions, which should dispel any doubts about his qualifications.
So now that Google has unveiled the man leading the new robotics project, we want to know what Google's plans are. Not surprisingly, the company is being very tight-lipped, refusing to give up any information regarding specific plans for the new venture, but it's a safe bet that it's more than a fleeting side project.
Regarding a specific timeline for the project, Rubin was circumspect, offering only this: “Like any moonshot, you have to think of time as a factor. We need enough runway and a 10-year vision.”
At this point, it’s likely the new project is skewing away from the direct consumer, instead focusing on opportunities in manufacturing and supply chain logistics – two areas providing huge opportunity, as they are not being served by existing robotic technologies, instead relying heavily on manual work with comparatively little automation.
“The opportunity is massive. There are still people who walk around in factories and pick things up in distribution centers and work in the back rooms of grocery stores,” says Andrew McAfee, a principal research scientist at the MIT Center for Digital Business, regarding the potential for Google’s new venture.
Rubin feels robotics hardware has advanced to the point where issues such as mobility and the movement of hands and arms are no longer obstacles. He acknowledged, however, that areas including software and sensors still have a way to go, though he seems confident that such breakthroughs are forthcoming. He compared the new undertaking to the Google X self-driving car project, which began in 2009.
“The automated car project was science fiction when it started,” Rubin said. “Now it is coming within reach.”
Google has yet to determine if it will keep the effort inside the Googleplex or if it will give it its own separate identity by spinning it off into its own subsidiary. For now, the robotics team will be based in Palo Alto, California, with offices in Japan.
The seven strategic acquisitions by Google include tech companies Schaft, a team of former Tokyo University roboticists making a humanoid robot; Meka, which makes robotic manipulators intended to work side by side with humans; Industrial Perception, a startup focusing on computer vision and robots capable of loading and unloading trucks; Redwood Robotics, a maker of robotic arms; and Holomni, which makes powered multi-directional caster wheels that can drive vehicles.
Also on the roster are Autofuss and its sister company, Bot & Dolly, whose robotic arms are used in cinema, most recently helping accomplish the stunning visuals in the film “Gravity” by controlling the camera with high precision and even automating part of the set.
While we await more details about Google’s robotics ambitions, we can rest assured that the new venture has the “thumbs-up” from the highest levels.
Google CEO Larry Page took to Google+ yesterday to publicly share his optimism: “I am excited about Andy Rubin’s next project. His last big bet, Android, started off as a crazy idea that ended up putting a supercomputer in hundreds of millions of pockets. It is still very early days for this, but I can’t wait to see the progress.”
Neither can we.
Jeff Bezos’ Space Company Blue Origin Tests Rocket Engine
This is shaping up to be quite a week for Jeff Bezos, as the Amazon founder announced on Tuesday that his aerospace company Blue Origin successfully tested its new hydrogen- and oxygen-fueled engine, designed to carry a ship, crew and cargo on suborbital flights. The announcement came just two days after Bezos revealed that Amazon has been testing unmanned delivery drones.
In the test, the engine performed a full-mission duty cycle – thrusting at 110,000 pounds in a 145-second boost period, shutting down for about four and a half minutes to replicate a rocket coasting through its highest, sub-orbital point, then restarting and throttling down to 25,000 pounds of thrust to replicate a controlled vertical landing. So far, the Blue Origin engine has conducted more than 160 starts and 9,100 seconds of operation at the company’s test facility near Van Horn, Texas.
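A quick back-of-envelope check of the cumulative figures above (a derived estimate, not a number Blue Origin reports): 9,100 seconds of operation over more than 160 starts works out to an average burn of under a minute per start, though individual burns clearly vary, as the 145-second boost in the full-mission test shows.

```python
# Back-of-envelope figure implied by the cumulative test numbers
# quoted above; the per-start average is a derived estimate, not a
# figure reported by Blue Origin.
total_seconds = 9_100   # total BE-3 run time reported
starts = 160            # "more than 160 starts"
avg_burn = total_seconds / starts
print(f"Average burn per start: ~{avg_burn:.0f} s")  # ~57 s
```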
“Blue Origin has made steady progress since the start of our partnership under the first Commercial Crew Development round,” said Phil McAlister, NASA’s director of Commercial Spaceflight Development. “We’re thrilled to see another successful BE-3 engine test fire.”
“Working with NASA accelerated our BE-3 development by over a year in preparation for flight testing on our New Shepard suborbital system and ultimately on vehicles carrying humans to low-Earth orbit,” added Rob Meyerson, president and program manager of Blue Origin. “The BE-3 is a versatile, low-cost hydrogen engine applicable to NASA and commercial missions.”
The test is just the latest development in the decade-long Blue Origin project dubbed New Shepard. The launch system is being designed to send tourists and researchers to the edge of space, more than 62 miles above the Earth's surface. The company eventually plans to develop a system capable of sending astronauts to the International Space Station (ISS), with launches slated for sometime after 2018.
“The BE-3 will gain extensive flight heritage on our New Shepard suborbital system prior to entering service on vehicles carrying humans to low-Earth orbit,” Meyerson said. “Given its high-performance, low cost, and reusability the BE-3 is well suited for boost, upper-stage and in-space applications on both government and commercial launch systems.”
The Bezos-founded company is currently vying with California-based SpaceX, which was founded by PayPal co-founder Elon Musk, for the use of NASA’s Launch Complex 39A at the Kennedy Space Center in Florida. The space agency has announced that it wants to pass the maintenance and operation of the historic launch pad to a commercial organization as quickly as possible, but Blue Origin has contested NASA’s current method for selecting its choice. The Government Accountability Office (GAO) is expected to rule on Blue Origin’s appeal by December 12.
SpaceX has said it plans to use 39A for a range of commercial launches, including its current missions to resupply the ISS. On the other hand, Blue Origin has talked about turning the pad into a multi-user launch facility.
“We believe we’ve submitted a proposal that provides the fullest commercial use of the facility,” Meyerson told NBCNews.com on Tuesday. “If the outcome is that our proposal is not selected, we have many other options, and we would look at those other options.”
Frequent Mammography Screening Could Affect Breast Cancer Prognosis
Radiological Society of North America
In a study of screening mammography-detected breast cancers, patients who underwent more frequent screening mammography had a significantly lower rate of lymph node positivity—cancer cells in the lymph nodes—compared to women with longer intervals between screening exams. Results of the study were presented today at the annual meeting of the Radiological Society of North America (RSNA).
In its earliest stages, breast cancer is confined to the breast and can be treated by surgically removing the cancer cells. As the disease progresses, breast cancer cells may spread to the lymph nodes and then to other areas of the body.
“On its pathway to other places in the body, the first place breast cancer typically drains into before metastasizing is the lymph nodes,” said Lilian Wang, M.D., assistant professor of radiology at Northwestern University/Feinberg School of Medicine in Chicago, Ill. “When breast cancer has spread into the lymph nodes, the patient is often treated both locally and systemically, with either hormone therapy, chemotherapy, trastuzumab or some combination of these therapies.”
Historically, healthcare organizations, such as RSNA and the American Cancer Society (ACS), have recommended annual screening with mammography for women beginning at age 40. However, in 2009, the United States Preventive Services Task Force (USPSTF) announced a controversial new recommendation for biennial screening for women between the ages of 50 and 74.
“Our study looks at what would happen if the revised guidelines issued by USPSTF were followed by women,” Dr. Wang said.
The retrospective study, conducted at Northwestern Memorial Hospital, included 332 women with breast cancer detected by screening mammography between 2007 and 2010. The women were divided into three groups based on the interval between their screening mammography exams: less than 1.5 years (207 patients), 1.5 to three years (73 patients), and more than three years (52 patients).
Controlling for age, breast density, high-risk status and a family history of breast cancer, the researchers determined that women in the less than 1.5-year interval group had the lowest lymph node positivity rate at 8.7 percent. The rate of lymph node involvement was significantly higher in the 1.5- to three-year and over three-year interval groups at 20.5 percent and 15.4 percent, respectively.
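As a sanity check, the reported percentages are consistent with the group sizes above; the per-group positive counts below (18, 15 and 8) are inferred from the rates and group sizes, and are an assumption rather than figures stated by the researchers.

```python
# Lymph node positivity rates by screening interval, recomputed from
# the group sizes given in the article. The positive counts are
# inferred from the reported percentages (an assumption, not
# published data).
groups = {
    "less than 1.5 years": (18, 207),
    "1.5 to three years": (15, 73),
    "more than three years": (8, 52),
}
for interval, (positive, total) in groups.items():
    rate = 100 * positive / total
    print(f"{interval}: {positive}/{total} = {rate:.1f}%")
```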
“Our study shows that screening mammography performed at an interval of less than 1.5 years reduces the rate of lymph node positivity, thereby improving patient prognosis,” Dr. Wang said. “We should be following the guidelines of the American Cancer Society and other organizations, recommending that women undergo annual screening mammography beginning at age 40.”
Sudan Reports Widespread Yellow Fever Outbreak, 14 Dead
Lawrence LeBlond for redOrbit.com – Your Universe Online
Sudan’s Federal Ministry of Health (FMOH) has notified the World Health Organization (WHO) of an outbreak of yellow fever that is affecting 12 localities in West and South Kordofan states.
A total of 44 suspected cases and 14 deaths have been reported from October 3 to November 24, 2013 in the localities of Lagawa, Kailak, Muglad and Abyei in West Kordofan and Elreef Alshargi, Abu Gibaiha, Ghadir, Habila, Kadugli, Altadamon, Talodi and Aliri in South Kordofan.
Field investigations carried out by the FMOH revealed that the initial suspected cases were reported among seasonal workers coming from the eastern states of Sudan who had traveled to West Kordofan for work in October. Subsequent cases were reported among locals in both West and South Kordofan states, following the arrival of the workers.
Blood samples collected during the field investigation tested positive for yellow fever by IgM ELISA at the FMOH’s National Public Health Laboratory in Khartoum. The samples were retested at the Pasteur Institute in Senegal, which confirmed yellow fever. Subsequent seroneutralization (PRNT) testing by WHO researchers also confirmed the presence of yellow fever.
The field investigation also found evidence of Aedes aegypti mosquitoes in the areas where the infected persons were found. A. aegypti is one vector that can sustain transmission of yellow fever.
WHO is assisting the FMOH to strengthen surveillance efforts and to conduct active case searches in and around the region. So far no suspected cases have been reported from any of the areas outside of where the initial outbreak occurred. The FMOH is now organizing a massive vaccination program against yellow fever in the affected areas to prevent further infection.
According to a WHO report, it is estimated that yellow fever infects between 840,000 and 1.7 million people in Africa each year, resulting in about 29,000 to 60,000 deaths.
An outbreak last year in the Darfur region of Sudan resulted in 849 suspected cases and 171 deaths. Around five million people were vaccinated against yellow fever in the five states of Darfur following the outbreak. In 2005, a yellow fever outbreak was also reported from the South Kordofan state, resulting in 615 suspected cases and 183 deaths. A vaccination campaign followed targeting about 1.6 million people in the region.
Yellow fever, also known as Yellow Jack, is an acute viral hemorrhagic disease that affects about 20 percent of the population in areas where it is commonly found. Most cases cause only a mild infection with fever, headache, chills, back pain, loss of appetite, nausea and vomiting. In these cases, the infection generally lasts three or four days.
In about 15 percent of cases, sufferers can enter a toxic phase of the disease with recurring fever accompanied by jaundice due to liver damage and abdominal pain. Bleeding in the mouth, eyes and gastrointestinal tract is also common at this stage and vomit may contain blood. This toxic phase is lethal in about 20 percent of cases, making the overall mortality rate for the disease about three percent. In severe epidemic outbreaks, mortality may rise to 50 percent or more.
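The overall mortality figure follows directly from the two stage probabilities quoted above, as a quick calculation shows:

```python
# Overall yellow fever mortality implied by the stage figures above:
# ~15% of cases enter the toxic phase, and ~20% of those are fatal.
p_toxic = 0.15            # fraction of infections reaching the toxic phase
p_fatal_given_toxic = 0.20  # fatality rate within the toxic phase
overall_mortality = p_toxic * p_fatal_given_toxic
print(f"Overall mortality: {overall_mortality:.0%}")  # Overall mortality: 3%
```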
Those who survive the infection usually do so without any organ damage and gain lifelong immunity to the virus.
More Coronavirus Detected In Camels, Humans In Middle East
Lawrence LeBlond for redOrbit.com – Your Universe Online
Middle East Respiratory Syndrome coronavirus (MERS-CoV) has been detected in a herd of camels in a barn in Qatar. The virus in the camels has been linked to two confirmed human infections from October 18 and 29, 2013.
Researchers from Erasmus Medical Center in the Netherlands have confirmed the presence of MERS-CoV in three camels in a herd of 14 animals with which both human cases had contact. The confirmations were made with support from the National Institute of Public Health and Environment (RIVM), the World Health Organization (WHO) and the Food and Agriculture Organization (FAO).
As a precautionary measure, the 14 camels from the farm have been isolated. All of the camels were asymptomatic or had mild symptoms when the samples were taken and remained so during the following 40 days. Researchers have also screened all farm workers and others who were in close contact with the two human cases – lab tests were negative for all contacts.
This latest finding indicates the high likelihood that camels can be infected with MERS-CoV. However, the researchers maintain there is still too little information to indicate what role the camels have in the possible transmission of the virus. The Supreme Council of Health is closely working with RIVM and Erasmus to test additional samples from other animal species and from the environment around the barn. Also, tests are being conducted on a national level to investigate the infection risk among individuals who are in close contact with animals.
People who are at high risk of infection from the MERS-CoV should avoid contact with animals when visiting farms where the virus is thought to potentially be circulating. For all others, general hygienic measures should be followed, including regular hand washing before and after touching animals, avoiding contact with sick animals and following food hygiene guidelines.
ADDITIONAL CASES
On Dec 1, 2013 WHO was informed of an additional three lab-confirmed cases of infection with MERS-CoV in the United Arab Emirates.
All three cases are from one family in Abu Dhabi – a 32-year-old mother who died on Dec 2; a 38-year-old father who is in critical condition; and their eight-year-old son, who has mild respiratory symptoms. The earliest onset of illness was on Nov 15. There was no travel history linked to either the mother or father and there has been no known contact with another confirmed case or with animals.
During hospitalization, the mother gave birth to a child. The eight-year-old son’s illness was detected during an epidemiological investigation of family contacts. Further tests are ongoing for others who were in close contact with the family, including healthcare workers and the newborn.
The WHO was also informed of two deaths from previously-confirmed cases of MERS-CoV. Both patients were from Qatar and died on Nov 15 and Nov 21, respectively.
Globally, from September 2012 to date, WHO has been informed of a total of 163 lab-confirmed cases of MERS-CoV, including 71 deaths.
Study Highlights Discouraging Collapse Of Saharan Wildlife
April Flowers for redOrbit.com – Your Universe Online
The world’s largest tropical desert, the Sahara, has suffered a catastrophic collapse of its wildlife populations, according to a new study led by the Wildlife Conservation Society (WCS) and Zoological Society of London (ZSL).
The research team consisted of 40 scientists from 28 international organizations. They assessed 14 desert species, finding that half of those are regionally extinct or confined to one percent or less of their historical range. It is difficult to be certain of the causes of these declines because of a chronic lack of studies across the region due to political instability. The team suggests, however, that over-hunting is likely to have played a major role.
The Bubal hartebeest is completely extinct; the scimitar-horned oryx is only found in captivity; and the African wild dog and African lion have disappeared from the Sahara. The study, published in Diversity and Distributions, reveals that other species have fared only marginally better. The dama gazelle and addax are gone from 99 percent of their range; the leopard has lost 97 percent of its range; and the Saharan cheetah has disappeared from 90 percent.
The only species that still inhabits most of its historical range is the Nubian ibex, but even this species is classified as vulnerable due to numerous threats including widespread hunting.
More conservation support and scientific attention for the desert is necessary, according to the team. They note that 2014 is the halfway point in the United Nations Decade for Deserts and the Fight against Desertification and the fourth year of the United Nations Decade for Biodiversity.
“The Sahara serves as an example of a wider historical neglect of deserts and the human communities who depend on them,” said Sarah Durant of WCS and ZSL. “The scientific community can make an important contribution to conservation in deserts by establishing baseline information on biodiversity and developing new approaches to sustainable management of desert species and ecosystems.”
Some governments in the region have recently made large commitments to protect the Sahara. For example, Niger has established the massive 37,451 square-mile Termit and Tin Toumma National Nature Reserve, which harbors most of the world’s 200 or so remaining wild addax and one of a handful of surviving populations of dama gazelle and Saharan cheetah. With the support of the Chadian government, the scientists hope to reintroduce the scimitar-horned oryx into the wild in the Ouadi Rimé-Ouadi Achim Game Reserve.
Meningitis Outbreak At UCSB Results In Double Foot Amputation
Lawrence LeBlond for redOrbit.com – Your Universe Online
A bacterial meningitis outbreak has been sweeping across Princeton University, infecting at least eight students since earlier this year, and a separate outbreak has now struck the University of California, Santa Barbara.
UCSB authorities say a fourth student has come down with a strain of meningitis similar to the one circulating at Princeton in New Jersey. While students on the east coast have all received care and made full recoveries, at least one UCSB student has had both feet amputated after the infection cut off the blood supply to his limbs.
“He’s from my hometown. I hope he is doing well,” UCSB student David Burkow, told ABC News. “It’s just kind of scary because there is a constant fear.”
The Santa Barbara County Public Health Department declared on Monday that all four students became ill last month. Since the outbreak, more than 300 students who had contact with those who fell ill were given antibiotics.
Now, the university has urged students to refrain from attending social events, including all sorority and fraternity parties, to try to keep the disease from spreading, according to an official statement on Monday.
Bacterial meningitis can be caught by kissing, coughing and prolonged contact, as well as through cup sharing, which is common at college parties. Symptoms include headache, fever, stiff neck, nausea and vomiting.
Fraternity member Jared Dinges told ABC News’ Sydney Lupkin that he has a few “rules of thumb” for keeping clear of the potentially deadly disease.
“Just don’t share bottles,” he said. “Try to avoid kissing new girls — things like that. Just be safe.”
While UCSB officials are on high alert, experts with the US Centers for Disease Control and Prevention (CDC) have maintained that the California meningitis outbreak is not connected to the Princeton outbreak, as the two strains do not share a similar “fingerprint.”
As for the Princeton outbreak, the FDA has approved an internationally available meningitis vaccine to be shipped to the US for the Ivy League students. Princeton is expected to receive about 6,000 doses for its students around December 9, 2013.
Meningitis kills at least one in 10 people who contract it and leaves about 20 percent of survivors with permanent health problems, including limb loss and mental retardation, according to the CDC.
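Taken together, the two CDC figures imply that lasting damage affects roughly 18 percent of all cases; the combined share is a derived estimate, not a number the CDC reports directly.

```python
# Combining the CDC figures quoted above: ~1 in 10 cases are fatal,
# and ~20% of survivors have permanent health problems. The implied
# share of all cases with lasting damage is a derived estimate.
p_death = 0.10
p_problems_given_survival = 0.20
p_problems_overall = (1 - p_death) * p_problems_given_survival
print(f"Survivors with permanent problems: {p_problems_overall:.0%}")  # 18%
```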
UCSB officials are urging all students to seek medical care at the first sign of symptoms. While the school has stopped short of banning all social events, it is making sure students are aware that meningitis can be transmitted through close social contact, including sharing alcoholic drinks or cigarettes, eating off the same plate, or sharing utensils.
“All the existing cases appear to have had close personal contact,” the school noted in an email to the campus community.
Meningitis is not as common in the US as it once was, but figures reported by NBC News show that between 800 and 1,200 cases are still reported annually in the US. Vaccines that are available for meningitis in the US only cover four strains: A, C, Y and W-135. The B strain vaccine, which is available in Europe and Australia, is not yet approved for use in the general US community.
NBC News also reported that another case of meningitis from New Jersey-based Monmouth University is from the C strain of meningitis. That person remains hospitalized, but is recovering, according to a college spokeswoman.
Ocean Currents May Shape Europa’s Icy Shell
April Flowers for redOrbit.com – Your Universe Online
Researchers from The University of Texas at Austin’s Institute for Geophysics (UTIG), the Georgia Institute of Technology, and the Max Planck Institute for Solar System Research have revealed that the subsurface ocean on Jupiter’s moon Europa might have deep currents and circulation patterns. These currents and patterns have heat and energy transfers capable of sustaining life, a finding of relevance to the search for life in our solar system.
Europa is believed to be one of the most likely planetary bodies in our solar system to sustain life. Magnetometer readings from the Galileo spacecraft that detected signs of a salty, global ocean below the moon’s icy shell reinforce this belief.
Due to a lack of direct measurements, scientists must rely on magnetometer data and observations of the surface to infer oceanic conditions below the icy shell.
One of Europa’s most prominent features is chaos terrains, or regions of disrupted ice on the surface. Krista Soderlund of UTIG explains that chaos terrains, which are concentrated in Europa’s equatorial region, could result from convection in Europa’s ice shell, accelerated by heat from the ocean. Diapirs, or warm compositionally buoyant plumes of ice that rise through the shell, might be formed by the heat transfer and possible marine ice formation.
The research team created a numerical model of Europa’s ocean circulation, finding that warm rising ocean currents near the equator and subsiding currents in latitudes closer to the poles could account for the location of chaos terrains and other features of Europa’s surface. Coupled with regionally more vigorous turbulence, such a pattern intensifies heat transfer near the equator. This could help initiate upwelling ice pulses that create features such as the chaos terrains.
“The processes we are modeling on Europa remind us of processes on Earth,” says Soderlund. A similar process has been observed in the patterns creating marine ice in parts of Antarctica, she noted.
The patterns observed on Jupiter and Saturn contrast with the current patterns modeled for Europa. On Jupiter and Saturn, bands of storms form because of the way their atmospheres rotate. Europa’s ocean physics seem to have more in common with the oceans of the “ice giants,” Uranus and Neptune. These oceans show signs of 3D convection.
“This tells us foundational aspects of ocean physics,” notes Britney Schmidt, assistant professor at the Georgia Institute of Technology. If the study’s hypothesis is correct, says Schmidt, it reveals that Europa’s oceans are very important as a controlling influence on the surface ice shell. This provides proof of the concept that ice-ocean interactions are important to Europa.
“That means more evidence that the ocean is there, that it’s active, and there are interesting interactions between the ocean and ice shell,” says Schmidt, “all of which makes us think about the possibility of life on Europa.”
Soderlund, who has studied icy satellites throughout her science career, is anticipating the opportunity to test her hypothesis through future missions to the Jovian system. The European Space Agency’s (ESA) Jupiter Icy moons Explorer (JUICE) mission will provide tantalizing glimpses into the characteristics of the ocean and ice shell through two flyby observations. A concept under study at NASA, the Europa Clipper mission, would complement the view with global measurements.
Soderlund says she appreciates the chance “to make a prediction about Europa’s subsurface currents that we might know the answer to in our lifetimes — that’s pretty exciting.”
The findings of this study were published online in Nature Geoscience.
Image 2 (below): Zonal flows in Europa-like ocean simulation. Credit: University of Texas Institute for Geophysics