Virginia Has The Fastest Internet In The US, While Alaska Has The Slowest

redOrbit Staff & Wire Reports – Your Universe Online
The official Virginia tourism website proudly touts the state’s beaches, theme parks, and family-friendly activities as reasons to plan a trip there, but a new report from communications and cloud services provider Broadview Networks adds another potential draw: the highest Internet connection speeds in America.
According to ABC News reporter Alyssa Newcomb, the South Atlantic state that claims to be “for lovers” is home to the fastest broadband connections in the US, followed by Delaware, Massachusetts, Rhode Island and Washington DC.
During the first quarter of 2014, Virginia had an average data transfer speed of 13.7 megabits per second (Mbps), the Broadview Networks report said. Delaware and Massachusetts tied for second with speeds of 13.1 Mbps, followed by Rhode Island with 12.9 Mbps, DC with 12.8 Mbps, Washington state with 12.5 Mbps and New Hampshire with 12.3 Mbps.
Alaska had the slowest Internet connections with an average data transfer speed of 7 Mbps – roughly half that of Virginia’s, according to Newcomb. Arkansas, Kentucky and Montana were only slightly better with 7.3 Mbps connection speeds, followed by West Virginia (7.5 Mbps), Mississippi (7.6 Mbps) and New Mexico (7.6 Mbps).
“Internet speed has been a hot topic in the news as of late, with major providers intentionally throttling speeds and the heated debate surrounding net neutrality,” the company explained. “It isn’t surprising that so many people are interested in the topic. After all, the internet occupies much of an American’s daily life… But we all know that simply having internet access doesn’t cut it – the speeds have to be fast and consistent.”
Using data obtained from Akamai’s “State of the Internet” report, the IT company created a color-coded map illustrating the average Internet speeds by state. States appearing green on the map are those with the fastest average connection speeds, and the darker the shade of green, the faster the speed. Those colored in red have the slowest Internet speeds.

The states with the slowest connection speeds tend to have either smaller populations, less money, or both, according to HotHardware’s Paul Lilly. Lilly also noted that many of those near the bottom of the list (indicated with darker shades of red) are concentrated in the southern part of the US. Similarly, some of the fastest speeds in the country can be found in the Northeast.
“Nearly every state has shown steady improvement,” said Niraj Chokshi of The Washington Post. “All but two saw speeds accelerate between the last quarter of 2013 and the first quarter of this year, and even in the case of the laggards the declines were a modest four percent slowdown in Virginia and two percent in Louisiana. Only Vermont saw slower speeds in the first quarter than a year earlier.”
The original Akamai report said that 26 of the 50 US states had average speeds topping 10 Mbps, which is categorized as having “high broadband,” said CNET reporter Dara Kerr. Conversely, no states fell within the “low broadband” category of average speeds of less than 4 Mbps. The overall average speed of the US was 10.5 Mbps, good for 10th worldwide but well behind speed leaders South Korea (23.6 Mbps) and Japan (14.6 Mbps), she added.
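To make the tiers concrete, here is a minimal Python sketch that ranks the averages quoted in this article and applies Akamai’s cutoffs (at least 10 Mbps for “high broadband”, under 4 Mbps for “low broadband”). It covers only the states named above, not the full 50-state dataset.

```python
# Illustrative only: rank the Q1 2014 average speeds (Mbps) quoted in
# this article and label them with Akamai's broadband tiers.
speeds = {
    "Virginia": 13.7, "Delaware": 13.1, "Massachusetts": 13.1,
    "Rhode Island": 12.9, "Washington, DC": 12.8,
    "Washington (state)": 12.5, "New Hampshire": 12.3,
    "New Mexico": 7.6, "Mississippi": 7.6, "West Virginia": 7.5,
    "Montana": 7.3, "Kentucky": 7.3, "Arkansas": 7.3, "Alaska": 7.0,
}

def tier(mbps):
    # Akamai's categories as described above: "high broadband" is
    # 10 Mbps or more; "low broadband" is under 4 Mbps.
    if mbps >= 10:
        return "high broadband"
    if mbps < 4:
        return "low broadband"
    return "broadband"

for state, mbps in sorted(speeds.items(), key=lambda kv: -kv[1]):
    print(f"{state:20s} {mbps:5.1f} Mbps  ({tier(mbps)})")
```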

Researchers Conduct Low-Gravity Test Of Innovative Space Galley

redOrbit Staff & Wire Reports – Your Universe Online
One of the drawbacks of space travel is that astronauts are limited in the types of food they can eat due to the low gravity environment, but researchers from Cornell University are reportedly working on a solution that could one day allow the men and women traveling to Mars or the moon to enjoy a home-cooked meal.
According to Stacey Shackford, a writer for Cornell’s College of Agriculture and Life Sciences, postdoctoral research associates Apollo Arquiza and Bryan Caldwell recently conducted what is being called the first known partial gravity cooking demonstration.
Arquiza and Caldwell, who work in the laboratory of biological and environmental engineering associate professor Jean Hunter, boarded a zero-gravity space simulator plane in April in order to test the effectiveness of a specially-designed space galley created in collaboration with Susana Carranza of Makel Engineering in Chico, California.
As part of the demonstration, which Caldwell discussed Thursday during a Cornell University press event, the researchers conducted a series of four flights that lifted off from Houston, Texas. He and Arquiza “tossed tofu and shredded potatoes into pans of sizzling oil” and filmed the results, Shackford said.
Each flight included a short period of partial weightlessness, simulating the conditions faced by astronauts during extended stays on the moon or Mars – which the university explains have one-sixth and one-third the gravity of Earth, respectively. The plane climbed and descended in parabolic paths while dishes were being prepared.
The research team filmed the oil spatters resulting from the flight. They also placed strips of paper inside the fume hood of the galley, and dyed the oil bright red in order to help them detect and collect splatter patterns, Shackford said. The results wound up being “a bit messy,” according to the Cornell writer.
Food settled into the pan more slowly under low gravity conditions, and an increased amount of oil appeared to fall outside of it. Oil droplets also appeared to travel further away from the pan in space than they usually do in a regular Earth kitchen, likely because it took gravity longer to pull them downwards, Arquiza explained.
“Arquiza ended up with a collection of 200 red-speckled strips that might resemble evidence from a crime scene investigation, but could contribute greatly to our understanding of the basic science of cooking in space,” Shackford said. The size distribution of the particles and the distance they traveled will now be analyzed and used to develop computer models which could be used in the development of future space-based cooking technology.
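The kind of analysis described above – summarizing a droplet size distribution and the distances droplets traveled – can be sketched in a few lines. The data format below is hypothetical; the Cornell team’s actual measurement pipeline is not described in detail.

```python
# A minimal sketch of the spatter analysis described above: given
# measured droplet diameters and distances from the pan for one strip,
# summarize the size distribution and travel distance. The data format
# is hypothetical, not the Cornell team's.
import statistics

# (diameter_mm, distance_cm) per detected droplet on one strip
droplets = [(0.4, 12.0), (0.9, 30.5), (0.2, 8.1), (1.3, 45.0)]

diameters = [d for d, _ in droplets]
distances = [x for _, x in droplets]
print("median diameter:", statistics.median(diameters), "mm")
print("mean travel distance:", statistics.fmean(distances), "cm")
print("max travel distance:", max(distances), "cm")
```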
“It’s amazing to be able to marry the computational aspects of research with real-life feet-on-the-ground – or in this case, not on the ground – data to get a fuller picture of what is going on,” explained Hunter. “Understanding oil spatter in reduced gravity is a big step toward designing safe and convenient cooking facilities for future space colonies.”
The in-flight demonstration itself was the culmination of several months of research, as the team fine-tuned the details of their space galley. They made sure that the induction cooker used in the trial had the ideal power setting so the oil could be heated quickly without crossing its smoke point.
“Incorporating design elements from submarine galleys and chemical fume hoods used in labs, Arquiza and Carranza created an enclosed unit with activated charcoal filters and a fan that sucks in air from the front and draws particles away from the cook,” Shackford said.
The goal was to design a cooking system that was capable of withstanding nine Gs worth of force and keep the odors created by the frying process under control, she added. Prior to the flight, air flow models were created using computational fluid dynamics and dry-ice fog, while during the actual in-air demonstration, experts from the NASA Reduced Gravity Research program were on hand to assist the Cornell University team.
“The project is part of a larger investigation by Hunter’s lab into scientific and social aspects of food in space, including a simulated Mars mission in Hawaii to test resource use, menu fatigue and the benefits of home cooking in an enclosed environment, and a bed rest study to test the effects of simulated weightlessness on smell and taste perception,” Shackford noted.

The Economy Of Bitcoins

Samuel Schlaefli, ETH Zürich

The massive spread of the cryptocurrency Bitcoin opens up new pathways for researchers to study social action in markets, revealing interesting feedback between exchange rates and mentions in social media.

Anyone who strolls around the Kreuzberg district of Berlin can’t help but notice them – the small signs on the doors of shops and cafes reading “Bitcoins accepted”. Customers pay for their shirt or their cappuccino with their smartphone instead of with bank notes or credit cards. The digital currency Bitcoin makes all this possible. “The image of Bitcoin has changed fundamentally”, explains David Garcia, a post-doctoral researcher with the Chair of Systems Design held by Professor Frank Schweitzer. “Bitcoins used to be the preserve of hackers and computer nerds. Today, hipsters pay for drinks with them and they are accepted in the online shops of large companies”. Garcia, together with his colleagues Claudio Tessone, Pavlin Mavrodiev and Nicolas Perony, has just published a study on the social dynamics of the Bitcoin economy in the Journal of the Royal Society Interface.

Internet activity determines exchange rates

For researchers, the success of the digital currency* is a stroke of luck, as data on every transaction carried out in Bitcoin are available in anonymized form on the Internet. Consequently, Garcia and his colleagues were able to study the Bitcoin economy with appropriate algorithms. The idea saw the light of day when they noticed that the 50,000-fold increase in the market value of the digital currency in just three and a half years went hand in hand with a 10,000 percent increase in Google searches for Bitcoin. The researchers hypothesize that the increase in the value of Bitcoins is markedly accelerated by activities on the Internet, in particular searches for information and interaction in social media.

To test their hypothesis the researchers examined four different socio-economic parameters: the development of the Bitcoin user base, the price development of the currency over time, the search for information about Bitcoin on Google and in Wikipedia (more than six million inquiries) and the exchange of information about Bitcoin on Twitter (almost seven million Tweets). In fact, over the past three years the researchers established major correlations between price developments, the number of new Bitcoin users, searches on the Internet and Tweets.
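As a rough illustration of such a correlation test, the sketch below measures whether growth in search activity leads price changes. The numbers are placeholders, not the study’s Google, Wikipedia, Twitter, or price data, and the published methodology is considerably more sophisticated.

```python
# A sketch of a lagged correlation test between price returns and
# search-volume growth. Series values are placeholders only.
import pandas as pd

df = pd.DataFrame({
    "price":    [12.0, 12.4, 13.1, 15.0, 14.2, 16.8],
    "searches": [100,  115,  140,  180,  170,  220],
})

returns = df["price"].pct_change()
search_growth = df["searches"].pct_change()

for lag in range(0, 3):
    # Correlate today's return with search growth `lag` steps earlier.
    r = returns.corr(search_growth.shift(lag))
    print(f"lag {lag}: corr = {r:.2f}")
```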

At the same time, they discovered two positive feedback loops which essentially reproduce the laws of the “analog” economy. The growing popularity of Bitcoin on the Internet leads to growing demand which, in turn, encourages activity in social media; this all results in a higher price for Bitcoins. The second feedback loop concerns the user base: the more users join the Bitcoin transaction network, the higher the price, because Bitcoins are not issued in line with demand but in an automated fashion at regular intervals. This means the available amount can be calculated at any time. One negative feedback effect, however, is surprising: prior to a major slump in the price of the currency, there was a dramatic increase in Bitcoin activity on the Internet. “Big changes in Internet and social media activities lead to substantial price fluctuations”, comments Nicolas Perony, co-author of the article.

Understanding markets and social dynamics

Perony is convinced that the quantitative analysis of social phenomena on the Internet has major potential. “With digital currencies we can observe aspects of the economy that we didn’t have access to with cash. This gives us greater understanding of how markets actually function.” According to the authors, the methodology described in the article could be applied to other areas of society, too. The Bitcoin mining network, which issues the currency, already harnesses three hundred times more computing power than the 500 most powerful supercomputers combined. “The big question is how such a high-performance system could be used for collaborative activities which go beyond the production of money”, comments Perony. One possibility would be, for instance, collaborative research in a global network, or the decentralized ownership of specific goods managed by a global network. Bitcoins do not belong to anyone; buyers merely acquire the right to use a specific amount of them. This study outlines the tools for accurately quantifying and analyzing the social dynamics of collaborative systems of this kind in the future.

* The meteoric rise of Bitcoin: The Bitcoin success story began in 2008 with an article about an alternative, digital currency published under the pseudonym Satoshi Nakamoto. In July 2010 Bitcoins were traded for the first time on the Internet exchange Mt. Gox at a rate of US$ 0.06 for 1 Bitcoin, putting the total value of all Bitcoins at US$ 277,000. By the end of 2013 the market value of all issued Bitcoins had climbed to more than US$ 14 billion, with more than US$ 1,000 paid for one Bitcoin during price spikes. Today, over four million people use the digital currency. Bitcoins are traded in euros, dollars and Chinese renminbi. Unlike conventional currencies, Bitcoin has no central bank with a monopoly on printing money. New Bitcoins are generated by what is known as mining via a global computer network – currently at a rate of 25 Bitcoins every ten minutes. Transactions are likewise verified and carried out on this network. Even the bankruptcy of important Bitcoin trading exchanges and negative headlines about money laundering and drug purchases on the Internet were not able to undermine confidence in the currency. A few days ago the PC giant Dell announced that it will henceforth accept Bitcoins as payment for products in its online shop.

Further reading

Garcia D, Tessone CJ, Mavrodiev P, Perony N. The digital traces of bubbles: feedback cycles between socio-economic signals in the Bitcoin economy. J. R. Soc. Interface 11: 20140623 (2014); doi:10.1098/rsif.2014.0623 (published 6 August 2014)

Is There a Connection Between Fibromyalgia and Diabetes?

Did you realize that fibromyalgia occurs almost four times more often in people with diabetes than in the general population? Research has also shown that keeping a tight rein on the blood sugar levels of diabetic patients greatly reduces the risk of developing fibromyalgia at some point in their lives.

A 2003 study published in the journal Rheumatology International showed that around 15 to 18 percent of diabetic patients also have fibromyalgia, which suggests that there is a link between the two.

The Connection between Fibromyalgia and Diabetes

The interesting thing about the Rheumatology International study was that the connection was much stronger in individuals who had type 1 diabetes than in those who had type 2, even though the overall association between fibromyalgia and diabetes was higher in those with type 2.

The reason this association is so interesting is that type 1 diabetes is considered an autoimmune disease, though its exact trigger is not known. The autoimmunity underlying type 1 diabetes, together with its strong association with fibromyalgia (around four times the rate seen in the general population), suggests that those who argue that autoimmunity causes fibromyalgia could be headed in the right direction. Of course, before this is officially established, more research must be done.

Another very interesting connection between diabetes and fibromyalgia is that controlling blood sugar levels is directly connected with the likelihood of developing fibromyalgia. The higher a diabetic patient’s hemoglobin A1C levels are, the higher their likelihood of suffering from fibromyalgia. The hemoglobin A1C level is a measure of how well blood sugar levels are being controlled over time.
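To make concrete what A1C encodes, a widely cited conversion from the ADAG study maps A1C to an estimated average glucose level. The sketch below is illustrative only and is not clinical guidance.

```python
def estimated_avg_glucose(a1c_percent):
    """Estimated average glucose in mg/dL from hemoglobin A1C (%).

    Uses the widely cited ADAG study regression (eAG = 28.7 * A1C - 46.7).
    Illustrative only; not a substitute for clinical guidance.
    """
    return 28.7 * a1c_percent - 46.7

print(estimated_avg_glucose(7.0))   # ~154 mg/dL, a common treatment target
```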

Also interesting to note is that the high blood sugar levels are associated with an increase in the severity of fibromyalgia symptoms in those individuals with both disorders. The symptoms that are substantially increased in those who have poor control of their blood sugar include: fatigue, headaches, the number of tender points, and disturbances in sleep.


Diagnosis of Fibromyalgia and Diabetes

Reaching a diagnosis of fibromyalgia in patients who are already dealing with diabetes can be very difficult, because diabetes can mimic any or all of the symptoms that are associated with fibromyalgia. Because there is no test that can definitively demonstrate the presence of fibromyalgia, the only way to reach a diagnosis is through careful clinical evaluation of the symptoms.

Those who are suffering from diabetes, especially female patients, must be aware that fibromyalgia occurs about four times more often in those who have diabetes than in the general public. Because of this, symptoms that accompany diabetes but don’t improve when blood sugar levels are under control should be looked at more closely, because they could point to fibromyalgia.

One of the most common sources of confusion between the symptoms of diabetes and fibromyalgia is that sensory symptoms, such as shooting nerve pain, tenderness, and general pain, are typically experienced in both conditions. However, a study conducted in 2011 in Germany showed that the distribution of the symptoms was actually different. This study took a look at the following seven symptoms:

  • Prickling Pain
  • Burning Pain
  • Numbness
  • Waxing/Waning pain (known as attacks)
  • Pain to normal, everyday stimuli
  • Pressure Points
  • Pain to Hot Object (known as thermal)

This study showed that the way the symptoms were grouped was related directly to the disease that was responsible for them. Following is what was discovered for these combinations of symptoms.

  • Attacks and pressure points were three times more likely to be caused by fibromyalgia than diabetes.
  • Thermal and pressure points were twice as likely to be caused by fibromyalgia as diabetes.
  • Prickling and numbness were three times more likely to be caused by diabetes instead of fibromyalgia.
  • Attacks and numbness were twice as likely to be caused by diabetes as fibromyalgia.
  • Burning, pressure, and attacks showed a slight difference in cause, favoring fibromyalgia as the cause over diabetes.

The results of this study suggested that while specific symptoms can be attributed to either disorder, taking an inventory of all of the symptoms, and noting how they group together as well as their timing and severity, could help determine the exact cause. For example, attacks of pain that coincide with sensitive pressure points point toward fibromyalgia, while attacks of pain that coincide with numbness point toward diabetes.
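The reported pairings can be encoded as a simple lookup table, which makes the grouping idea concrete. This toy sketch merely restates the figures above; it is not a diagnostic tool.

```python
# The symptom pairings reported above, as a small lookup table.
# ("fibromyalgia", 3.0) means that pairing was three times more likely
# to stem from fibromyalgia than from diabetes. Illustrative only.
pair_odds = {
    frozenset({"attacks", "pressure points"}): ("fibromyalgia", 3.0),
    frozenset({"thermal", "pressure points"}): ("fibromyalgia", 2.0),
    frozenset({"prickling", "numbness"}):      ("diabetes", 3.0),
    frozenset({"attacks", "numbness"}):        ("diabetes", 2.0),
}

def likely_cause(symptom_a, symptom_b):
    return pair_odds.get(frozenset({symptom_a, symptom_b}),
                         ("unclear", None))

print(likely_cause("attacks", "pressure points"))  # ('fibromyalgia', 3.0)
```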

Treatment of Fibromyalgia and Diabetes

The separate treatments for fibromyalgia and diabetes are quite different. However, treating one can (and often does) improve the symptoms of the other. This is especially true of blood sugar control, which has a major impact on fibromyalgia symptoms: managing your diabetes correctly reduces the pain, sleep disturbances, and fatigue associated with fibromyalgia.

Additionally, as with most other conditions, exercise is a great way to reduce the effects and symptoms of both fibromyalgia and diabetes. Exercise reduces the need for insulin and keeps you from reaching blood sugar extremes. It also serves to ward off the heart disease and stroke that are common among people with diabetes. When it comes to fibromyalgia, exercise serves to reduce pain, improve sleep, and lift mood.

Aerobic exercise has so many benefits across so many different diseases that not taking part in it means missing a very large and very important piece of any treatment regimen. Experts recommend about 15 to 20 minutes of light aerobic activity a minimum of three times per week.

Fibromyalgia and diabetes seem to be more closely related than most other diseases and disorders. Studies have shown that fibromyalgia occurs around four times more often in those who have diabetes than it does in the general population, and researchers are working diligently to uncover the reason for this link. In the meantime, individuals suffering from both conditions should remember that treating them together is the best way to ensure maximum improvement.

How Vitamin D Can Help Fibromyalgia

Five million people in the United States of America alone have fibromyalgia, and if you are one of those people, then you know what it’s like to live with widespread pain, fatigue, and sometimes even depression.

Today, the cause of fibromyalgia is still unknown, but medical professionals and researchers suspect that it could be due to trauma, an abnormal pain response, or a virus.  While it will take time to discover the cause of fibromyalgia, we do know that low Vitamin D levels and fibromyalgia frequently go hand in hand.

Vitamin D Deficiency and Fibromyalgia

We should probably clear up something to begin with: there is currently no evidence to suggest that having a lack of sunlight exposure is related to developing the symptoms of fibromyalgia.  But we do know that some people who have fibromyalgia also have a Vitamin D deficiency.  So what exactly is the link between the two?

Numerous studies have revealed that nearly half of all people with fibromyalgia also have a lack of Vitamin D.  It is very possible that the low levels of Vitamin D are simply due to other factors, such as fibromyalgia itself.  So there is very little evidence that a lack of Vitamin D is the cause, or even one of the causes, of fibromyalgia.

How Does Vitamin D Help Fibromyalgia?

Studies have also shown that getting enough Vitamin D into your system might at least lower the pain and symptoms of fibromyalgia.  The part of fibromyalgia that Vitamin D can most significantly help is inflammation.  In order for that to work, your Vitamin D levels should be held at around 40 ng/ml.

Vitamin D has actually been used as a treatment for fibromyalgia multiple times.  For example, in the United States, many people who have fibromyalgia also had low levels of Vitamin D, and many of them took various approved drugs and treatments to cope with the pain.

But those who got more Vitamin D into their system showed the greatest improvement; in fact, many people who took approved drugs designed to treat fibromyalgia saw little to no improvement.

Those who took plenty of Vitamin D also never saw their symptoms become worse, but a high intake of Vitamin D still proved not to be an all-out cure for fibromyalgia: those who took it did not see a steady improvement in their day-to-day experience of the condition, and the pain was still there.  At best, it is most accurate to assume that getting plenty of Vitamin D won’t make your symptoms or pain any worse.  So overall, if you have fibromyalgia, it’s a good idea to get plenty of Vitamin D as at least part of your overall treatment plan.


Vitamin D Supplements

Getting plenty of Vitamin D certainly won’t make your fibromyalgia any worse, and if anything, it should stop your pain and symptoms from increasing in intensity and help your inflammation.  One option to get plenty of Vitamin D into you is to take Vitamin D supplements.

There is no official cure for fibromyalgia, and all of the ‘cures’ out there might only make your problems worse.  So you can’t think of Vitamin D supplements as a cure for fibromyalgia, but you can think of them as part of your overall treatment plan, and an alternative to heading outside each day to get plenty of sunlight.

There has been research conducted to discover whether Vitamin D supplements are a viable alternative to getting Vitamin D from sunshine, including a randomized test with thirty people diagnosed with fibromyalgia (who also had a Vitamin D deficiency).  The fibromyalgia patients all took supplements for six months, and the researchers then compared their outcomes with those of patients who did not take Vitamin D supplements.

The results showed that those who took the Vitamin D supplements reported a lessening of their pain and fatigue compared with those who did not take the supplements.  This supports the theory that Vitamin D is a safe, low-risk treatment for fibromyalgia, and while it shouldn’t be your only course of action, it should certainly be part of your overall plan.

Taking Vitamin D supplements is especially important in the fall and winter, as the sun is not out as often, and even if you do spend time outside, you still might not get enough of the necessary sunlight.  You’ll want to consult with your doctor or medical professional to see how much supplemental Vitamin D you should take, and whether your levels need to be monitored so you can tell whether you are getting enough.

Conclusion

All in all, a lack of Vitamin D probably isn’t one of the causes of fibromyalgia.  But getting plenty of Vitamin D is important to improve your pain and symptoms of fibromyalgia, and if you can’t get enough Vitamin D via sunlight, then you’ll want to consider Vitamin D supplements as an alternative option.

If you have fibromyalgia, one of the first things that you’ll want to do is get your Vitamin D levels checked out.  If they are low, you need to either get plenty of sunlight or see what Vitamin D supplements would work the best for you.  But even if your Vitamin D levels currently are stable, you’ll still want to monitor and watch them closely, and maybe even keep some supplements on hand for if or when you need them.

If you can do that, then you should find your pain and symptoms to be less than what they could have been.  But as we have mentioned many times throughout this article, you will want to include Vitamin D as only a part of your overall treatment plan.  You can’t expect getting plenty of Vitamin D to be your only course of action for alleviating your pain, so have other treatment plans and maybe other drugs and vitamins to take too.

New Study Probes The Surprising Role Of Parents In Distracted Driving Among Teens

redOrbit Staff & Wire Reports – Your Universe Online

Parents who preach about the dangers of distracted driving are often on the other end of the phone when teenagers talk while behind the wheel, according to research presented Friday at the annual convention of the American Psychological Association (APA).

As part of the study, Petaluma, California-based cognitive psychologist Dr. Noelle LaVoie and her colleagues surveyed and interviewed over 400 drivers between the ages of 15 and 18 from 31 states and asked why they continued to use cellphones while driving despite warnings about the hazards such activity presents.

According to Sharon Jayson of USA Today, the study authors found that more than half of teenagers who said they had conducted a cellphone conversation while driving (53 percent) were talking with either their mom or dad at the time. In contrast, only 46 percent were chatting with a friend, though the opposite was true with texting.

“Teens said parents expect to be able to reach them, that parents get mad if they don’t answer their phone and they have to tell parents where they are,” Dr. LaVoie said in a statement. “Parents need to understand that this is not safe and emphasize to their children that it’s not normal or acceptable behavior. Ask the question, ‘Are you driving?’ If they are, tell them to call you back or to find a spot to pull over so they can talk.”

The researchers conducted in-person interviews with 13 teens who were between the ages of 15 and 17 and had either a learner’s permit or a driver’s license, and asked them about various driving hazards, including texting or talking on the phone while driving. Every teen who said he or she had talked on the phone while operating a motor vehicle said they had been talking to parents, while just 20 percent said that they had talked to friends.

They also had 395 people complete surveys, with 37 percent of those with permits and 50 percent of 18-year-olds with unrestricted driver’s licenses admitting that they had taken a phone call from mom or dad while driving. Sixteen percent of 18-year-olds and eight percent of the 15- to 17-year-olds said that they had texted a parent while driving, though the researchers said that teens were more likely to send messages to their friends than to their folks.

“It was just very surprising to see how directly parents are involved. What we do know for sure is if parents would not call their teens while they’re (kids) driving, it would reduce teen distracted driving,” LaVoie told Jayson, noting that one of the things the teens interviewed discussed was that their parents “used their cell phone while driving.”

“A lot of parents aren’t really aware of how important it is to be a good role model and how dangerous it is for their teen to answer a cellphone while driving,” the study author added in an interview with Maureen Salamon of HealthDay. “There is certainly [prior research] showing that parents might not be modeling the best behavior for teens, and we know a lot of parents talk on the phone while driving. But this was a real shock.”

US Centers for Disease Control and Prevention (CDC) statistics indicate that approximately 2,700 teens aged 16 to 19 are killed in automobile crashes each year, and another 280,000 are treated and released from emergency departments following such crashes, Salamon said. Furthermore, distracted driving is responsible for 11 percent of all vehicular fatalities among teens, and 21 percent of those crashes involved cellphones, according to a 2013 US National Highway Traffic Safety Administration (NHTSA) report.


New IBM-Developed Processor Functions Like The Human Brain

redOrbit Staff & Wire Reports – Your Universe Online
IBM researchers have announced the development of a new computer chip that is inspired by the brain, mimicking the way that the mind can recognize patterns utilizing a web of interconnected transistors to simulate neural networks.
The processor is named TrueNorth, and according to John Markoff of the New York Times, it contains more than 5.4 billion transistors, yet requires no more power to function than a hearing aid (just 70 milliwatts, versus the minimum of 35 watts required by current Intel processors, which have 1.4 billion transistors).
TrueNorth contains electronic “neurons” capable of signaling others when a specific type of data reaches a predetermined threshold, allowing them to work in unison to organize data into patterns, Markoff said. Using this infrastructure, the chip could ultimately be capable of performing calculations beyond the reach of a modern supercomputer, recognizing when a person is performing a specific action, or controlling the activities of a robot.
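The threshold-based signaling described here can be illustrated with a textbook leaky integrate-and-fire neuron. TrueNorth’s actual neuron model and parameters are more elaborate; this sketch only conveys the general idea.

```python
# A textbook leaky integrate-and-fire neuron: integrate input with a
# leak, fire when a threshold is reached, then reset. Not IBM's actual
# neuron model; parameters are illustrative.
def simulate(inputs, threshold=1.0, leak=0.9):
    potential, spikes = 0.0, []
    for t, x in enumerate(inputs):
        potential = potential * leak + x   # integrate input, with leak
        if potential >= threshold:         # fire when threshold reached
            spikes.append(t)
            potential = 0.0                # reset after a spike
    return spikes

print(simulate([0.3, 0.3, 0.3, 0.3, 0.0, 0.9, 0.9]))  # spikes at t=3, t=6
```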
Despite being no larger than a postage stamp, this neurosynaptic processor could also be used in self-driving vehicles and artificial intelligence systems installed on mobile devices, the AFP news agency explained. It is part of the company’s new approach to computer architecture design known as “cognitive computing.”
“We have taken inspiration from the cerebral cortex to design this chip,” IBM chief scientist for brain-inspired computing Dharmendra Modha told the news agency. He and colleagues from Cornell University and Cornell Tech detail their findings in the latest edition of the journal Science.
Modha explained that the computers we use today trace their lineage back to the “sequential number-crunching calculators” of the 1940s, with focus solely on mathematical or “left brain” processes. TrueNorth, on the other hand, attempts to mimic “right brain” functions of sensory processing by responding to visual, olfactory and other stimuli in order to learn how to respond to different situations, AFP added.
The project, which was funded by the US Defense Advanced Research Projects Agency (DARPA), could allow a chip to perform supercomputer-level calculations without needing to connect to the Internet to do so. This would allow autonomous cars to detect and solve potential accidents and other problems without needing to have a connection to Wi-Fi, and smartphones to interpret sights and smells in real time and without the need for a data connection.
“Though it is providing few details on timing, IBM says it is already talking to potential partners about ways to bring the chip to market,” said Wall Street Journal reporter Don Clark. “The company has connected multiple chips together to test potential system designs, and sees applications of the technology ranging from room-size supercomputers to floating jellyfish-shaped devices that could sense tsunamis or other aquatic conditions.”
Modha added that the company has “huge commercial ambitions” for the technology, but emphasized that the neurosynaptic chip is “not going to replace conventional computers. It is a complementary relationship.” While Intel, Qualcomm and others are conducting similar research, Cornell Tech electrical and computing engineer and project collaborator Rajit Manohar said that TrueNorth is “much closer to being usable” than their competitors’ processors.
Image 2 (below): A circuit board shows 16 of the new brain-inspired chips in a 4 x 4 array along with interface hardware. The board is being used to rapidly analyze high-resolution images. Courtesy: IBM

Sleep Deprivation "Pervasive" Problem In Astronauts Before, During Missions

redOrbit Staff & Wire Reports – Your Universe Online
Astronauts tend to suffer from a significant amount of sleep deficiency in the weeks leading up to liftoff and throughout the duration of their missions, the authors of a 10-year research project being hailed as the largest-ever analysis of sleep habits before and during space flight report in Friday’s edition of The Lancet Neurology.
As part of the study, experts from the Brigham and Women’s Hospital (BWH) Division of Sleep and Circadian Disorders, the Harvard Medical School Division of Sleep Medicine, and the University of Colorado Sleep and Chronobiology Laboratory used data from 85 astronauts, 64 of whom had participated in a combined 80 space shuttle missions and 21 of whom had taken part in International Space Station (ISS) missions.
They recorded more than 4,000 nights worth of sleep on Earth and an additional 4,200 nights of slumber in space, using both objective and subjective evaluations of sleep quality. The astronauts wore a device known as an actigraph on their wrists, which tracks sleep and wake cycles, and also kept a daily diary recording their alertness levels and their own take on how well they had slept the previous evening.
While NASA, which helped fund the research, schedules 8.5 hours’ worth of sleep per night for crew members during spaceflight, the mean duration of sleep was actually less than six hours (5.96) on shuttle missions and only slightly more (6.09) on ISS missions. Only 12 percent of sleep episodes on shuttle missions and 24 percent on ISS missions lasted at least seven hours, compared to 42 percent and 50 percent respectively during post-flight nights.
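Summaries of this kind – mean duration and the share of episodes lasting at least seven hours – are straightforward to compute from actigraphy output. The durations below are hypothetical, not the study’s data.

```python
# Hypothetical actigraphy episode durations (hours), illustrating the
# kind of summary statistics the study reports.
episodes = [5.8, 6.2, 5.5, 7.1, 6.0, 6.4, 5.9, 7.3]

mean_hours = sum(episodes) / len(episodes)
share_7h = sum(e >= 7.0 for e in episodes) / len(episodes)
print(f"mean sleep: {mean_hours:.2f} h")
print(f"episodes >= 7 h: {share_7h:.0%}")
```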
The results also suggested the problems begin before the astronauts first leave the Earth’s atmosphere, as those polled averaged less than 6.5 hours of sleep per night during training (recorded roughly 12 weeks prior to spaceflight) – approximately 30 minutes per night less than the average American adult, according to the study authors.
“In ground-based studies, we know that sleeping less than six hours is associated with performance detriments,” lead author Dr. Laura K. Barger, an associate physiologist in the BWH Division of Sleep and Circadian Disorders, told Kim Painter of USA Today.
“Sleep deficiency is pervasive among crew members,” she added in a recent statement. “It’s clear that more effective measures are needed to promote adequate sleep in crew members, both during training and space flight, as sleep deficiency has been associated with performance decrements in numerous laboratory and field-based studies.”
According to AFP reporter Richard Ingham, the study also found that three-fourths of astronauts turned to sleep aids such as zolpidem (sold under the brand names Stilnox and Ambien) and zaleplon (sold under the brand names Sonata and Andante) in order to get some extra shut-eye while in space. In fact, medications were used on more than half of the nights, and on four out of 13 shuttle missions the entire crew took sleeping pills on the same night six percent of the time.
“The ability for a crew member to optimally perform if awakened from sleep by an emergency alarm may be jeopardized by the use of sleep-promoting pharmaceuticals,” said Barger. “Routine use of such medications by crew members operating spacecraft is of particular concern, given the US Food and Drug Administration (FDA) warning that patients using sleeping pills should be cautioned against engaging in hazardous occupations requiring complete mental alertness or motor coordination.”
That FDA warning, she noted, includes “potential impairment of performance of such activities that may occur the day following ingestion of sedative/hypnotics. This consideration is especially important because all crew members on a given mission may be under the influence of a sleep promoting medication at the same time.”
Senior author Dr. Charles Czeisler, chief of the BWH Division of Sleep and Circadian Disorders, added that future space missions “will require development of more effective countermeasures to promote sleep during spaceflight in order to optimize human performance. These measures may include scheduling modifications, strategically timed exposure to specific wavelengths of light, and behavioral strategies to ensure adequate sleep, which is essential for maintaining health, performance and safety.”

The Connection between Fibromyalgia and Fever

Fibromyalgia is a chronic disease that has many different characteristics, such as pain in the spine, hips, shoulders, and neck, and yes, even a low grade fever. Fibromyalgia can occur in all individuals, regardless of age or gender, but does occur more often in women between the ages of 30 and 59. The exact cause of fibromyalgia is not known, which has made it very difficult to figure out a cure. Another complication is the fact that some fibromyalgia symptoms are also present in other diseases and disorders.

Fibromyalgia and fever are related in that an individual who is suffering from fibromyalgia can have a low grade fever at any given time. There isn’t any specific evidence connecting the two, but there is the possibility that fibromyalgia can cause a fever.

Individuals who have fibromyalgia often have a weakened immune system, and infections are easily picked up while symptoms are flaring; these infections can lead to a much higher fever. Additionally, the severe muscle pain and tingling common in individuals with fibromyalgia can make you feel feverish: your skin gets very hot and you may or may not start sweating. Yet even though you feel hot and think you have a fever, the thermometer may read normal. So you could feel hot without really having a fever.

Fever related to fibromyalgia symptoms can also cause your glands to swell up. This happens in both individuals with fibromyalgia and those who are only experiencing a fever. Additionally, joint pain could flare up if you’re experiencing a mild fever.

However, the fevers are typically very mild and are not the main source of discomfort for individuals with fibromyalgia. In fact, most individuals don’t even realize that they have a fever, being more focused on the symptoms that are causing them the most extreme discomfort.

In some cases, individuals with fibromyalgia could start shivering before, during, or after experiencing a fever. There isn’t really any known medication that can curb a fever with fibromyalgia, so the individual will have to continue with their normal course of treatment, despite having a low grade (or high grade) fever.

The only way to avoid having a fever with fibromyalgia symptoms is by taking care to reduce the other symptoms of your fibromyalgia. Make sure to take part in regular physical exercise in order to reduce muscle stiffness. Additionally, make sure that you are consuming a proper diet that will give you plenty of energy. You will also want to make sure that you get adequate sleep, which will help you to be relaxed and help to reduce the symptoms of fatigue and such.

Fibromyalgia and Fever

Consuming a balanced diet, especially one with lots of fruits and vegetables, can help to reduce the symptoms of fibromyalgia and fever. Vegetables have lots of fiber, which is essential for increasing metabolism, and fruits have vitamins and minerals that help with the repair and rebuilding of muscles. As an individual with fibromyalgia, you should take care to avoid caffeine, processed foods, soft drinks, refined carbohydrates, and other junk food.

For those individuals suffering from poor sleep and high levels of stress, medications that facilitate sleep, together with relaxation methods, can be wonderful in reducing those symptoms of fibromyalgia that contribute to feeling feverish.

Inadequate sleep and stress are also related, because people who are stressed out don’t get the proper amount of sleep, and the less sleep you’re able to get, the more likely you are to react negatively to stress. Relaxation methods such as massage will therefore help you to relax, which will in turn help you to sleep much better.

Additionally, massage can help serve to decrease pain, which helps to alleviate other symptoms. In times of extreme lack of sleep, you can use sleep medication, but be sure that you don’t become dependent upon them entirely as a way to get some sleep. Also, keep in mind that sleeping pills do come with some undesirable side effects.

In order to relieve the fever that results from the pain and other symptoms of fibromyalgia, you could consider acupuncture as an option. This will typically relieve the pain in the tender spots and may very well serve to reduce stress. Both massage and physical therapy regulate the neurotransmitters in the brain, which helps your body achieve a much higher pain threshold. This also helps the individual more effectively overcome stress and pain.

Individuals who have fibromyalgia are prone to experiencing both spells of fever and spells of chills on occasion. In some cases, individuals will have a fever the entire time that they’re suffering from fibromyalgia, but most will never experience a fever at all. Physicians say that individuals who are in the more advanced stages of fibromyalgia are less affected by fever and chills, while those in the early stages are more likely to be affected.

In conclusion, fibromyalgia is a chronic disease whose many characteristics can include a low grade fever, and while there is no specific evidence connecting the two, an individual suffering from fibromyalgia can run a mild fever at any given time. Since the exact cause of fibromyalgia is not known, managing the underlying symptoms through diet, exercise, sleep, and stress reduction remains the best way to keep fever at bay.

Physicists Eye Neural Fly Data, Find Formula For Zipf’s Law

Emory University

Physicists have identified a mechanism that may help explain Zipf’s law – a unique pattern of behavior found in disparate systems, including complex biological ones. The journal Physical Review Letters is publishing their mathematical models, which demonstrate how Zipf’s law naturally arises when a sufficient number of units react to a hidden variable in a system.

“We’ve discovered a method that produces Zipf’s law without fine-tuning and with very few assumptions,” says Ilya Nemenman, a biophysicist at Emory University and one of the authors of the research.

The paper’s co-authors include biophysicists David Schwab of Princeton and Pankaj Mehta of Boston University. “I don’t think any one of us would have made this insight alone,” Nemenman says. “We were trying to solve an unrelated problem when we hit upon it. It was serendipity and the combination of all our varied experience and knowledge.”

Their findings, verified with neural data of blowflies reacting to changes in visual signals, may have universal applications. “It’s a simple mechanism,” Nemenman says. “If a system has some hidden variable, and many units, such as 40 or 50 neurons, are adapted and responding to the variable, then Zipf’s law will kick in.”

That insight could aid in the understanding of how biological systems process stimuli. For instance, in order to pinpoint a malfunction in neural activity, it would be useful to know what data recorded from a normally functioning brain would be expected to look like. “If you observed a deviation from the Zipf’s law mechanism that we’ve identified, that would likely be a good place to investigate,” Nemenman says.

Zipf’s law is a mysterious mathematical principle that was noticed as far back as the 19th century, but was named for 20th-century linguist George Zipf. He found that if you rank words in a language in order of their popularity, a strange pattern emerges: The most popular word is used twice as often as the second most popular, and three times as much as the third-ranked word, and so on. This same rank vs. frequency rule was also found to apply to many other social systems, including income distribution among individuals and the size of cities, with a few exceptions.
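The rank-versus-frequency pattern is easy to check on any large text: under Zipf’s law, frequency is roughly proportional to 1/rank, so rank times frequency stays roughly constant. In the sketch below, corpus.txt is a placeholder for any sizable text file.

```python
# Checking the rank-vs-frequency pattern described above on any text:
# under Zipf's law, rank * frequency stays roughly constant.
from collections import Counter

text = open("corpus.txt").read().lower().split()  # any large text file
counts = Counter(text).most_common(10)

for rank, (word, freq) in enumerate(counts, start=1):
    print(f"{rank:2d}. {word:12s} freq={freq:6d}  rank*freq={rank * freq}")
```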

More recently, laboratory experiments suggest that Zipf’s power-law structure also applies to a range of natural systems, from the protein sequences of immune receptors in cells to the intensity of solar flares from the sun.

“It’s interesting when you see the same phenomenon in systems that are so diverse. It makes you wonder,” Nemenman says.

Scientists have pondered the mystery of Zipf’s law for decades. Some studies have managed to reveal how a feature of a particular system makes it Zipfian, while others have come up with broad mechanisms that generate similar power laws but need some fine-tuning to generate the exact Zipf’s law.

“Our method is the only one that I know of that covers both of these areas,” Nemenman says. “It’s broad enough to cover many different systems and you don’t have to fine tune it: It doesn’t require you to set some parameters at exactly the right value.”

The blowfly data came from experiments led by biophysicist Rob de Ruyter that Nemenman worked on as a graduate student. Flies were mounted on a rotor and spun, hundreds of times, as they watched the world go by. The moving scenes that the flies repeatedly experienced simulated their natural flight patterns. The researchers recorded when neurons associated with vision spiked, or fired. All sets of the data largely matched within a few hundred microseconds, showing that the flies’ neurons were not randomly spiking, but instead operating like precise coding machines.

If you think of a neuron firing as a “1” and a neuron not firing as a “0,” then the neural activity can be thought of as words, made up of 1s and 0s. When these “words,” or units, are strung together over time, they become “sentences.”

The neurons are turning visual stimuli into units of information, Nemenman explains. “The data is a way for us to read the sentences the fly’s vision neurons are conveying to the rest of the brain.”

Nemenman and his co-authors took a fresh look at this fly data for the new paper in Physical Review Letters. “We were trying to understand if there is a relationship between ideas of universality, or criticality, in physical systems and neural examples of how animals learn,” he says.

In order to navigate in flight, the flies’ visual neurons adapt to changes in the visual signal, such as velocity. When the world moves faster in front of a fly, these sensitive neurons adapt and rescale. These adaptions enable the flies to adjust to new environments, just as our own eyes adapt and rescale when we move from a darkened theater to a brightly lit room.

“We showed mathematically that the system becomes Zipfian when you’re recording the activity of many units, such as neurons, and all of the units are responding to the same variable,” Nemenman says. “The fact that Zipf’s law will occur in a system with just 40 or 50 such units shows that biological units are in some sense special – they must be adapted to the outside world.”

The researchers provide mathematical simulations to back up their theory. “Not only can we predict that Zipf’s law is going to emerge in any system which consists of many units responding to variable outside signals,” Nemenman says, “we can also tell you how many units you need to develop Zipf’s law, given how variable the response is of a single unit.”
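A scaled-down toy version of the mechanism can be simulated directly: draw a hidden variable, let every unit respond to it independently, and tally how often each binary “word” occurs. With only ten units the resulting curve is only approximately Zipfian (the paper’s analysis concerns dozens of units and more), but the heavy tail already appears.

```python
# Toy version of the mechanism described above: many binary units all
# responding to one hidden variable. Conditioned on the hidden value the
# units are independent; marginalizing over it yields a heavy, roughly
# Zipf-like rank-frequency curve for the binary "words". Parameters are
# illustrative, not the paper's.
import random
from collections import Counter

random.seed(0)
N_UNITS, N_SAMPLES = 10, 200_000

words = Counter()
for _ in range(N_SAMPLES):
    h = random.random()                       # hidden variable
    word = tuple(random.random() < h          # each unit fires with prob h
                 for _ in range(N_UNITS))
    words[word] += 1

for rank, (word, freq) in enumerate(words.most_common(8), start=1):
    pattern = "".join("1" if b else "0" for b in word)
    print(f"rank {rank}: pattern={pattern} freq={freq}")
```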

They are now researching whether they can bring their work full circle, by showing that the mechanism they identified applies to Zipf’s law in language.

“Letters and words in language are sequences that encode a description of something that is changing over time, like the plot line in a story,” Nemenman says. “I expect to find a pattern similar to how vision neurons fire as a fly moves through the world and the scenery changes.”

Congratulations! You Lost The Weight! But Why Are You Still Unhappy?

April Flowers for redOrbit.com – Your Universe Online
We are inundated daily with TV, print and radio commercials that tell us losing weight will make us healthier and happier. All one needs to do to have a more fulfilling, satisfying, happy life is lose weight. But are they right? A new study, published in PLOS ONE, shows that while losing weight will make you healthier, it might not make you happier.
The study examined 1,979 overweight and obese adults in the UK, finding that people who lost five percent or more of their initial body weight over a four year time period showed significant gains in physical health markers. However, these same people were more likely to report depressed moods than those who stayed within five percent of their original weight.
The researchers suggest that their findings highlight the need for clinicians to consider mental health as well as physical health when patients are losing weight. In the past, clinical weight loss trials have been shown to improve participants’ moods; however, the researchers indicate that this could be more a result of the supportive environment than of the weight loss itself. The mood improvements are seen early in such trials and are not related to the extent of the weight loss.
Weight loss does not necessarily cause depression, the authors caution, as depression and weight loss might share a common cause. The findings do show, however, that weight loss outside of clinical trial settings cannot be assumed to improve mood. They also raise questions concerning the psychological impact of weight loss.
The researchers collected their data as part of the English Longitudinal Study of Ageing (ELSA) – a study of adults 50 years or older. They excluded patients who had a clinical diagnosis of depression or a debilitating illness. Standard questionnaires were used to assess mood, and weight loss was measured by trained nurses.
Of the original cohort, 278 (14 percent) lost at least five percent of their starting weight, with an average loss of 15 pounds per person. The team adjusted the findings for serious health issues and major life events such as bereavement, which can cause both weight loss and depressed mood. Before the adjustment, the people who lost weight were 78 percent more likely to report being depressed; after the adjustment, the odds of reporting depression remained elevated at 52 percent.
“We do not want to discourage anyone from trying to lose weight, which has tremendous physical benefits, but people should not expect weight loss to instantly improve all aspects of life,” said Dr. Sarah Jackson of University College London’s (UCL) Epidemiology & Public Health in a recent statement. “Aspirational advertising by diet brands may give people unrealistic expectations about weight loss. They often promise instant life improvements, which may not be borne out in reality for many people. People should be realistic about weight loss and be prepared for the challenges.”
“Resisting the ever-present temptations of unhealthy food in modern society takes a mental toll, as it requires considerable willpower and may involve missing out on some enjoyable activities. Anyone who has ever been on a diet would understand how this could affect wellbeing. However, mood may improve once target weight is reached and the focus is on weight maintenance. Our data only covered a four year period so it would be interesting to see how mood changes once people settle into their lower weight,” she continued.
“Healthcare professionals should monitor patients’ mental as well as physical health when recommending or responding to weight loss, and offer support where necessary. People who are trying to lose weight should be aware of the challenges and not be afraid to seek support, whether from friends, family or healthcare professionals.”
Professor Jane Wardle, director of the Cancer Research UK Health Behavior Center at UCL, commented, “A recent UK survey found that 60 percent of overweight and obese adults in the UK are trying to lose weight. There are clear benefits in terms of physical health, which our study confirmed. People who lost weight achieved a reduction in blood pressure and serum triglycerides, significantly reducing the risk of heart disease. However, patients and doctors alike should be aware that there is no immediate psychological benefit and there may be an increased risk of depression.”

Researchers Adapt Origami Techniques To Create Self-Assembling Robots And Materials

redOrbit Staff & Wire Reports – Your Universe Online
Drawing inspiration from origami, the traditional Japanese art of folding paper into three-dimensional objects, researchers from MIT and Harvard University have created a way to coax flat sheets of composite materials to transform themselves into complex robots capable of performing tasks such as crawling and turning.
Writing in the latest edition of the journal Science, the study authors explained how their robot assembled itself from flat sheets of paper and shape memory polymers which have electronics embedded within them.
The composite was able to fold itself into a dynamic and functional machine in approximately four minutes, they added. Afterwards, it crawled away at a speed of more than two inches per second and was able to turn without any human help, making it the first self-folding machine capable of doing so without additional outside assistance.
“We demonstrated this process by building a robot that folds itself and walks away without human assistance,” lead author Sam Felton, a Ph.D. candidate at Harvard University’s School of Engineering and Applied Sciences and the Wyss Institute for Biologically Inspired Engineering, explained in a statement Thursday.
“Folding allows you to avoid the ‘nuts and bolts’ assembly approaches typically used for robots or other complex electromechanical devices and it allows you to integrate components like electronics, sensors, and actuators while flat,” added senior author Rob Wood, the Charles River Professor of Engineering and Applied Sciences Core Faculty Member at Harvard University’s Wyss Institute for Biologically Inspired Engineering.
The new robot is similar to a machine described by Wood and colleagues from Harvard and MIT this spring at the 2014 IEEE International Conference on Robotics and Automation. That robot self-assembled from laser-cut materials when uniformly heated, but the new unit relies on electrical leads instead of a hot plate or oven to deliver heat to the robot’s joints, thus initiating the folding process.
Erik Demaine, an MIT professor of computer science and engineering, and a member of both research teams, called the development of the new robot “exciting from a geometry standpoint, because it lets us fold more things. Because we can do the sequencing, we have a lot more control. And it lets us make active folding structures. Instead of just self-assembly, you can then make it walk.”
According to the researchers, the robot is built from five layers of materials, each of which is cut to digital specifications using a laser cutter. A middle layer of copper, etched into a network of electrical leads, is placed between two structural layers of paper. The outer layers are made from a special polymer that folds when heated, and once they are all assembled, a microprocessor and at least one small motor are attached to the top surface.
The prototype was assembled manually, but the study authors explain that it could also be put together using a robotic “pick and place” system. The study describes a design that uses two motors, with each one controlling two of the robot’s legs and their activities being synchronized by the microprocessor. In addition, each leg has eight mechanical “linkages,” the dynamics of which convert the force exerted by the motor into movement.
“Getting a robot to assemble itself autonomously and actually perform a function has been a milestone we’ve been chasing for many years,” said Wood, but it isn’t the only new technological breakthrough based on the principles of origami. In fact, a new paper details how a special type of origami fold called Miura-ori could be used to create reprogrammable molecular-scale machines.

The research is the inspiration of University of Massachusetts Amherst physicist Christian Santangelo, who along with physicists and materials scientists from Cornell University and Western New England University, turned to the paper-folding technique for “tuning” the physical properties of thin sheets of materials. Their research could ultimately lead to the development of molecular-scale machines that could snap into place and perform mechanical tasks.
Santangelo first brought up Miura-ori as a potential way to design controllable new materials during a physics meeting several years ago, the university said in a statement. A form of origami tessellation, this special type of folding occurs naturally in some types of leaves and tissues. It arranges a flat surface into a repeated, alternating pattern of mountain-and-valley zigzag folds, allowing the sheet to contract like an accordion when it is squeezed.
“As you compress most materials along one axis, they expand in other directions,” explained Santangelo. “A rare class of materials, however, does the opposite. If you compress them along one direction, they collapse uniformly in all directions. Miura-ori shows us how to use this property to make new devices. Exotic materials can be formed from traditional materials simply by altering microscopic structure.”
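In materials-science terms, the behavior Santangelo describes corresponds to a negative Poisson’s ratio. As a brief aside (this is the standard textbook definition, offered for context rather than taken from the paper itself), the Poisson’s ratio of a material loaded along one axis is

\nu = -\frac{\varepsilon_{\mathrm{trans}}}{\varepsilon_{\mathrm{axial}}}

where \varepsilon_{\mathrm{axial}} is the strain along the loaded axis and \varepsilon_{\mathrm{trans}} is the strain in a perpendicular direction. Ordinary materials have \nu > 0: squeeze them along one axis and they bulge outward along the others. The rare “auxetic” class Santangelo mentions has \nu < 0, so compression along one axis produces contraction in all directions, which is exactly the uniform collapse he describes.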
“We’re looking at an origami structure and using a language developed for understanding the mechanical properties of atomic crystals to talk about what we see here,” added fellow investigator and Cornell University graduate student Jesse Silverberg. “Our work brings together origami, metamaterials, programmable matter, crystallography and more. It’s totally bizarre and unique to have so many of these ideas intersecting at the same time.”
The researchers ultimately hope that their work will lead to the creation of atomic-scale machines programmed based on folding patterns and are capable of snapping into place and performing mechanical functions.
“You can imagine a folded sheet of some material and popping in defects to make a stiff shield, or somehow deploying an object and giving it a rigid backbone,” said Cornell associate professor of physics Itai Cohen. “Think of it as appendages that can be locked in place or a useful tool whose properties can be set once it has been deployed. In that way, it’s kind of like the Transformers, where robots fold themselves up but unfurl, locked, into human form.”
Image 2 (below): Jesse Silverberg et al. used origami-based engineering to design a lightweight, ultra-tough material with tunable properties. Their inspiration was a specific type of zigzag origami fold that has previously been used to efficiently pack solar panels for space missions. Credit: Jesse Silverberg, Arthur Evans, Lauren McLeod, Ryan Hayward, Thomas Hull, Christian Santangelo, Itai Cohen

Google To Start Giving Search Ranking Boost To Encrypted Websites

redOrbit Staff & Wire Reports – Your Universe Online
In an attempt to help protect Web surfers from hackers, Google has adopted a new algorithm that rewards encrypted websites by placing them higher in its search results, various media outlets reported Thursday.
According to Rolfe Winkler of the Wall Street Journal, the search engine is looking to reward websites that are more secure by giving “bonus points” to encrypted pages in its ranking algorithm. The company hopes the move will encourage website developers to utilize technology which helps protect their websites and the personal data of their users.
“Over the past few months we’ve been running tests taking into account whether sites use secure, encrypted connections as a signal in our search ranking algorithms,” webmaster trends analysts Zineb Ait Bahajji and Gary Illyes explained in an August 6 blog post. “We’ve seen positive results, so we’re starting to use HTTPS as a ranking signal.”
“For now it’s only a very lightweight signal – affecting fewer than one percent of global queries, and carrying less weight than other signals such as high-quality content – while we give webmasters time to switch to HTTPS,” they added. “But over time, we may decide to strengthen it, because we’d like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web.”
VentureBeat’s Ruth Reader said that the move was significant, and predicted that it would encourage websites concerned with their search rankings to improve their encryption and security protocols.
Likewise, American Civil Liberties Union (ACLU) principal technologist Christopher Soghoian told the Wall Street Journal that the move was “a huge deal,” calling it “the ultimate carrot for websites” to start using encryption.
“Developers compete fiercely with each other and tweak every small aspect of their websites to get a top search ranking. With Google making web encryption a factor in ranking, many would likely make their websites more secure for visitors,” explained Reuters reporter Supantha Mukherjee.
Winkler explained that encrypting data which is transmitted over the Internet creates an additional barrier keeping those attempting to snoop on or swipe the data of Web surfers or online shoppers. This extra layer of security can help keep user data safe, even on unsecured wireless networks like those found in airports or restaurants.
“The encrypted network often garners complaints of slower service—an issue Google believes it has addressed to a point where it no longer makes sense to allow HTTP connections, a company spokeswoman said earlier this year,” added Stephanie Mlot of PC Magazine, pointing out that the search engine defaults to HTTPS in its search results, Gmail and Drive.
In addition to announcing the algorithm changes, Google also revealed that it would be crafting a series of informational blog posts to help websites become more encryption-savvy and to prevent developers from falling victim to common mistakes.
Ait Bahajji and Illyes also shared some basic tips to get webmasters started, telling them to determine whether they needed a single-domain, multi-domain, or wildcard certificate and advising them to use 2048-bit key certificates. They also recommended using relative URLs for resources residing on the same secure domain and protocol-relative URLs for all other domains, and allowing search engines to index pages whenever possible.
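For illustration only, here is a minimal sketch of the first step most site owners take when making the switch: answering every plain-HTTP request with a permanent redirect to the HTTPS version of the same URL. The sketch uses Python’s standard library; the port number and fallback host name are placeholder assumptions, not anything Google prescribes.

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class RedirectToHTTPS(BaseHTTPRequestHandler):
        """Answer every plain-HTTP request with a 301 pointing at HTTPS."""

        def do_GET(self):
            # Fall back to a placeholder host if no Host header was sent.
            host = (self.headers.get("Host") or "example.com").split(":")[0]
            self.send_response(301)  # 301 = permanent redirect
            self.send_header("Location", "https://%s%s" % (host, self.path))
            self.end_headers()

        do_HEAD = do_GET  # HEAD requests get the same redirect, minus a body

    if __name__ == "__main__":
        # Port 8080 for local testing; a production listener would use port 80.
        HTTPServer(("", 8080), RedirectToHTTPS).serve_forever()

Using a 301 rather than a temporary 302 tells crawlers the move is permanent, which allows ranking signals to consolidate on the HTTPS URLs.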
Shop Amazon – Hot New Releases – Updated Every Hour

Wikipedia Founder Vows To Fight The Censorship Of "Right To Be Forgotten" Regulations

redOrbit Staff & Wire Reports – Your Universe Online
Vowing to fight what they call the “censorship” of “right to be forgotten” laws, the organization that runs Wikipedia has started publicly posting the notices it receives when search engines intend to remove links from their results, various media outlets reported on Wednesday.
According to Alex Hern of The Guardian, Wikipedia founder Jimmy Wales revealed that Google had been asked to remove five links from the online encyclopedia over the past week. Those notices have been posted online by the Wikimedia Foundation, the non-profit organization which operates the website, he added.
The pages in question include an image of a guitar player, a page about former criminal Gerry Hutch, a page about the Italian gangster Renato Vallanzasca, and a page about an amateur chess player, Hern and BBC News reported. Speaking at a London press conference, Wales had harsh words for those who would use the “right to be forgotten” verdict to remove Wikipedia links.
“History is a human right and one of the worst things that a person can do is attempt to use force to silence another,” he said. “I’ve been in the public eye for quite some time. Some people say good things, some people say bad things… that’s history, and I would never use any kind of legal process like that to try to suppress it.”
In all, the Wikimedia Foundation revealed that more than 50 links to its website had been affected by the requests, including an English-language page about Hutch, a Dublin-born businessman who was jailed in the 1980s; an Italian-language page about a group of criminals known as Banda della Comasina; and an Italian-language page about Vallanzasca, an Italian who spent time in jail after being involved in kidnappings and bank robberies.
“We only know about these removals because the involved search engine company chose to send notices to the Wikimedia Foundation,” the organization’s lawyers wrote in a blog post. “Search engines have no legal obligation to send such notices. Indeed, their ability to continue to do so may be in jeopardy.”
“Since search engines are not required to provide affected sites with notice, other search engines may have removed additional links from their results without our knowledge. This lack of transparent policies and procedures is only one of the many flaws in the European decision,” they added.
The revelations came as part of the Foundation’s first-ever transparency report, and according to The Huffington Post UK, the organization divided the requests into three different categories: user data, content and takedown, and copyright infringements.
Over the past two years, there were 56 requests for user data. Fifteen of those came from government sources, and information was produced on eight of those occasions, with a total of 11 user accounts being affected. In comparison, Google received more than 27,000 requests for user data between July 2012 and June 2013, producing information in over 17,000 of those cases, the media outlet noted.
The report also revealed that the Wikimedia Foundation received 304 requests for content to be altered or taken down from June 2012 through July 2014, with 32 of those requests originating in the UK and 105 from the US. None of those requests were granted. However, during that same time, 58 requests were made for takedowns relating to copyright (four from the UK and 31 from the US), with 41 percent of them being granted.
The European Court of Justice ruled in May that search engines such as Google had an obligation to remove links containing sensitive information if asked to do so, and that users who wanted those websites to remove personal data could file a request directly with the operator of the search engine.
Google received its first requests under the new ruling later on that month, and at a meeting with EU regulators in Brussels last month, the company revealed that it had been contacted by 91,000 individuals covering a total of 328,000 individual URLs, and that over half of those requests had already been processed.
In enacting the “right to be forgotten” laws, “the European court abandoned its responsibility to protect one of the most important and universal rights: the right to seek, receive, and impart information,” Wikimedia Foundation executive director Lila Tretikov said, according to Ars Technica. “As a consequence, accurate search results are vanishing in Europe with no public explanation, no real proof, no judicial review, and no appeals process. The result is an Internet riddled with memory holes – places where inconvenient information simply disappears.”

Should You Take Baby Aspirin To Prevent Heart Attacks?

University of Rochester Medical Center

A majority of middle-aged men and women eligible to take aspirin to prevent heart attack and stroke do not recall their doctors ever telling them to do so, according to a University of Rochester study of a national sample of more than 3,000 patients.

Published online by the Journal of General Internal Medicine, the finding illustrates a common disconnect between public health guidelines and what occurs in clinical practice. The UR study is consistent with other research showing that physicians often do not recommend aspirin as prevention therapy to the general population, despite established guidelines by the U.S. Preventive Services Task Force.

Several reasons might explain the reluctance, such as competing demands and limited time to properly assess a patient’s eligibility for aspirin, according to lead author Kevin A. Fiscella, M.D., M.P.H., professor of Family Medicine at the UR School of Medicine and Dentistry.

Uncertainty about the benefits of aspirin therapy versus potential harms, such as bleeding in the digestive tract, also hinders physicians’ decisions, the study said.

For the JGIM study, Fiscella’s group analyzed data from 3,439 patients included in the 2011-’12 National Health and Nutrition Examination Survey (NHANES). None of the patients had cardiovascular disease, but all qualified for aspirin therapy based on their 10-year risk score for factors such as diabetes, high blood pressure, obesity, smoking, and use of cholesterol-lowering medications.

Of the sample, 87 percent of men and 16 percent of women were eligible to take aspirin as a preventive measure. But when they were asked the question — “Doctors and other health care providers sometimes recommend that you take a low-dose aspirin each day to prevent heart attack, strokes, or cancer. Have you ever been told to do this?” — only 34 percent of the men and 42 percent of the women said yes.

Co-author John Bisognano, M.D., Ph.D., director of outpatient cardiology services at UR Medicine, said most physicians can agree on approaches to medical care in immediately life-threatening situations, but have less enthusiasm to quickly embrace preventive guidelines, particularly when they involve wide-ranging interventions for a large segment of the population.

New studies that present conflicting data or re-interpret older data also complicate the issue and can be confusing for patients, he said. Despite the USPSTF guidelines for aspirin being published in 2009, for example, the FDA declined to approve the same recommendations as recently as last spring.

“Patients often view changes as an illustration that folks in the medical field can’t really make up their minds,” said Bisognano, professor of Medicine. “Changes can undermine a practitioner’s or patient’s enthusiasm to immediately endorse new guidelines because they wonder if it will change again in three years.”

But science and medical practice are fluid, he said, and the only way to move the field forward is to continually look for ways to apply new data and avoid the assumptions of the past.

The study also noted that using expanded primary care teams of nurses, medical assistants, and health educators may help to reduce the volume of decisions that rest solely with the physician at the office visit. Sharing care can improve agreement between published guidelines, the use of risk models, and actual practice, the study said.

The authors report no conflicts, and no external funding sources.

Link Between Vitamin D Deficiency And Dementia Risk Confirmed

University of Exeter

Vitamin D deficiency is associated with a substantially increased risk of dementia and Alzheimer’s disease in older people, according to the most robust study of its kind ever conducted.

An international team, led by Dr David Llewellyn at the University of Exeter Medical School, found that study participants who were severely Vitamin D deficient were more than twice as likely to develop dementia and Alzheimer’s disease.

The team studied elderly Americans who took part in the Cardiovascular Health Study. They discovered that adults in the study who were moderately deficient in vitamin D had a 53 per cent increased risk of developing dementia of any kind, and the risk increased to 125 per cent in those who were severely deficient.

Similar results were recorded for Alzheimer’s disease, with the moderately deficient group 69 per cent more likely to develop this type of dementia, jumping to a 122 per cent increased risk for those severely deficient.

The study was part-funded by the Alzheimer’s Association, and is published in the August 6, 2014 online issue of Neurology, the medical journal of the American Academy of Neurology. It looked at 1,658 adults aged 65 and over, who were able to walk unaided and were free from dementia, cardiovascular disease and stroke at the start of the study. The participants were then followed for six years to investigate who went on to develop Alzheimer’s disease and other forms of dementia.

Dr Llewellyn said: “We expected to find an association between low Vitamin D levels and the risk of dementia and Alzheimer’s disease, but the results were surprising – we actually found that the association was twice as strong as we anticipated.

“Clinical trials are now needed to establish whether eating foods such as oily fish or taking vitamin D supplements can delay or even prevent the onset of Alzheimer’s disease and dementia. We need to be cautious at this early stage and our latest results do not demonstrate that low vitamin D levels cause dementia. That said, our findings are very encouraging, and even if a small number of people could benefit, this would have enormous public health implications given the devastating and costly nature of dementia.”

Research collaborators included experts from Angers University Hospital, Florida International University, Columbia University, the University of Washington, the University of Pittsburgh and the University of Michigan. The study was supported by the Alzheimer’s Association, the Mary Kinross Charitable Trust, the James Tudor Foundation, the Halpin Trust, the Age Related Diseases and Health Trust, the Norman Family Charitable Trust, and the National Institute for Health Research Collaboration for Leadership in Applied Research and Care South West Peninsula (NIHR PenCLAHRC).

Dementia is one of the greatest challenges of our time, with 44 million cases worldwide – a number expected to triple by 2050 as a result of rapid population ageing. A billion people worldwide are thought to have low vitamin D levels and many older adults may experience poorer health as a result.

The research is the first large study to investigate the relationship between vitamin D and dementia risk where the diagnosis was made by an expert multidisciplinary team, using a wide range of information including neuroimaging. Previous research established that people with low vitamin D levels are more likely to go on to experience cognitive problems, but this study confirms that this translates into a substantial increase in the risk of Alzheimer’s disease and dementia.

Vitamin D comes from three main sources – exposure of skin to sunlight, foods such as oily fish, and supplements. Older people’s skin can be less efficient at converting sunlight into Vitamin D, making them more likely to be deficient and reliant on other sources. In many countries the amount of UVB radiation in winter is too low to allow vitamin D production.

The study also found evidence that there is a threshold level of Vitamin D circulating in the bloodstream below which the risk of developing dementia and Alzheimer’s disease increases.  The team had previously hypothesized that this might lie in the region of 25-50 nmol/L, and their new findings confirm that vitamin D levels above 50 nmol/L are most strongly associated with good brain health.

Commenting on the study, Dr Doug Brown, Director of Research and Development at Alzheimer’s Society said: “Shedding light on risk factors for dementia is one of the most important tasks facing today’s health researchers. While earlier studies have suggested that a lack of the sunshine vitamin is linked to an increased risk of Alzheimer’s disease, this study found that people with very low vitamin D levels were more than twice as likely to develop any kind of dementia.

“During this hottest of summers, hitting the beach for just 15 minutes of sunshine is enough to boost your vitamin D levels. However, we’re not quite ready to say that sunlight or vitamin D supplements will reduce your risk of dementia. Large scale clinical trials are needed to determine whether increasing vitamin D levels in those with deficiencies can help prevent dementia from developing.”

Largest Data Breach Ever: One Billion Passwords Swiped By Russian Hackers

redOrbit Staff & Wire Reports – Your Universe Online
Russian hackers have stolen more than one billion usernames and passwords belonging to over 500 million unique email addresses in what one cybersecurity firm is calling “arguably the largest data breach known to date.”
Hold Security, a US-based firm that specializes in detecting this type of activity, revealed on Tuesday that an unnamed Russian gang of hackers had amassed more than 4.5 billion records. Most of those records were stolen credentials, and 1.2 billion of them appeared to be unique.
“It is absolutely the largest breach we’ve ever encountered,” Alex Holden, the founder and chief information security officer of the Wisconsin company, told USA Today reporters Donna Leinwand Leger, Elizabeth Weise and Jessica Guynn. “We thought at first they were run-of-the-mill spammers, but they got very good at stealing these databases.”
Most disconcerting for Holden, the writers noted, was discovering his own personal information among the thieves’ cache. He added that Hold Security is attempting to contact the affected parties, which include members of the auto industry, real estate and oil companies, computer hardware and software firms and the food industry. However, Holden declined to identify any of the victims, stating that most of their websites are still vulnerable.
The group, which the company later dubbed “CyberVor” (with “vor” being the Russian word for “thief”), reportedly cracked more than 420,000 web and FTP sites along the way, the security firm said. At first, the hackers acquired databases of stolen credentials from other hackers via the black market, and used that information to attack e-mail providers, social media, and other websites, distributing spam and installing malicious redirections on legitimate systems.
Earlier this year, they changed their approach and used the black market data they had obtained to gain access to botnets – large networks of computers that have been infected by a virus and are controlled by a single system. The botnets used those infected machines to identify SQL vulnerabilities on the sites they visited.
“Over 400,000 sites were identified to be potentially vulnerable to SQL injection flaws alone,” the company explained in its report. “The CyberVors used these vulnerabilities to steal data from these sites’ databases. To the best of our knowledge, they mostly focused on stealing credentials, eventually ending up with the largest cache of stolen personal information, totaling over 1.2 billion unique sets of e-mails and passwords.”
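As background, a SQL injection flaw arises when a site splices untrusted input directly into a query string. The sketch below, a generic example rather than the CyberVors’ actual technique, uses Python’s built-in sqlite3 module and an illustrative table name to show the vulnerable pattern alongside the standard fix.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (email TEXT, password_hash TEXT)")

    user_input = "' OR '1'='1"  # a classic injection payload

    # Vulnerable: the payload becomes part of the SQL text itself, turning
    # the WHERE clause into a condition that matches every row in the table.
    unsafe_query = "SELECT * FROM users WHERE email = '%s'" % user_input

    # Safe: a parameterized query treats the input purely as data, so the
    # payload can never alter the structure of the statement.
    rows = conn.execute("SELECT * FROM users WHERE email = ?",
                        (user_input,)).fetchall()

The parameterized form is the standard defense because the database driver, rather than string formatting, decides how the input is bound to the query.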
According to the Wall Street Journal’s Danny Yadron, Hold Security is offering to perform “breach notification services” for website owners to determine if they had been affected – but at a cost. Holden told Yadron that convincing other businesses to pay a fee in order to recoup the costs of verifying website ownership and “prove to them that we are the ‘good guys’… is a hard and often thankless task.”
While Holden did not state exactly how much those services would cost, BBC News went on to report that Hold Security later posted a message on its website stating that its “breach notification service” would cost interested parties $120 per month. The firm’s approach to the issue surprised at least one computer security expert.
“This situation is quite unusual in that the company has decided to charge for this information,” Dr. Steven Murdoch of University College London’s computer science department told BBC News. “Usually they would do an initial disclosure [of who had been affected] for free and then offer their services for a fee at a later stage.”
“The company rightly points out that there is going to be a huge amount of work to securely contact all the affected websites, but a common solution to this is to partner with a government or industry-funded organization to help with that,” he added.
Murdoch also cautioned that users should not be too quick to reset their passwords.
“Although there’s a large amount of passwords involved, a lot of them could be irrelevant and many of the websites tiny,” he said. “It’s not necessarily the case that a large proportion of internet users have been affected… So, there’s no reason to panic now, but perhaps it’s a good reminder to follow best practice of not using the same password on multiple websites, because this will not be the last time such a breach happens.”

6,500 Year Old Human Skeleton Re-Discovered In Museum Storage Room

redOrbit Staff & Wire Reports – Your Universe Online
Anybody who has ever cleaned out a closet knows the feeling of finding some long-forgotten item, but scientists at the Penn Museum in Philadelphia have taken it to the next level – re-discovering a 6,500-year-old human skeleton that had been kept for decades in one of their own storage rooms.
The skeleton, which is complete, has been stored in a coffin-like box for 85 years, its identifying documentation having all but vanished, the museum said in a statement. However, an ongoing summer project to digitize old records from a world-famous excavation resulted in that documentation being rediscovered, allowing the staff to identify the history of the skeleton.
The skeleton was discovered by Sir Leonard Woolley’s joint Penn Museum/British Museum team during a 1929-1930 excavation in what is now southern Iraq. This makes it approximately 2,000 years older than the remains and materials discovered at the famed Mesopotamian “royal tombs,” and a visual examination revealed that it is of a male who had been well-muscled, was at least 50 years old, and was roughly 5’9” tall, according to the Associated Press (AP).
“Woolley’s records indicated that he had shipped a skeleton over, and the team digitizing his records had uncovered pictures of the excavation, which showed the skeleton being removed from its grave,” said Reuters reporter Daniel Kelley. The remains were found 40 feet below the ground at the Royal Cemetery of Ur dig site, buried in a deep layer of silt beneath the cemetery itself that is believed to have been the result of a massive flood.
After the bones were recovered, Woolley’s team coated them and the surrounding soil in wax, then shipped them first to London and on to Philadelphia and the Penn Museum. The re-discovery of the skeleton, coupled with advanced research techniques not available when it was originally recovered, could lead to new insights about the origins, diet, diseases, trauma and stress of a population about which little is known.
The records were discovered by William Hafford, one of the researchers on the digitization project, who then brought them to the attention of Dr. Janet Monge, the museum’s chief curator. Dr. Monge was aware of the unusual skeleton in the closet, which is one of an estimated 2,000 complete human skeletons in the museum’s collections, but until now there had been no information to explain where or when it had come from.
According to Kelley, the man to whom the remains belong has been named Noah by the museum, and he is a rare example of an intact skeleton dating as far back as 4500 BC. While the museum has other remains from ancient Ur, which is located approximately 10 miles from Nassiriya in southern Iraq, Noah is said to be around 2,000 years older than any remains previously discovered by the excavation at the cemetery, he added.
Oddly enough, AP writer Kathy Matheson reported that researchers at Bristol University in the UK discovered a box of materials from the same Ur expedition back in June. The box was found on top of a cupboard and contained pottery, seeds and other objects dating back some 4,500 years – but it remains a mystery how the materials found their way to Bristol, which had no connection to the Woolley expedition.

Verizon Defends Decision To Start Throttling Some Unlimited 4G LTE Customers

redOrbit Staff & Wire Reports – Your Universe Online
A top mobile service provider is defending its decision to slow data transfer speeds for some customers, stating that it would only happen in “very limited circumstances,” while also accusing the US Federal Communications Commission (FCC) of coming down hard on the company over a common industry practice.
According to CNET reporter Marguerite Reardon, Verizon Wireless CEO Daniel Mead said during a press conference on Monday that he was surprised to receive a scathing letter from the FCC which accused the wireless services provider of unfairly singling out unlimited data customers in its new high-usage throttling policy announced last month.
That policy is an expansion of Verizon’s existing “network optimization policy,” which currently limits the data usage of 3G customers. It is scheduled to begin in October, and will apply to 4G LTE smartphone customers on unlimited data plans who rank in the upper five percent of data users (which, as of March, would have been anyone using at least 4.7GB in a single month) and who are month-to-month customers currently without a contract.
Mead has defended his company’s decision to expand the policy, Reardon explained, stating that FCC chairman Tom Wheeler misunderstood the policy and had made incorrect assumptions about the program. Mead also noted that the existing throttling plan, instituted in 2011 under a different FCC chairman, had drawn no criticism from the agency and that he did not understand why there would be any issues with the decision to expand it.
The company has also sent a letter to the FCC, emphasizing that customers would only experience slowdowns “under very limited circumstances” and only at specific cell sites currently experiencing “unusually high demand,” said Chris Welch of The Verge. Verizon also promised that any throttling would stop immediately once data demands returned to normal.
The Wall Street Journal reports that, in the original letter, Wheeler wrote that it was “disturbing to me that Verizon Wireless would base its ‘network management’ on distinctions among its customers’ data plans, rather than on network architecture or technology. ‘Reasonable network management’ concerns the technical management of your network; it is not a loophole designed to enhance your revenue streams.”
As reported by Marina Lopes and Alina Selyukh of Reuters, Verizon senior vice president of federal regulatory affairs Kathleen Grillo responded, “The type of network optimization policy that we follow has been endorsed by the FCC as a narrowly targeted way to ensure a fair allocation of capacity during times of congestion.”
Grillo added that the practice “has been widely accepted with little or no controversy,” and Mead told reporters the expansion of the policy was “absolutely… the right thing to do” and that it was “in line with the FCC’s principles.” The CEO also said he was “surprised” to receive Wheeler’s letter, and while he had “great respect” for the agency, he was “not sure the chairman understood what we’re doing exactly.”
Verizon has previously explained that the types of activities which could result in slower connection speeds include streaming high-definition video and playing real-time online games. The new restrictions will also include an exemption for any business or government organization that has signed a major account agreement, and the policy is said to be the company’s attempt to ensure most users experience roughly the same data transfer rates.
“We understand that our customers rely on their smartphones and tablets every day,” VP for technology Mike Haberman said of the new policy in late July. “Our network optimization policy provides the best path to ensure a continued great wireless experience for all of our customers on the best and largest wireless network in the US.”

Butterflies Could Hold The Key To Probes That Repair Genes

Clemson University

New discoveries about how butterflies feed could help engineers develop tiny probes that siphon liquid out of single cells for a wide range of medical tests and treatments, according to Clemson University researchers.

The National Science Foundation recently awarded the project $696,514. It was the foundation’s third grant to the project, bringing the total since 2009 to more than $3 million.

The research has brought together Clemson’s materials scientists and biologists who have been focusing on the proboscis, the mouthpart that many insects use for feeding.

For materials scientists, the goal is to develop what they call “fiber-based fluidic devices,” among them probes that could eventually allow doctors to pluck a single defective gene out of a cell and replace it with a good one, said Konstantin Kornev, a Clemson materials physics professor. “If someone were programmed to have an illness, it would be eliminated,” he said.

Researchers recently published some of their findings about the butterfly proboscis in The Journal of Experimental Biology.

They are now advancing to a new phase in their studies. Much remains unknown about how insects use tiny pores and channels in the proboscis to sample and handle fluid.

“It’s like the proverbial magic well,” said Clemson entomology professor Peter Adler. “The more we learn about the butterfly proboscis, the more it has for us to learn about it.”

Kornev said he was attracted to butterflies for their ability to draw in liquids of many kinds.

“It can be very thick like nectar and honey or very thin like water,” he said. “They do that easily. That’s a challenge for engineers.”

Researchers want the probe to be able to take fluid out of a single cell, which is about one-tenth the diameter of a human hair, Kornev said. The probe also will need to differentiate between different types of fluids, he said.

The technology could be used for medical devices, nanobioreactors that make complex materials and flying “micro-air vehicles” the size of an insect.

“It opens up a huge number of applications,” Kornev said. “We are actively seeking collaboration with cell biologists, medical doctors and other professionals who might find this research exciting and helpful in their applications.”

The study also is breaking new ground in biology. While scientists had a fundamental idea of how butterflies feed, it was less complete than it is now, Adler said.

Scientists have long known that butterflies use the proboscis to suck up fluid, similar to how humans use a drinking straw, Adler said. But the study found that the butterfly proboscis also acts as a sponge, he said.

“It’s a dual mechanism,” Adler said. “As they move the proboscis around, it can help sponge up the liquid and then facilitate the delivery of the liquid so that it can then be sucked up.”


Pistachios Could Lower Stress Response In People With Type 2 Diabetes

By Sheila G. West, Penn State

Among people with type 2 diabetes, eating pistachios may reduce the body’s response to the stresses of everyday life, according to Penn State researchers.

“In adults with diabetes, two servings of pistachios per day lowered vascular constriction during stress and improved neural control of the heart,” said Sheila G. West, professor of biobehavioral health and nutritional sciences. “Although nuts are high in fat, they contain good fats, fiber, potassium and antioxidants. Given the high risk of heart disease in people with diabetes, nuts are an important component of a heart healthy diet in this population.”

West and her colleagues investigated the effects of pistachios on responses to standardized stress tasks in patients with well-controlled type 2 diabetes who were otherwise healthy. They used a randomized, crossover study design in which all meals were provided. Each of the diets contained the same number of calories.

After two weeks on the typical American diet — containing 36 percent fat and 12 percent saturated fats — participants were randomized to one of two test diets. During the four-week test diets, participants ate only food supplied by the study. The researchers reported the results of this study in a recent issue of the Journal of the American Heart Association.

Test diets included a standard heart-healthy diet — 27 percent fat and 7 percent saturated fat — and a diet containing two servings per day of pistachios — about 3 ounces or 20 percent of calories from pistachio nuts. The typical research participant consumed about 150 pistachio nuts per day. The pistachio diet contained 33 percent fat and 7 percent saturated fat. Half of the nuts consumed each day were salted and half were unsalted. At the end of each four-week diet period, the researchers measured blood pressure and total peripheral vascular resistance at rest and during two stress tests — a cold water challenge and a confusing mental arithmetic test.

“After the pistachio diet, blood vessels remained more relaxed and open during the stress tests,” West said.

Although laboratory measurements of blood pressure were not affected by pistachios, real-world measures of blood pressure (measured by an automated monitor) were significantly lower after the pistachio diet. Katherine A. Sauder, former graduate student in biobehavioral health, conducted these measurements.

“We found that systolic blood pressure during sleep was particularly affected by pistachios,” she said. “Average sleep blood pressure was reduced by about 4 points and this would be expected to lower workload on the heart.”

The researchers found that the pistachio diet lowered vascular constriction during stress. When arteries are dilated, the load on the heart is reduced. The physical challenge involved immersing one hand into icy water for two minutes.

“This cold stressor produces a large vascular constriction response in most people,” said West. “In comparison with a low fat diet, the pistachio diet blunted that vascular response to stress.”

The same pattern was seen when participants engaged in a challenging and confusing mental arithmetic task.

“Our participants still felt frustrated and angry during the math test,” West noted. “The pistachio diet reduced their bodies’ responses to stress, but nuts are not a cure for the emotional distress that we feel in our daily lives.”

Sauder added: “As in our last study of pistachios, we did not see lower blood pressure in the laboratory setting with this dose of nuts. However, we were surprised and pleased to see that 24-hour ambulatory blood pressure was lower after the pistachio diet.”

The researchers also recorded improvements in heart rate variability, a measure of how well the nervous system controls heart function. These data indicate that pistachios increased the activity of the vagus nerve, an important part of the parasympathetic nervous system that can be damaged with diabetes.

“If sustained with longer term treatment, these improvements in sleep blood pressure, vascular response to stress and vagal control of the heart could reduce risk of heart disease in this high risk group,” West said.

Other researchers on this study included Cindy E. McCrea, graduate student in biobehavioral health; Jan S. Ulbrecht, endocrinologist and professor of biobehavioral health and medicine; and Penny M. Kris-Etherton, Distinguished Professor of Nutritional Sciences.

Positive Emotions Could Also Play An Exacerbating Role In Deadly Eating Disorders

Robin Lally, Rutgers University
Rutgers study finds that positive emotions could play a role in the deadly disorder
Positive emotions – even those viewed through a distorted lens – may play an exacerbating role in fueling eating disorders like anorexia nervosa, which has a death rate 12 times higher for females between the ages of 15 and 24 than all other causes of death combined, according to a Rutgers study.
In research published in Clinical Psychological Science, Edward Selby, an assistant professor in the Department of Psychology, School of Arts and Sciences, measured over a two-week period the emotional states of 118 women between the ages of 18 and 58 being treated for anorexia nervosa. Selby found that those in the study not only suffered from negative emotions but also felt emotionally positive, having a sense of pride over being able to maintain and exceed their weight-loss goals.
“What we think happens is that positive emotions become exaggerated and are rewarding these maladaptive behaviors,” said Selby. “Since only about one-third of women recover after treatment, what we need to do is gain a better understanding of why these positive emotions become so strongly associated with weight loss rather than with a healthy association such as family, school or relationships.”
Previous research into eating disorders has focused mainly on how negative emotions, like being sad, angry, or having a lack of control contribute to anorexia, an emotional disorder characterized by an obsessive desire to lose weight by refusing to eat. “Up until now,” Selby said, “there has been little analysis of empirical data that could help gain insight into how positive emotions are distorted by those suffering with the illness.”
In this study, Selby and his colleagues found that the women who had the most difficulty recognizing when positive emotions were becoming skewed engaged in more frequent anorexia-type behaviors like vomiting, laxative use, restricting calories, excessive exercise, checking body fat and constant weight checks.
“Women with anorexia are often in complex emotional places; that is why it is important to understand all we can about what they are getting out of this experience,” said Selby. “The more we know not only about the negative emotions, but also the positive emotions connected to this disease, the more likely we will be to treat this devastating illness.”
Much of the positive reinforcement that may lead women with anorexia to feel good about their situation could be coming from “Pro-Anorexic” websites, where it is not unusual for individuals suffering with anorexia to be applauded for their control and courage in obtaining extreme weight loss.
This link between positive emotions and weight-loss behaviors, Selby said, turns into a vicious cycle for some women suffering with eating disorders who continue to lose weight even after their goals are met.
Selby believes that more research is needed to find a way to redirect positive emotions associated with emaciation to other healthy activities, and determine how these feelings should be addressed in treating those with eating disorders.
Physical activity, for instance, might need to be looked at in a different way, Selby said. While there is debate about whether patients undergoing treatment for anorexia should be allowed to exercise, working out is an activity that makes them feel good. So instead of completely barring physical fitness, perhaps an individual who has gained pleasure from a sport like running could be steered toward a group activity like yoga, which is focused more on core strengthening and not weight loss, he said.
“Being in control is important for many of these women,” Selby said. “What we need to do is find a way to reconnect the positive emotions they feel in losing weight to other aspects of their lives that will lead to a more balanced sense of happiness.”

Newly Identified Vulnerability Could Potentially Compromise Commercial Airliners

redOrbit Staff & Wire Reports – Your Universe Online
The satellite communications equipment of passenger jets can be hacked through their wireless internet and inflight entertainment systems, claims one prominent cybersecurity researcher who has promised to reveal the details of his work Thursday at the annual Black Hat hacking conference in Las Vegas, Nevada.
According to Reuters reporter Jim Finkle, IOActive consultant Ruben Santamarta plans to discuss vulnerabilities he has discovered in aerospace satellite communication systems – a presentation that “is expected to be one of the most widely watched at the conference” and “could prompt a review of aircraft security.”
The 32-year-old Santamarta told Reuters he discovered the flaws in the communication systems by reverse engineering their firmware – in other words, decoding the software used to operate the equipment. Theoretically, hackers could use a plane’s onboard WiFi or inflight entertainment system to break into its avionics equipment, potentially allowing them to disrupt the aircraft’s navigation and safety systems.
The systems specifically mentioned in the study were created by Cobham, Harris, Hughes Network Systems, Iridium Communications and Japan Radio. While Santamarta told Finkle that the hacks have only been tested in controlled environments (such as IOActive’s Madrid laboratory) and could be difficult to replicate under real-world conditions, he said that he decided to publicize his findings to encourage manufacturers to patch these security issues.
“Since the specific details of the exploit won’t be announced until Santamarta’s presentation later this week, we’re left guessing until then just how big of an issue this actually is. The cause for concern is clear, though,” said Adam Clark Estes of Gizmodo.
“If Santamarta’s claims check out, the exploit affects some of the most common satellite communications equipment on the market,” Estes added. “These systems are used not only in airplanes but also ships, military vehicles, as well as industrial facilities like oil rigs, gas pipelines, and wind turbines. The hack targets the equipment’s firmware and gives hackers the ability to manipulate the avionics system, which in turn could affect navigation.”
Ubergizmo’s Adnan Farooqui said that some of Santamarta’s findings have already been confirmed by Cobham and Iridium, although both manufacturers assure passengers the security risks are minimal. In fact, Cobham said that a hacker would have to actually be in the cockpit in order to gain access to its communications equipment.
Likewise, Harris representative Jim Burke told Reuters the company had reviewed the paper and had determined “the risk of compromise is very small,” and Iridium’s Diane Hockenberry added that the firm had determined “the risk to Iridium subscribers is minimal, but we are taking precautionary measures to safeguard our users.”
One of the vulnerabilities discovered by Santamarta, Finkle said, was that the equipment from all five companies required the use of hardcoded log-in credentials, which allow service technicians access to any piece of equipment using the same login and password. Unfortunately, hackers can retrieve those passwords by hacking into the equipment’s firmware, and then use those credentials to access sensitive systems.
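To make that class of flaw concrete, here is a generic, hypothetical sketch in Python; every name in it is invented for illustration, and none of it is the vendors’ actual code. A credential compiled into every firmware image can be recovered from any single copy and reused against every device in the field, whereas a per-device secret confines the damage of one leak to one unit.

    import secrets

    # The anti-pattern: one service credential baked into every firmware
    # image, so extracting it from a single unit unlocks the whole fleet.
    HARDCODED_USER = "service"
    HARDCODED_PASS = "factory-default"  # identical on every unit shipped

    def login(user: str, password: str) -> bool:
        return user == HARDCODED_USER and password == HARDCODED_PASS

    # A safer pattern: each device is provisioned at manufacture with its
    # own randomly generated secret, so one compromised unit stays one.
    per_device_secret = secrets.token_hex(16)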
Black Hat review board member Vincenzo Iozzo noted that Santamarta’s study marked the first time a researcher had identified significant vulnerabilities in satellite communications equipment, and that the “vulnerabilities he discovered are pretty scary just because they involve very basic security things that vendors should already be aware of.”

SUCCESS! Rosetta Makes History By Rendezvousing With Comet 67P

redOrbit Staff & Wire Reports – Your Universe Online
Following a decade-long journey that spanned more than six billion kilometers through space, the ESA’s Rosetta probe has arrived at comet 67P/Churyumov-Gerasimenko, officially making it the first mission to ever successfully complete such a rendezvous.
Rosetta, which launched from a European spaceport in French Guiana on March 2, 2004, arrived at 67P early Wednesday morning, the ESA reported. The satellite will now escort the comet while it orbits the Sun and heads back out towards Jupiter, and its Philae lander will be deployed to the comet’s surface in November 2014.
The successful rendezvous opens “a new chapter in Solar System exploration,” the agency said in a statement.
[ Watch: Rosetta: Science On The Comet ]
“After ten years, five months and four days traveling towards our destination, looping around the Sun five times and clocking up 6.4 billion kilometers, we are delighted to announce finally ‘we are here’,” added ESA Director General Jean-Jacques Dordain. “Europe’s Rosetta is now the first spacecraft in history to rendezvous with a comet, a major highlight in exploring our origins. Discoveries can start.”
As it made its final approach, Rosetta completed the last of 10 thruster firings that had been scheduled to take place over a span of several months, allowing it to slow down to a pace equivalent to a person walking, explained Kenneth Chang of the New York Times. Upon its arrival, the orbiter was said to be traveling roughly two miles per hour relative to the speed of its target, and at a distance of approximately 60 miles.
That final burn was designed to help Rosetta “into the first leg of a series of triangular paths around the comet,” added Wall Street Journal reporter Gautam Naik. Each of those legs was expected to be roughly 100 km (62 miles) long and take between three and four days to complete, ultimately allowing the probe to not only study the comet up close and land a probe on the surface, but also allowing it to follow 67P around the sun – all of which will be astronomical firsts.
[ Watch: Rosetta: Philae’s Mission At Comet 67P ]
“Today’s achievement is a result of a huge international endeavor spanning several decades,” said Alvaro Giménez, ESA’s Director of Science and Robotic Exploration. “We have come an extraordinarily long way since the mission concept was first discussed in the late 1970s and approved in 1993, and now we are ready to open a treasure chest of scientific discovery that is destined to rewrite the textbooks on comets for even more decades to come.”
Currently, ESA officials report that both the comet and the orbiter are approximately 405 million km from Earth (roughly halfway between the orbits of Mars and Jupiter) and heading toward the inner Solar System at speeds of nearly 55,000 km per hour. Comet 67P is in an elliptical 6.5-year orbit which takes it from beyond Jupiter at its most distant to between the orbits of Mars and Earth at its closest to the sun.
An ESA project operated with some assistance from NASA, Rosetta will spend the next two years studying the comet’s environment as well as its nucleus. Earlier this month, it was revealed that the orbiter had already successfully taken the comet’s temperature for the first time, and had captured images of a coma surrounding the nucleus.
“The Rosetta mission is a significant test of Europe’s spacefaring ambitions. Scientists had to plan the 10-year trip through the solar system in minute detail. Orbiting and landing on a comet, whose surface properties are largely unknown, is no easy task either,” Naik said. The WSJ reporter added that “more than scientific credibility is at stake,” since the undertaking reportedly cost the ESA more than $1.7 billion.
Image 2 (below): Artist impression of ESA’s Rosetta approaching comet 67P/Churyumov-Gerasimenko. The comet image was taken on 2 August 2014 by the spacecraft’s navigation camera at a distance of about 500 km. Spacecraft and comet are not to scale. Credit: Spacecraft: ESA/ATG medialab; Comet image: ESA/Rosetta/NAVCAM

Want To Know If Someone Is Narcissistic? Just Ask Them!

redOrbit Staff & Wire Reports – Your Universe Online
If you’re looking for a way to determine whether or not someone is vain or egotistical, the authors of a new PLOS ONE study claim to have a foolproof method: asking just one simple question.
The question in question – “To what extent do you agree with this statement: ‘I am a narcissist.’”
For their study, Brad Bushman, a professor of communication and psychology at the Ohio State University, and his colleagues explain how they conducted a series of 11 experiments involving over 2,200 people of all ages, and had them answer the above question by rating themselves on a scale of 1 (not very true of me) to 7 (very true of me).
The researchers said the one-item questionnaire also included an explanation of the word narcissist, which they defined as meaning egotistical, self-focused, and vain. They added that results of their Single Item Narcissism Scale (SINS) correlated very closely with several other validated measures of narcissism – including the widely used Narcissistic Personality Inventory (NPI), which includes 40 different questions.
“People who are not narcissists would never say that they are” because of the negative connotation of the traits associated with the term, co-author Sara Konrath, a researcher at Indiana University in Indianapolis, told Kim Painter of USA Today. “Somebody who is a narcissist doesn’t think it’s all that bad and is maybe even a little proud of it.”
“People who are willing to admit they are more narcissistic than others probably actually are more narcissistic,” Bushman, who was also joined on the study by colleagues from Gettysburg College, added in a statement. “People who are narcissists are almost proud of the fact. You can ask them directly because they don’t see narcissism as a negative quality – they believe they are superior to other people and are fine with saying that publicly.”
Konrath explained that it is important to understand narcissism because it has an impact on society that extends far beyond the individual narcissist’s life. For instance, she said that people who are self-centered and egotistical by nature tend to lack empathy, which is one of the primary catalysts behind charity work and philanthropy.
“Overall, narcissism is problematic for both individuals and society. Those who think they are already great don’t try to improve themselves,” Bushman added. “And narcissism is bad for society because people who are only thinking of themselves and their own interests are less helpful to others.”
“The results do seem like common sense,” said Washington Post reporter Rachel Feltman. “By definition, narcissists are egotistical, self-focused, and vain. It would follow that a true narcissist wouldn’t see self-absorption as something negative. And because narcissists tend to lack empathy, they’d probably have trouble understanding why a desire to put themselves first should be seen as a negative trait.”
However, the OSU professor emphasized that SINS should not be looked on as a replacement for longer, more in-depth narcissism questionnaires, such as the NPI. Those diagnostic tools provide more detailed information to researchers, including which specific type of narcissism an individual has. The single-item test, however, can be useful in long surveys to keep participants from becoming fatigued or distracted midway through.
The experiments also revealed that the SINS test was positively related to each of the seven subscales used by the NPI to measure the components of narcissism: vanity, exhibitionism, exploitativeness, authority, superiority, self-sufficiency, and entitlement. One experiment also reaffirmed previous research showing that narcissistic people were more likely to engage in risky sexual behaviors and less likely to remain in long-term romantic relationships.
Bushman also explained that scoring higher on the one-question quiz was linked to both good things and bad things. Those with higher narcissism scores were said to have more positive feelings, more extraversion, and marginally less depression. However, they were also less agreeable and more prone to anger, shame, guilt and fear.

Planet-Like Object May Have Once Been As Hot As A Star

redOrbit Staff & Wire Reports – Your Universe Online
An object that is currently as cool as a planet might have spent much of its youth as hot as a star, according to a new research paper appearing in the latest edition of the journal Monthly Notices of the Royal Astronomical Society.
The object’s temperature currently fluctuates between 200 and 300 degrees Fahrenheit (100 to 150 degrees Celsius), in between those of Earth and Venus, lead investigator David Pinfield of the University of Hertfordshire and his colleagues explained. However, it also shows evidence of a possible ancient origin, meaning it could have experienced a large change in temperature and might have been as hot as a star for millions of years.
Identified by Pinfield and his colleagues using NASA’s Wide-field Infrared Survey Explorer (WISE) observatory, a space telescope launched in 2009 that has captured images of the entire sky in mid-infrared light, the object has been dubbed WISE J0304-2705 and is a member of the recently established “Y dwarf” class.
WISE J0304-2705 joins 20 other Y dwarfs discovered to date, and while its current temperature is similar to that of a planet, the object is not a rocky world like Earth. Instead, it is a large ball of gas similar to Jupiter. The team also dispersed the light it emits into a spectrum, which allowed them to learn more about its history.
Among the Y dwarfs discovered to date, WISE J0304-2705 is defined as ‘peculiar’ due to unusual features in its emitted light spectrum. It underwent such extreme evolutionary cooling because it is “sub-stellar,” the authors said, meaning its interior never becomes hot enough for hydrogen fusion to occur.
Hydrogen fusion is the process that has kept our sun hot for several billion years; without an energy source to maintain a stable temperature, it was inevitable that the object would begin to cool down. If WISE J0304-2705 is an ancient object, then its temperature evolution would have been a four-stage process.
During the first stage, which would have lasted approximately 20 million years, the object would have been as hot as a star – reaching temperatures of at least 5,100 degrees Fahrenheit (2800 degrees Celsius). After roughly 100 million years, it would have cooled down to about 2,700 degrees Fahrenheit (1500 degrees Celsius).
In its third stage, after one billion years had passed, it would have been down to about 1,800 degrees Fahrenheit (1000 degrees Celsius). Ultimately, billions of years later, WISE J0304-2705 would have cooled to its current planetary temperature of just 200-300 degrees Fahrenheit (100-150 degrees Celsius).
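The four-stage timeline can be tabulated directly from the figures above; here is a quick sketch (the Fahrenheit values are computed from the Celsius figures, which the article rounds):

```python
# The four-stage cooling history described above, tabulated in one place.
# Celsius values come from the text; Fahrenheit is derived from them.
stages = [
    ("first ~20 million years", 2800),
    ("after ~100 million years", 1500),
    ("after ~1 billion years", 1000),
    ("after billions of years", 125),  # present day: roughly 100-150 C
]
for age, celsius in stages:
    print(f"{age:>26}: {celsius:4d} C ~ {celsius * 9 / 5 + 32:5.0f} F")
```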
“Our measurements suggest that this Y dwarf may have a composition and/or age characteristic of one of the Galaxy’s older members,” Pinfield explained in a statement Tuesday. “This would mean its temperature evolution could have been rather extreme – despite starting out at thousands of degrees.”
WISE J0304-2705, which is located in the Fornax (Furnace) constellation, is reportedly 20 to 30 times as massive as Jupiter, which the research team said is intermediate between the more massive stars and typical planets. The researchers also made observations using a trio of ground-based telescopes – the 8-meter Gemini South Telescope, the 6.5-meter Magellan Telescope and ESO’s 3.6-meter New Technology Telescope.

Surprise Discovery Could See Graphene Used To Improve Health

Monash University

A chance discovery about the ‘wonder material’ graphene – already exciting scientists because of its potential uses in electronics, energy storage and energy generation – takes it a step closer to being used in medicine and human health.

Researchers from Monash University have discovered that graphene oxide sheets can change structure to become liquid crystal droplets spontaneously and without any specialist equipment.

With graphene droplets now easy to produce, researchers say this opens up possibilities for its use in drug delivery and disease detection.

The findings, published in the journal ChemComm, build on existing knowledge about graphene. One of the thinnest and strongest materials known to man, graphene is a 2D sheet of carbon just one atom thick. With a ‘honeycomb’ structure the ‘wonder material’ is 100 times stronger than steel, highly conductive and flexible.

Dr Mainak Majumder from the Faculty of Engineering said because graphene droplets change their structure in response to the presence of an external magnetic field, it could be used for controlled drug release applications.

“Drug delivery systems tend to use magnetic particles which are very effective but they can’t always be used because these particles can be toxic in certain physiological conditions,” Dr Majumder said.

“In contrast, graphene doesn’t contain any magnetic properties. This combined with the fact that we have proved it can be changed into liquid crystal simply and cheaply, strengthens the prospect that it may one day be used for a new kind of drug delivery system.”

Usually atomizers and mechanical equipment are needed to change graphene into a spherical form. In this case, all the team did was put the graphene sheets in a solution to process them for industrial use. Under certain pH conditions, they found that graphene behaves like a polymer – changing shape by itself.

First author of the paper, Ms Rachel Tkacz from the Faculty of Engineering, said the surprise discovery happened during routine tests.

“To be able to spontaneously change the structure of graphene from single sheets to a spherical assembly is hugely significant. No one thought that was possible. We’ve proved it is,” Ms Tkacz said.

“Now we know that graphene-based assemblies can spontaneously change shape under certain conditions, we can apply this knowledge to see if it changes when exposed to toxins, potentially paving the way for new methods of disease detection as well.”

The team used an advanced version of a polarized light microscope – an instrument commonly used by jewelers – based at the Marine Biological Laboratory, USA, to detect minute changes to graphene.

Dr Majumder said collaborating with researchers internationally and accessing some of the most sophisticated equipment in the world was instrumental to the breakthrough discovery.

“We used microscopes similar to the ones jewelers use to see the clarity of precious gems. The only difference is the ones we used are much more precise due to a sophisticated system of hardware and software. This provides us with crucial information about the organization of graphene sheets, enabling us to recognize these unique structures,” Dr Majumder said.

Dr Majumder and his team are working with graphite industry partner, Strategic Energy Resources Ltd and an expert in polarized light imaging, Dr. Rudolf Oldenbourg from the Marine Biological Laboratory, USA, to explore how this work can be translated and commercialized.

The research was made possible by an ARC Linkage grant awarded to Strategic Energy Resources Ltd and Monash University and was the first linkage grant for graphene research in Australia.

“We are so pleased to be associated with Dr Majumder’s team at Monash University. The progress they have made with our joint project has been astonishing,” he said.



—–


Surinam Toad, Pipa pipa

The Surinam Toad or Star-Fingered Toad, Pipa pipa, is a species of frog belonging to the Pipidae family.

It can be found in Bolivia, Colombia, Brazil, Ecuador, French Guiana, Suriname, Peru, Guyana, Trinidad and Tobago, and Venezuela. Its natural habitats are subtropical or tropical moist lowland forests, swamps, freshwater marshes, and intermittent freshwater marshes. Loss of habitat is the main threat to this toad.

This toad resembles a leaf. It is almost totally flat, and its coloration is a mottled brown. Its feet are mostly webbed, with the front toes bearing small, star-like appendages. Individuals close to 8 inches long have been recorded, though 4 to 5 inches is much more typical. It has tiny eyes, no tongue, and no teeth.

These toads are best known for their extraordinary reproductive habits. Unlike most toads, the males of this species do not attract mates with croaks and other sounds frequently associated with these aquatic animals. Instead, they give off a sharp clicking sound by snapping the hyoid bone in their throat. The partners rise up from the floor while in amplexus and flip through the water in arcs. During each arc, the female discharges 3 to 10 eggs, which get embedded in the skin on her back by the movement of the male. After implantation, the eggs sink into the skin and form pockets over a period of several days, ultimately taking on the appearance of an irregular honeycomb. The larvae develop to the tadpole stage inside these pockets, eventually emerging from the mother’s back as fully developed toads, though they’re less than an inch long. As soon as they emerge, the young toads begin a largely solitary life.

Image Caption: The gray shape is the Suriname toad. Photo of Pipa pipa (Surinam toad) at the Steinhart Aquarium in San Francisco. Credit: Stan Shebs/Wikipedia (CC BY-SA 3.0)

Scientists Claim Hobbit-Like Remains Belonged To Human With Down Syndrome

redOrbit Staff & Wire Reports – Your Universe Online
Skeletal remains recovered from the Indonesian island of Flores over a decade ago are not a new species of “hobbit”-sized human, but an ancient Homo sapiens showing signs of abnormal development consistent with Down syndrome, an international team of researchers claims in a new Proceedings of the National Academy of Sciences study.
According to NBC News, the 15,000-year-old fossil identified as LB1 had previously been determined to be a new and distinct species known as Homo floresiensis. In the new study, however, scientists from the University of Adelaide in Australia, the National Institutes of Earth Sciences in China and Penn State University provide evidence that the creature in question was a regular human suffering from a pathological condition.
“The population that has become known as Homo floresiensis has been described as ‘the most extreme human ever discovered,’” the authors wrote in their study. While they said that LB1 was “unusual,” they noted that “craniofacial and postcranial characteristics originally said to be diagnostic of the new species are not evident in the other more fragmentary skeletons in the sample that resemble other recent small-bodied human populations in the region.”
“Here we demonstrate that the facial asymmetry, small endocranial volume, brachycephaly, disproportionately short femora, flat feet, and numerous other characteristics of LB1 are highly diagnostic of Down syndrome, one of the most commonly occurring developmental disorders in humans and also documented in related hominoids such as chimpanzees and orangutans,” they added.
The initial descriptions of Homo floresiensis focused on the unusual anatomical characteristics of LB1, the researchers explained. The creature reportedly had a cranial volume of only 380 milliliters (23.2 cubic inches), indicating that its brain was less than one third the size of an average modern human’s, and its short thigh bones suggested that the creature was only 1.06 meters (roughly 3.5 feet) tall.
While some of its traits were characterized as unique and indicative of a new species, others drew comparisons to earlier hominins, including Homo erectus and Australopithecus. After taking a close second look at the evidence, however, the authors discovered the original figure for cranial volume had been underestimated. They reported consistently finding cranial volumes of about 430 milliliters (26.2 cubic inches).
“The difference is significant, and the revised figure falls in the range predicted for a modern human with Down syndrome from the same geographic region,” Robert B. Eckhardt, professor of developmental genetics and evolution at Penn State, said in a statement. Likewise, the original height estimate was based in part on the creature’s short thigh bone; short thigh bones are also common in humans with Down syndrome.
“Unusual does not equal unique. The originally reported traits are not so rare as to have required the invention of a new hominin species,” Eckhardt added. “When we first saw these bones, several of us immediately spotted a developmental disturbance, but we did not assign a specific diagnosis because the bones were so fragmentary. Over the years, several lines of evidence have converged on Down syndrome.”
One potential indicator is the fact that the creature has an asymmetrical skull, which Discovery News explained is expected from a person with Down syndrome. In addition, this characteristic is one of many that appear only in the so-called hobbit skeleton, but not in other fossils recovered from the same location, the researchers noted.
“This work is not presented in the form of a fanciful story, but to test a hypothesis: Are the skeletons from Liang Bua cave sufficiently unusual to require invention of a new human species?” said Eckhardt. “Our reanalysis shows that they are not. The less strained explanation is a developmental disorder. Here the signs point rather clearly to Down syndrome, which occurs in more than one per thousand human births around the world.”

Email Scans Allow Google To Help Law Enforcement Nab Sex Offender

redOrbit Staff & Wire Reports – Your Universe Online
Google’s role in the arrest of a Houston, Texas man on child pornography charges has revealed that the Mountain View, California-based tech giant is quietly scanning its users’ emails for signs of illegal content.
According to Tim Wetzel of KHOU 11 News, 41-year-old John Henry Skillern was taken into custody after Google discovered that an email he was attempting to send contained explicit images of a young girl, and notified officials at the National Center for Missing and Exploited Children.
Skillern, who was already a registered sex offender after being convicted of sexually assaulting an eight-year-old boy in 1994, was taken into custody and charged with possession of child pornography, Wetzel said. Once they obtained a search warrant, investigators said they also discovered child porn on his smartphone and tablet computer.
“He was trying to get around getting caught, he was trying to keep it inside his email. I can’t see that information, I can’t see that photo, but Google can,” Detective David Nettles of the Houston Metro Internet Crimes Against Children Taskforce told KHOU last Wednesday. “I really don’t know how they do their job, but I’m just glad they do it.”
While Google said it would not reveal technical information about any individual case, and would not divulge the exact details of the email searches it conducts, Matthew Sparkes, Deputy Head of Technology at The Telegraph, explained that the company uses an automatic search that compares the unique codes generated by images.
Those codes, known as hashes, are created by running an image through a simple algorithm, then comparing the result to a database of hashes produced by known images of child abuse. Any match is said to be an almost certain indication that the account under review holds an illegal image, Sparkes explained, and the method spares Google from having to maintain its own database of illegal images.
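Sparkes’ description amounts to a hash-lookup pipeline. The sketch below illustrates the idea under simplifying assumptions: it uses a plain SHA-256 digest, which only matches byte-identical files, whereas production systems rely on robust perceptual hashes (such as Microsoft’s PhotoDNA) that survive resizing and re-encoding. The database entry shown is a placeholder.

```python
# Minimal sketch of hash-based matching against a database of known
# illegal images. Real systems use perceptual hashes robust to
# re-encoding; SHA-256 here matches only byte-identical files.
import hashlib

# Placeholder database of known-image hashes (in practice maintained by
# child-protection organizations, not assembled by the email provider).
known_hashes = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_hash(path: str) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def attachment_is_flagged(path: str) -> bool:
    """True if the attachment's hash matches a known-image hash."""
    return file_hash(path) in known_hashes
```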
“Google has hinted in the past that it performs searches on content, although did not elaborate on whether this was public or private data,” he said. “Until now it has never been confirmed that Google trawls information that is not on the public internet, but is contained within our private accounts such as GMail email messages.”
Google’s tip allowed law enforcement officials to obtain a warrant, and ultimately find locally stored images that otherwise would have been undetectable, Engadget’s Jon Fingas said. The case is somewhat unique because there were no websites or other public clues to the offender’s activities – Google was the only one that knew that something was amiss.
“That will undoubtedly raise concerns for some, since it wasn’t immediately apparent that Mountain View’s servers were checking Gmail images,” he added. “However, the activity isn’t a complete surprise. Google’s terms of service already indicate that the company is analyzing Gmail for both targeted ads and security – while illegal pornography isn’t explicitly mentioned in the terms, it only makes sense that this content would be considered as well.”
“With the rate that Gmail messages are scanned, and the fact that all US companies are bound by US law to report suspected child abuse, it is hardly surprising that this individual has found themselves on the wrong side of the law,” Emma Carr, acting director of UK privacy organization Big Brother Watch, told BBC News. However, she also called on the company to make it clear “what procedures and safeguards are in place to ensure that people are not wrongly criminalized.”

Elon Musk: Artificial Intelligence Potentially More Dangerous To Humans Than Nuclear Weapons

redOrbit Staff & Wire Reports – Your Universe Online
SpaceX and Tesla Motors founder Elon Musk has serious concerns over the safety of artificial intelligence, using social media to issue a warning that AI could pose a more serious threat to humanity than nuclear weapons.
As reported by Alyssa Newcomb of ABC News, Musk took to Twitter over the weekend to post that people need to be “super careful” with AI, which he described as “potentially more dangerous than nukes.”
He made the comments after reading (and recommending) the book “Superintelligence” by Swedish philosopher and Oxford professor Nick Bostrom, explained VentureBeat’s Tom Cheredar. The book, which explores what will happen if and when machines become more intelligent than humans, will be released in the US on September 1.
According to Rob Wile of Business Insider, in a blurb about the book, Bostrom’s colleague Martin Rees of Cambridge University said that “those disposed to dismiss an ‘AI takeover’ as science fiction may think again after reading this original and well-argued book.”
Wile added that Bostrom was asked at a recent conference whether or not people should be scared of new technology. He responded “Yes,” but added that humanity had to be “scared about the right things. There are huge existential threats, these are threats to the very survival of life on Earth, from machine intelligence – not the way it is today, but if we achieve this sort of super-intelligence in the future.”
Musk appears to agree with the author’s assessment, although it is interesting to note that in March, he invested in California-based AI group Vicarious – a firm that hopes to design a “computer that thinks like a person… except it doesn’t have to eat or sleep,” co-founder Scott Phoenix said, according to Ellie Zolfagharifard of the UK newspaper the Daily Mail.
“I think there is potentially a dangerous outcome there,” Musk said previously in an interview with CNBC, according to Zolfagharifard. “There have been movies about this, you know, like Terminator. There are some scary outcomes. And we should try to make sure the outcomes are good, not bad.”
The Daily Mail writer added that Vicarious is currently trying to develop a program that mimics the brain’s neocortex – the top layer of the cerebral hemispheres in the brain of mammals. The neocortex is approximately three millimeters thick, and has six layers. Each of those layers is involved with a variety of different biological functions, including sensory perception, spatial reasoning, conscious thought, and language in humans.
Back in May, internationally recognized theoretical physicist Stephen Hawking expressed similar concerns after viewing the Johnny Depp film Transcendence, in which Depp’s character has his consciousness uploaded into a quantum computer, only to grow more powerful and become virtually omniscient.
Hawking said the movie should not be dismissed as science fiction, and that ignoring the story’s deeper lessons would be “a mistake, and potentially our worst mistake in history.” While advances in AI such as driverless cars and digital assistants are often looked at as beneficial to mankind, Hawking expressed concern that they could ultimately lead to our downfall, unless we prepare for the potential risks presented by independently-thinking technology.
“The potential benefits are huge; everything that civilization has to offer is a product of human intelligence; we cannot predict what we might achieve when this intelligence is magnified by the tools that AI may provide, but the eradication of war, disease, and poverty would be high on anyone’s list,” he wrote in a column for the British newspaper The Independent. “Success in creating AI would be the biggest event in human history. Unfortunately, it might also be the last, unless we learn how to avoid the risks.”

Light Amount Of Electronic Gaming Linked To Positive Psychosocial Adjustment

redOrbit Staff & Wire Reports – Your Universe Online
Playing video games for no more than one hour per day can have a small but positive impact on childhood development, according to new research published Monday in the journal Pediatrics.
In his paper, study author and University of Oxford experimental psychologist Dr. Andrew Przybylski found that children and teenagers who played video games for less than 60 minutes per day tended to be better adjusted than those who had never played or those who played for more than three hours each day.
Dr. Przybylski’s research has revealed that, “despite widespread fears that video game usage is harmful,” children who participate in light daily gaming sessions “are happier, more sociable and less hyperactive than those who don’t play at all,” Emily Gosden of The Telegraph explained.
Youngsters who used gaming consoles for over three hours every day reported that they were less satisfied with their lives overall, noted BBC News health reporter Smitha Mundasad. The research also found no positive or negative effects associated with playing for one to three hours per day.
However, the research also suggests the overall influence of video games on children and teens, whether positive or negative, is extremely small in comparison to more “enduring” factors, such as the state of the child’s home, the quality of their relationships at school, and whether or not they are materially deprived.
“These results support recent laboratory-based experiments that have identified the downsides to playing electronic games,” Dr. Przybylski said in a statement. “However, high levels of video game-playing appear to be only weakly linked to children’s behavioral problems in the real world.”
“Likewise, the small, positive effects we observed for low levels of play on electronic games do not support the idea that video games on their own can help children develop in an increasingly digital world,” he added. “Some of the positive effects identified in past gaming research were mirrored in these data but the effects were quite small, suggesting that any benefits may be limited to a narrow range of action games.”
Dr. Przybylski reviewed survey results from 5,000 youngsters between the ages of 10 and 15, Mundasad said, and three-fourths of those who responded said that they played video games every day. The participants were asked to quantify how much time they spent on the hobby on a typical school day, then rated a series of other factors, including overall satisfaction with their lives, their relationships with peers, and their hyperactivity levels.
Those who played less than one hour per day were more likely to express satisfaction with their lives and demonstrated the highest levels of positive social interactions, outperforming even non-gamers in these categories. Furthermore, this group also had fewer problems with emotional issues and lower levels of hyperactivity. Those who spent over three hours each day on computers or consoles were found to be the least well adjusted.
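Mechanically, the comparison described above is a bucketing of daily play time followed by group averages. Here is a minimal sketch of that kind of analysis; the column names and values are invented, not Dr. Przybylski’s dataset.

```python
# Hypothetical sketch: bucket daily play time into <1h, 1-3h, and >3h
# bands and compare mean well-being scores per band. Data are invented.
import pandas as pd

df = pd.DataFrame({
    "hours_per_day": [0.5, 2.0, 4.0, 0.0, 0.8, 3.5, 1.5],
    "life_satisfaction": [7.1, 6.4, 5.2, 6.0, 7.3, 5.0, 6.5],
})

df["play_band"] = pd.cut(
    df["hours_per_day"],
    bins=[-0.01, 1, 3, 24],  # band edges: 0-1h, 1-3h, 3-24h
    labels=["light (<1h)", "moderate (1-3h)", "heavy (>3h)"],
)

print(df.groupby("play_band", observed=True)["life_satisfaction"].mean())
```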
“In a research environment that is often polarized between those who believe games have an extremely beneficial role and those who link them to violent acts, this research could provide a new, more nuanced standpoint,” Dr. Przybylski told BBC News. “Being engaged in video games may give children a common language. And for someone who is not part of this conversation, this might end up cutting the young person off.”
He also explained to Gosden that “high levels of video game-playing appear to be only weakly linked to children’s behavioral problems in the real world,” and that the small benefits for low levels of play “do not support the idea that video games on their own can help children develop in an increasingly digital world.”
The author went on to call for additional research to help determine which types of video games were the most harmful and the most beneficial, and said that there was “little scientific basis” to support recommended time limits on playing video games, according to The Telegraph.

Caltech Team Develops Technique To Create See-Through Organs

redOrbit Staff & Wire Reports – Your Universe Online
In a discovery that could improve our general knowledge of biology, lead to more accurate clinical diagnoses and disease monitoring, and serve as a catalyst for new therapies for a variety of conditions, researchers from the California Institute of Technology (Caltech) have discovered a way to see through tissues and organs.
The study, which appears in the July 31 edition of the journal Cell, details simple methods that can make opaque organs, bodies, and human tissue biopsies transparent without altering the cellular structures and connections. It could help scientists who study developmental problems and diseases get a better look inside an organism to pinpoint the exact biological issue.
“Large volumes of tissue are not optically transparent – you can’t see through them,” senior author Viviana Gradinaru, an assistant professor of biology at Caltech and the principal investigator in the team behind the new technique, explained in a statement.
“So, if we need to see individual cells within a large volume of tissue,” such as a tumor biopsy, “we have to slice the tissue very thin, separately image each slice with a microscope, and put all of the images back together with a computer,” she added. “It’s a very time-consuming process and it is error prone, especially if you look to map long axons or sparse cell populations such as stem cells or tumor cells.”
According to Gradinaru and her colleagues, lipids are dispersed through cells and provide structural support, but they also prevent light from passing through those cells. However, the researchers have devised a way to save time by making an organism’s entire body clear, allowing them to view it in 3D using standard optical methods such as confocal microscopy (which increases resolution and contrast by eliminating out-of-focus light).
Previously, the Caltech team created a technique known as CLARITY, in which a rodent brain was infused with a solution of lipid-dissolving detergents and a water-based polymer hydrogel (which provided structural support). This made the tissue appear clear but left its 3D architecture intact for study.
Building on that method, Gradinaru and her collaborators created a transparent whole-brain specimen. The new technique, perfusion-assisted agent release in situ (PARS), uses an organism’s own network of blood vessels to quickly deliver the lipid-dissolving hydrogel and chemical solution throughout the body.
Once the target area is made transparent, researchers can use standard microscopy techniques to easily look through a thick mass of tissue to focus on specific single cells that had been genetically marked with fluorescent proteins. Even without those proteins, however, the study authors said that the PARS method can be adapted to deliver stains and dyes to specific cell types when whole-body clearing is not required.
When used at the level of an individual organ, the technique is known as the passive clarity technique (PACT). To ensure that stripping lipids from cells did not remove other molecules, such as RNA, DNA or proteins, Gradinaru recruited Caltech colleague and assistant professor of chemistry Long Cai to verify that strands of RNA were still present and detectable with single-molecule resolution in the cells of the transparent organisms.
“Although the idea of tissue clearing has been around for a century, to our knowledge, this is the first study to perform whole-body clearing, as opposed to first extracting and then clearing organs outside the adult body,” explained Gradinaru. “Our methodology has the potential to accelerate any scientific endeavor that would benefit from whole-organism mapping, including the study of how peripheral nerves and organs can profoundly affect cognition and mental processing, and vice versa.”
“Our easy-to-use tissue clearing protocols, which employ readily available and cost-effective reagents and equipment, will make the subcellular interrogation of large tissue samples an accessible undertaking within the broader research and clinical communities,” she added.

Strengthening Of Pacific Trade Winds Attributed To Warming In The Atlantic

redOrbit Staff & Wire Reports – Your Universe Online

Climate scientists have finally solved the mystery as to why the equatorial Pacific trade winds, which were expected to get weaker due to increasing greenhouse gas levels, have actually gotten stronger in recent years.

For more than a decade, experts have wondered why the trade winds have behaved in contrast to climate models and become supercharged since the early 1990s. The phenomenon, according to a team of US and Australian scientists, is the result of recent rapid warming in the Atlantic Ocean likely due to global climate change.

Writing in the August 3 online edition of the journal Nature Climate Change, researchers explain that the Pacific trade winds are currently blowing at unprecedented levels, surpassing all records dating back to the 1860s. As a result, sea level rise in the western Pacific has been accelerated, which has an impact not just on the regional climate but global conditions as well.

“We were surprised to find the main cause of the Pacific climate trends of the past 20 years had its origin in the Atlantic Ocean,” co-lead author Dr. Shayne McGregor from the ARC Centre of Excellence for Climate System Science (ARCCSS) at the University of New South Wales said in a statement Sunday. “It highlights how changes in the climate in one part of the world can have extensive impacts around the globe.”

“We saw that the rapid Atlantic surface warming observed since the early 1990s, induced partly by greenhouse gasses, has generated unusually low sea level pressure over the tropical Atlantic,” he added. “This, in turn, produces an upward motion of the overlying air parcels. These parcels move westward aloft and then sink again in the eastern equatorial Pacific, where their sinking creates a high pressure system.”

The resulting difference in pressure between the Atlantic and the Pacific strengthened the Pacific trade winds, noted Dr. McGregor. The increase in the winds has caused cooling in the eastern tropical Pacific region, amplified the California drought, and slowed the increase of global average surface temperatures over the last 13 years.

The increased intensity of these winds, previously believed to be a response to Pacific decadal variability, could also be responsible for reducing the frequency of El Niño events over the past decade by cooling ocean surface temperatures in the eastern Pacific, University of New South Wales and University of Hawaii researchers said.

“The rising air parcels over the Atlantic eventually sink over the eastern tropical Pacific, thus creating higher surface pressure there. The enormous pressure see-saw with high pressure in the Pacific and low pressure in the Atlantic gave the Pacific trade winds an extra kick, amplifying their strength,” co-lead author and Hawaii professor Axel Timmermann said. “It’s like giving a playground roundabout an extra push as it spins past.”

“Our study documents that some of the largest tropical and subtropical climate trends of the past 20 years are all linked: Strengthening of the Pacific trade winds, acceleration of sea level rise in the western Pacific, eastern Pacific surface cooling, the global warming hiatus, and even the massive droughts in California,” added co-author Malte Stuecker of the University of Hawaii Meteorology Department.

Co-author Matthew England from the University of New South Wales noted that it “will be difficult to predict when the Pacific cooling trend and its contribution to the global warming hiatus will come to an end. The natural variability of the Pacific, associated for instance with the El Niño-Southern Oscillation, is one candidate that could drive the system back to a more even Atlantic–Pacific warming situation.”


Experts Call For Ecological Impact Of Fracking To Become A ‘Research Priority’

redOrbit Staff & Wire Reports – Your Universe Online
The amount of natural gas produced from shale rock has increased by over 700 percent in the past seven years, leading the authors of a new Frontiers in Ecology and the Environment paper to call for the practice to become a “top research priority” so that they can better understand its environmental impact.
In the study, eight conservation biologists representing a variety of organizations and institutions cite new reasons why the scientific community, shale industry representatives and policymakers need to work together in order to limit the damage to the planet’s ecology resulting from the extraction of natural gas through hydraulic fracturing, also known as fracking.
“We can’t let shale development outpace our understanding of its environmental impacts,” co-author Morgan Tingley, a postdoctoral research associate at the Princeton University Woodrow Wilson School of Public and International Affairs, said in a statement. “The past has taught us that environmental impacts of large-scale development and resource extraction, whether coal plants, large dams or biofuel monocultures, are more than the sum of their parts.”
Hydraulic fracturing, which uses deep injection of high-pressure aqueous chemicals to create fractures that release trapped natural gas for extraction, is expected to increase exponentially over the next 30 years, the researchers explained. Before that happens, though, they warn that the potential impact of chemical contamination from spills, well-casing failures and other accidents must first be determined.
The study also claims that one of the greatest threats to both plants and animals is the cumulative impact of rapid, widespread shale development, as each individual well contributes to collective air, water, noise and light pollution. Furthermore, the authors report that the lack of accurate and reliable record-keeping of wastewater disposal and fracturing-related spills and accidents has made it difficult to learn more about the practice’s effects.
The researchers discovered that only five of the 24 US states that have active shale-gas reservoirs (Pennsylvania, Colorado, New Mexico, Wyoming and Texas) keep public records of spills and accidents, limiting the “direct and quantifiable evidence” of the shale gas industry’s impact on the environment. They also found that a single gas well requires between 3.7 and 7.6 acres (1.5 to 3.1 hectares) of vegetation to be cleared.
A review of chemical disclosure statements for 150 wells located in three top-gas producing US states discovered that two-thirds of those wells were fractured with at least one undisclosed chemical, and that some of the wells included in the chemical disclosure registry used fluids containing 20 or more undisclosed chemicals. The authors are concerned, considering that fracturing fluid and wastewater can include carcinogens and radioactive substances.
“Past lessons from large scale resource extraction and energy development – large dams, intensive forestry, or biofuel plantations – have shown us that development that outpaces our understanding of ecological impacts can have dire unintended consequences,” said Maureen Ryan, a research fellow in the University of Washington’s School of Environmental and Forest Sciences. “It’s our responsibility to look forward.”
Image 2 (below): As illustrated above, each gas well can act as a source of air, water, noise and light pollution that — individually and collectively — can interfere with wild animal health, habitats and reproduction. Of particular concern is the fluid and wastewater associated with hydraulic fracturing, or “fracking,” a technique that releases natural gas from shale by breaking the rock up with a high-pressure blend of water, sand and other chemicals. Credit: Image courtesy of Frontiers in Ecology and the Environment

Hashtag Help Might Be On The Way For Confused Twitter Users

redOrbit Staff & Wire Reports – Your Universe Online
If you’re a casual Twitter user who finds yourself confused when you see things like #tbt or #smh, the social network is reportedly testing a new feature that will hopefully demystify the confusing world of the hashtag.
According to The Wall Street Journal, a new feature in the Twitter app for iOS adds an expanded description to select hashtag searches. For instance, the new system clarifies that #tbt means “Throwback Thursday” (usually used by people posting old photos on the correct day of the week) and #smh represents “shaking my head” (usually in disbelief at something).
Other hashtags discovered by WSJ reporters Elliot Bentley and Yoree Koh include #oitnb (for the TV series “Orange Is The New Black”), #lol (for the popular online video game “League of Legends”), #manutd (for the Manchester United soccer/football team), #hhldn (the media and technology event Hacks/Hackers London) and #rt (which, surprisingly, is said to represent media outlet Russia Today, not retweet as one might expect).
“The labeling gives the hashtags a sense of legitimacy and order as related to a certain event or subject,” Bentley and Koh said, adding that it “wasn’t clear how these labels were generated. Some included an option for users to rate their accuracy. Many hashtags, such as #MH17 for Malaysia Airlines Flight 17, didn’t trigger the feature.” A Twitter representative declined the Wall Street Journal reporters’ request for comment on the matter.
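The expansions reported so far amount to a simple lookup table; a rough sketch is below (how Twitter actually generates the labels has not been disclosed, so this is purely illustrative):

```python
# The hashtag expansions reported by the WSJ, as a plain lookup table.
# How Twitter generates these labels server-side was not disclosed.
hashtag_labels = {
    "#tbt": "Throwback Thursday",
    "#smh": "shaking my head",
    "#oitnb": "Orange Is The New Black",
    "#lol": "League of Legends",
    "#manutd": "Manchester United",
    "#hhldn": "Hacks/Hackers London",
    "#rt": "Russia Today",
}

def label(tag: str) -> str:
    """Return the expanded description, or the raw tag if unknown."""
    return hashtag_labels.get(tag.lower(), tag)

print(label("#TBT"))  # -> Throwback Thursday
```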
PCWorld contributor Nick Mediati pointed out that this apparent new feature is not yet available to everyone. As it did with the Mute tool and various other interface changes, Twitter typically rolls out changes to a limited group of users before making them available to the general public. The enhanced hashtag explanations are part of the microblogging website’s attempts to make its service more attractive to new users, Mediati added.
“Even with Twitter’s stock price on an upward climb after impressive Q2 results, the service still needs to attract more users into the fold if it’s to continue to satisfy shareholders,” said David Nield of Digital Trends. Explaining exactly what hashtags stand for “should make the network less daunting and more accessible to the average user,” he added, and are part of the company’s plans “to improve the way Twitter handles events and major topics.”
Of course, these changes could ultimately be moot, should Twitter opt to go ahead with plans originally discussed earlier this year that would completely do away with both hashtags and replies that require use of the ‘@’ symbol. In fact, in discussing the possibility of doing away with those features at a March media conference, Twitter’s Vivian Schiller referred to hashtags and @ symbols as “arcane” elements that the company wanted to move “into the background.”
“The hashtag may be too ingrained into people’s habits to kill off entirely – even Hollywood and big business has caught on to the practice,” said Mediati, “but providing additional information for popular hashtags should help bring some clarity to the sometimes confusing Twitterverse.”

Has NASA Confirmed The “Impossible” No-Fuel Space Engine Works?

redOrbit Staff & Wire Reports – Your Universe Online
An experimental microwave thruster that does not require fuel to operate has been dubbed the “impossible” space engine, but a team of NASA researchers has reportedly confirmed the system actually works.
According to Mike Wall of Space.com, researchers from the US space agency’s Houston-based Johnson Space Center evaluated the system and found that it actually is capable of generating a small amount of thrust. If this is true, it could lead space travel to become far less expensive and allow astronomers to explore more of the cosmos.
“Test results indicate that the RF [radio frequency] resonant cavity thruster design, which is unique as an electric propulsion device, is producing a force that is not attributable to any classical electromagnetic phenomenon and, therefore, is potentially demonstrating an interaction with the quantum vacuum virtual plasma,” the NASA team wrote in a study presented last Wednesday during the 50th Joint Propulsion Conference in Cleveland, Ohio.
“NASA is a major player in space science, so when a team from the agency this week presents evidence that ‘impossible’ microwave thrusters seem to work, something strange is definitely going on. Either the results are completely wrong, or NASA has confirmed a major breakthrough in space propulsion,” added David Hambling of Wired UK.
The device in question is known as the EmDrive, and it was invented several years ago by UK scientist Roger Shawyer, according to Hambling. It converts electric power into thrust without requiring propellant by bouncing microwaves around inside a closed container, he added.
While Shawyer had constructed several demonstration systems, his relativity-based theory has been roundly rejected by critics who claim that it simply cannot work because it violates the law of conservation of momentum. However, in 2012, a team of Chinese Academy of Sciences researchers successfully built their own version of the system, and discovered that it is able to generate enough thrust to potentially power a satellite.
After building and testing their version of the EmDrive, they reported in November 2012 that their 2.45 GHz prototype successfully produced 720 millinewtons of thrust at an input power of 2.5 kW, explained PC Mag writer Damon Poeter. Based on those numbers, the EmDrive would be capable of producing enough propulsion for “a practical satellite thruster,” said Wired UK.
After that, a US scientist by the name of Guido Fetta built his own device, known as the Cannae Drive. At Fetta’s request, NASA warp drive researcher Sonny White and his colleagues tested the engine during an eight-day span in August 2013, and found that it produces 30 to 50 micronewtons of thrust – a tiny fraction of a percent of that claimed by the Chinese team, but “emphatically a positive result,” according to the NASA team.
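Taking the quoted figures at face value, the gap between the two teams’ results is easy to check with a few lines of arithmetic:

```python
# Arithmetic check on the thrust figures quoted above.
chinese_thrust = 720e-3      # N (720 mN at 2.5 kW input, 2012 Chinese test)
nasa_range = (30e-6, 50e-6)  # N (NASA's 2013 eight-day Cannae Drive test)

for nasa_thrust in nasa_range:
    print(f"{nasa_thrust / chinese_thrust:.4%} of the Chinese result")
# -> 0.0042% and 0.0069%: a tiny fraction of a percent, as stated above
```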
While both EmDrives and Cannae Drives are “clearly still in the experimental stages,” and there are questions surrounding the discrepancies in the results of each team’s efforts, Poeter said that the “innovative propulsion system is likely to get a long, studious look” from “a space community looking at everything from solar sails to ion drives as a means to travel more efficiently between the planets – as well as for potential asteroid deflection missions.”
However, John Timmer of Ars Technica cautions against becoming too optimistic about these so-called “impossible” space engines, as there are still many questions surrounding the technology. For instance, in addition to the thrust recorded in the main test, thrust was also observed in a configuration in which the unit was not expected to produce any – in short, even the negative control in the experiment appeared to work.
According to Timmer, this result suggests “that the experiment as a whole tells you nothing. Clearly, the device (even when disabled) appears to produce a force.” There are several ways in which this could happen, he added, and there are ways that experts can monitor the device’s operation in order to see what factors could play a role. The force detected by the researchers could reportedly even be the result of a mass imbalance of less than 3 mg.

Pepper And Halt: Spicy Chemical May Inhibit Gut Tumors

University of California, San Diego Health System

Researchers at the University of California, San Diego School of Medicine report that dietary capsaicin – the active ingredient in chili peppers – produces chronic activation of a receptor on cells lining the intestines of mice, triggering a reaction that ultimately reduces the risk of colorectal tumors.

The findings are published in the August 1, 2014 issue of The Journal of Clinical Investigation.

The receptor or ion channel, called TRPV1, was originally discovered in sensory neurons, where it acts as a sentinel for heat, acidity and spicy chemicals in the environment. “These are all potentially harmful stimuli to cells,” said Eyal Raz, MD, professor of Medicine and senior author of the study. “Thus, TRPV1 was quickly described as a molecular ‘pain receptor.’ This can be considered to be its conventional function, which all takes place in the nervous system.”

But Raz and colleagues have found that TRPV1 is also expressed by epithelial cells of the intestines, where it is activated by the epidermal growth factor receptor, or EGFR. EGFR is an important driver of cell proliferation in the intestines, whose epithelial lining is replaced approximately every four to six days.

“A basic level of EGFR activity is required to maintain the normal cell turnover in the gut,” said Petrus de Jong, MD, first author of the study. “However, if EGFR signaling is left unrestrained, the risk of sporadic tumor development increases.”

The scientists discovered that TRPV1, once activated by the EGFR, initiates a direct negative feedback on the EGFR, dampening the latter to reduce the risk of unwanted growth and intestinal tumor development. They found that mice genetically modified to be TRPV1-deficient suffered higher-than-normal rates of intestinal tumor growths.
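The logic of that feedback loop can be shown with a toy calculation: if TRPV1 activity rises in proportion to EGFR activity and in turn damps it, activity settles at a stable level, whereas removing the feedback lets it grow unchecked. The numbers below are arbitrary, not a biological model from the study.

```python
# Toy illustration of a direct negative feedback loop (arbitrary numbers).
def simulate(feedback_strength: float, steps: int = 50) -> float:
    egfr = 0.5                             # arbitrary starting activity
    for _ in range(steps):
        trpv1 = feedback_strength * egfr   # TRPV1 activated by EGFR
        egfr += 0.1 * egfr - trpv1 * egfr  # growth minus feedback damping
    return egfr

print(f"with TRPV1 feedback:     {simulate(0.1):7.2f}")  # settles near 1.0
print(f"TRPV1-deficient (none):  {simulate(0.0):7.2f}")  # grows unchecked
```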

“These results showed us that epithelial TRPV1 normally works as a tumor suppressor in the intestines,” said de Jong. In addition, molecular studies of human colorectal cancer samples recently uncovered multiple mutations in the TRPV1 gene, though Raz noted that currently there is no direct evidence that TRPV1 deficiency is a risk factor for colorectal cancer in humans.

“A direct association between TRPV1 function and human colorectal cancer should be addressed in future clinical studies,” he said.

But if such proves to be the case, the current study suggests one potential remedy might be spicy capsaicin, which acts as an irritant in mammals, generating a burning sensation on contact with tissue. Capsaicin is already broadly used as an analgesic in topical ointments, where its properties as an irritant overwhelm nerves, rendering them unable to report pain for extended periods of time. It’s also the active ingredient in pepper spray.

The researchers fed capsaicin to mice genetically prone to developing multiple tumors in the gastrointestinal tract. The treatment resulted in a reduced tumor burden and extended the lifespans of the mice by more than 30 percent. The treatment was even more effective when combined with celecoxib, a COX-2 non-steroidal anti-inflammatory drug already approved for treating some forms of arthritis and pain.

“Our data suggest that individuals at high risk of developing recurrent intestinal tumors may benefit from chronic TRPV1 activation,” said Raz. “We have provided proof-of-principle.”


History Of Culture Transformed Into Visual Representation

University of Miami

University of Miami Physicist on Multidisciplinary Team That Explored Cultural Migration Through Big Data Visualization

Quantifying and transforming the history of culture into visual representation isn’t easy. There are thousands of individual stories, across thousands of years, to consider, and some historical conditions are nearly impossible to measure.

Addressing this challenge, Dr. Maximilian Schich, associate professor of arts and technology at The University of Texas at Dallas, brought together a team of network and complexity scientists, including University of Miami physicist Chaoming Song, to create and quantify a big picture of European and North American cultural history.

Schich and his fellow researchers reconstructed the migration and mobility patterns of more than 150,000 notable individuals over a time span of two thousand years. By connecting the birth and death locations of each individual, and drawing and animating lines between the two locations, Schich and his team have made progress in our understanding of large-scale cultural dynamics.

The research is detailed in the article “Historical Patterns in Cultural History,” published Aug. 1 in the journal Science.

Song, an assistant professor in the Department of Physics at the University of Miami College of Arts and Sciences, is a co-author of the study. A statistical physicist, Song works at the intersection of statistical physics, network science, biological science and computational social science, broadly exploring patterns behind petabytes of data. His role in this project was primarily data analysis and model development.

“My research approach is mainly based on statistical physics, a sub-branch of physics that helps to understand the connections between macroscopic phenomena and microscopic details,” Song said.

“The study draws a surprisingly comprehensive picture of European and North American cultural interaction that can’t be otherwise achieved without consulting vast amounts of literature or combing discrete datasets,” Schich said. “This study functions like a macro-scope, where quantitative and qualitative inquiries complement each other.”

Schich and his colleagues collected the birth and death data from three databases to track migration networks within and out of Europe and North America, revealing a pattern of geographical birth sources and death attractors.

“The resulting network of locations provides a macroscopic perspective of cultural history, which helps us retrace cultural narratives of Europe and North America using large-scale visualization and quantitative dynamical tools, and to derive historical trends of cultural centers beyond the scope of specific events, or narrow time intervals,” said Song.

Other findings show that despite the dependence of the arts on money, cultural centers and economic centers do not always coincide, and that the population size of a location does not necessarily point to its cultural attractiveness. In addition, the median physical distance between birth and death locations changed relatively little on average between the 14th and 21st centuries, growing from about 214 kilometers (133 miles) to about 382 km (237 miles).
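For the technically inclined, the paper’s core construction is straightforward to sketch: treat each individual as an edge from birth location to death location, then aggregate the edges. The records and coordinates below are invented placeholders, not entries from the study’s databases.

```python
# Minimal sketch of the study's construction: one birth->death edge per
# individual; aggregate edges to find death "attractors" and compute the
# median birth-death distance. Sample records are invented placeholders.
from math import radians, sin, cos, asin, sqrt
from statistics import median
from collections import Counter

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

# (birth_city, birth_lat, birth_lon, death_city, death_lat, death_lon)
records = [
    ("Bonn", 50.73, 7.10, "Vienna", 48.21, 16.37),
    ("Ulm", 48.40, 9.99, "Princeton", 40.35, -74.66),
    ("Madrid", 40.42, -3.70, "Paris", 48.86, 2.35),
]

distances = [haversine_km(b_lat, b_lon, d_lat, d_lon)
             for _, b_lat, b_lon, _, d_lat, d_lon in records]
print(f"median birth-death distance: {median(distances):.0f} km")

print(Counter(r[3] for r in records).most_common(1))  # top death attractor
```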

Art and cultural history is an uncommon topic for papers in journals such as Science.

“A large amount of multidisciplinary expertise was necessary to arrive at the results we found,” Schich said. “The paper relies on the fields of art history, complex networks, complexity science, computational sociology, human mobility, information design, physics and some inspiration from systems biology.”

“There is an increasing realization that systems across different disciplines often share similar structural and dynamic properties,” said Song. “Such similarities offer new perspectives and unique opportunities for physicists to apply their methodologies on a much broader set of phenomena.”

Researchers involved in the study came from the groups of Dirk Helbing at ETH Zurich (the Swiss Federal Institute of Technology) and Albert-László Barabási at Northeastern University. Current affiliations of the team include the following institutions: Central European University in Budapest; Harvard Medical School; IBM Research; Indiana University; Ludwig-Maximilians-University in Munich; and University of Miami. Data was collected from Freebase.com, the Allgemeines Künstlerlexikon, the Getty Union List of Artist Names and the Winckelmann Corpus.

The research was funded by the German Research Foundation, the European Research Council and UT Dallas.

Can Teenagers Develop Fibromyalgia?

There are certain medical conditions that still leave the medical world confused and in debate, and fibromyalgia is one of them. Although not a disease proper, this set of symptoms has long been connected to many other diseases and medical conditions, and for a very long time people thought it was just a form of depression (which it very frequently accompanies). More recently, however, medical specialists have started to agree that fibromyalgia should be researched as a condition in its own right.

To date, no research has been able to show exactly what causes fibromyalgia, so no actual cure has been developed for it. Yet the separate symptoms can be treated individually, and many people have learned to live with the disorder as well as possible.

Fibromyalgia and What It Actually Is

Defining fibromyalgia is extremely difficult, as it is characterized by many symptoms and adjacent medical conditions. Fibromyalgia is a syndrome mostly characterized by widespread pain – but this is not its only symptom, which makes this particular condition quite difficult to diagnose. In fact, many of those suffering from fibromyalgia are initially diagnosed with other diseases and medical conditions, such as depression or multiple sclerosis.

In addition to pain, there are many other symptoms patients experience. Some patients experience a large number of them (or even all of them, sporadically or not), while others experience only a few. These symptoms can differ a lot from one person to another, and they can be very misleading for the doctor making the diagnosis. Here is a list of some of the most commonly encountered symptoms:

  • Depression
  • Anxiety
  • Sleeping issues (such as insomnia and/or restless legs syndrome, or waking up as tired as when you went to sleep)
  • Muscle pain
  • Muscle twitching
  • Irritable bowel syndrome (characterized by bloating, gas, diarrhea, vomiting, nausea and so on)
  • Irritable bladder (frequent urination)
  • Fatigue (which can range from moderate to severe)
  • Lack of energy
  • Jaw tenderness
  • Facial tenderness
  • Migraines
  • Numbness and tingling in the legs, arms, hands
  • A feeling of swelling without actual swelling

These symptoms can sometimes be experienced at a higher intensity when the patient is going through a stressful period, when there are hormonal fluctuations, when the patient is depressed, when there is a longer period of inactivity and so on.


What Causes Fibromyalgia, Actually?

This is a question that not just patients, but also medical researchers, ask themselves every day. The truth is that fibromyalgia’s causes have not been determined exactly. There are many circulating theories and some known risk factors, but sometimes the line between what appears to be a cause and what appears to be a symptom is very thin – which, again, makes it nearly impossible to develop a cure for this syndrome.

Many medical researchers argue that fibromyalgia is closely connected to the way in which the body senses pain. Due to smaller amounts of serotonin secreted by the brain, the body may start feeling pain at a higher intensity than normal. However, this drop in serotonin levels has not itself been explained.

Some also say that injury to the brain or spine can lead to the onset of fibromyalgia. Others connect fibromyalgia with depression and with the drop in serotonin levels experienced by patients with both conditions. And then there are the researchers who point out that people from families where one or more members suffer from fibromyalgia show an increased risk of developing the condition as well (which may be related to certain genes that appear to be altered in those with fibromyalgia).

Finally, poor physical condition can also influence whether a person develops fibromyalgia, because exercise releases chemicals important to the body and makes it generally more resistant to pain.

Fibromyalgia in Teens

Unfortunately, more than 5 million people in the United States alone suffer from this condition. Teenagers and children do not commonly develop the syndrome; women over the age of 18 seem to be the ones most at risk.

However, an estimated 1% to 7% of children are diagnosed with, or suspected of having, fibromyalgia. If fibromyalgia is very difficult to diagnose in adults, it can be even more difficult in children and teens, as the symptoms appear to be even more elusive.

One of the most difficult things about fibromyalgia in teens (and in adults as well) is that all the symptoms seem to be interconnected somehow, forming a vicious cycle. A teen who doesn’t sleep well will feel fatigued, and a fatigued person can feel pain much more intensely than a person who is well rested, for example. The same goes for all the other symptoms displayed by fibromyalgia patients (be they teenagers or not).

Anxiety and depression, sleep disturbances, stomachaches, headaches, memory issues, dizziness – these are all commonly encountered in teenagers and children who suffer from fibromyalgia. But one of the main things a doctor will do in order to diagnose the condition is press on the 18 so-called “tender points” on the human body.

If the teen displays pain in some or all of these tender points, the doctor will proceed with further examinations that could reveal the presence of fibromyalgia. One such test, called FM/a, is among the better diagnostic tools; it analyzes markers in the blood that research suggests are found only in people with fibromyalgia.

As for treatment, the doctor will probably not prescribe the same kind of medication as he/she would prescribe for an adult (as the safety and effectiveness of these drugs have not been sufficiently tested on children and teens).

However, he/she can prescribe pain medication, exercise and lifestyle changes, as well as a series of other treatments and remedies meant to ease the symptoms for teenagers with this syndrome.

Did Reduced Testosterone Levels Help Human Culture Advance?

redOrbit Staff & Wire Reports – Your Universe Online
Changes in the human skull occurring approximately 50,000 years ago indicate that the rise of culture occurred around the same time as a reduction in testosterone levels, according to new research appearing in the August 1 edition of the journal Current Anthropology.
In the study, lead author Robert Cieri, a biology graduate student at the University of Utah who began this work as a senior at Duke University, and his colleagues argue that people started making art and using advanced tools only after they became nicer to each other. Those gentler personalities required slightly reduced testosterone levels, they added – a condition suggested by the more feminine features found in skulls recovered from that era.
“The modern human behaviors of technological innovation, making art and rapid cultural exchange probably came at the same time that we developed a more cooperative temperament,” Cieri explained in a statement Friday.
The research team based their theory on measurements of over 1,400 ancient and modern skulls. Their efforts revealed that more recent modern humans had rounder features and a much less prominent brow, and those changes can be traced back directly to the impact of testosterone levels on the skeleton.
They are not certain if the bones indicate that these individuals had less testosterone in circulation, or fewer receptors for the hormone. However, fellow investigators and Duke University animal cognition researchers Brian Hare and Jingzhi Tan state that this hypothesis is in line with what has previously been established in non-human species.
For example, selective breeding of Siberian foxes was eventually able to produce creatures that were less aggressive towards humans and were more juvenile in appearance and behavior after several generations. Hare said that observing a process that leads to these types of changes in other creatures might also explain human behavior.
“It might help explain who we are and how we got to be this way,” said Hare, who researches differences between humans and their closest ape relatives, the more aggressive chimpanzees and the more laid-back bonobos. Those apes develop differently, he said, and they respond to social stress in different ways.
Male chimpanzees experience a strong rise in testosterone during puberty, but bonobos do not, the researchers explained. Under stressful conditions, chimps produce more testosterone, but bonobos do not. Instead, they produce more of the stress hormone cortisol. In addition to differences in the social activity of each creature, their faces are also said to be different as well – a finding the study authors call relevant to their work.
Cieri’s team compared the brow ridge, facial shape and interior volume of skulls from three different groups of humans: 13 belonging to modern humans more than 80,000 years old, 41 from people who lived between 10,000 to 38,000 years ago, and 1,367 20th century skulls from 30 different ethnic populations. They found an overall reduction in brow ridge and shortening of the upper face over time, reflecting a gradual reduction in testosterone action.
“There are a lot of theories about why, after 150,000 years of existence, humans suddenly leapt forward in technology,” Karl Leif Bates of Duke University explained. “Around 50,000 years ago, there is widespread evidence of producing bone and antler tools, heat-treated and flaked flint, projectile weapons, grindstones, fishing and birding equipment and a command of fire.”
“Was this driven by a brain mutation, cooked foods, the advent of language or just population density? The Duke study argues that living together and cooperating put a premium on agreeableness and lowered aggression and that, in turn, led to changed faces and more cultural exchange,” Bates added.
Cieri concluded that as prehistoric men and women began living closer together and passing down new technologies, they would inevitably have had to cooperate with and learn from each other. His research was supported by the National Science Foundation (NSF), the Leakey Foundation and the University of Iowa Orthodontics Department.

Rosetta Records Temperature, Observes Coma As It Nears Comet 67P

redOrbit Staff & Wire Reports – Your Universe Online
After a journey of more than six billion kilometers through the Solar System, the ESA’s Rosetta probe is closing in on comet 67P/Churyumov–Gerasimenko (67P), and with less than a week to go until its arrival, it recently managed to take the comet’s temperature for the first time and has captured images of a coma surrounding its nucleus.
Rosetta, which lifted off from a European spaceport in French Guiana in March 2004, has already journeyed around the Sun five times, picking up speed through a series of gravity-assisted swingbys around Earth and Mars in order to achieve an orbit similar to 67P’s. Its goal, the agency said, is to match the 55,000 km/h pace of the comet and travel alongside it.
Since early May, the probe’s controllers have been running it through a series of planned maneuvers designed to reduce its speed with respect to the comet by approximately 2800 km/h in order to ensure it arrives by August 6. In the meantime, however, Rosetta has been able to use its instruments to conduct a series of observations and measurements of its target, allowing astronomers to learn more about the unusual, rubber-duck shaped comet.
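For scale, the quoted speed change converts via simple unit arithmetic (the conversion is ours, not the ESA’s):

\[
\Delta v = 2800\ \mathrm{km/h} \times \frac{1000\ \mathrm{m/km}}{3600\ \mathrm{s/h}} \approx 778\ \mathrm{m/s}
\]

That budget is being delivered gradually, over months of thruster burns, rather than in a single maneuver.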
Using its infrared and thermal imaging spectrometer (VIRTIS), Rosetta conducted a series of observations between July 13 and July 21. Those observations, which came when the comet was roughly 555 million km from the Sun, revealed that the average surface temperature of 67P was approximately -70 degrees Celsius – up to 30 degrees warmer than predicted for an ice-covered comet at that distance, according to the ESA.
“This result is very interesting, since it gives us the first clues on the composition and physical properties of the comet’s surface,” VIRTIS principal investigator Dr. Fabrizio Capaccioni said in a statement Friday.
He explained that the temperature measurements confirm most of the comet’s surface will be dusty, because darker material warms and emits heat more readily than ice when exposed to sunlight. The findings do not “exclude the presence of patches of relatively clean ice, however,” he added, “and very soon, VIRTIS will be able to start generating maps showing the temperature of individual features.”
In addition, images taken by Rosetta’s onboard scientific imaging system OSIRIS on July 25 clearly show signs of an extended coma surrounding the nucleus of 67P, according to NASA. The image covers an area of 150 by 150 kilometers, said Luisa Lara from the Institute of Astrophysics in Andalusia, Spain, but scientists believe that the coma is actually much larger than that.
According to the ESA, OSIRIS previously detected a distinct rise in the comet’s activity which revealed a coma that spanned more than 1,000 km. While that activity ultimately died down, the new image confirms an extended coma close to the nucleus of 67P, where the particle densities are at their highest. The scientists predict that it extends much further into the comet’s surroundings.
Using the data collected by Rosetta on the satellite’s approach to the comet, the ESA said scientists have been able to learn more about how the comet behaves. Their analysis of this information will also allow them to determine whether or not there are “any localized spots of activity on the comet’s surface,” as well as how this will correlate to the apparently asymmetric nature of the coma and other aspects of its overall development.
OSIRIS, VIRTIS and the other instruments on Rosetta and its lander will “provide a thorough description of the surface physical properties and the gases in the comet’s coma,” as well as keep track of changing conditions as the comet travels around the sun over the next year, said project scientist Matt Taylor.
“With only a few days until we arrive at just 100 km distance from the comet, we are excited to start analyzing this fascinating little world in more and more detail,” he added. The team has been conducting engine burns on a weekly basis throughout July in order to slow Rosetta down to prepare for its approach, and two short orbit insertion burns remain – one which will take place on August 3, and another which will occur on August 6.
Image 2 (below):  OSIRIS wide angle camera view of 67P/C-G’s coma on July 25, 2014. Credit: ESA/Rosetta/MPS for OSIRIS Team MPS/UPD/LAM/IAA/SSO/INTA/UPM/DASP/IDA

Scientist Warns Of The Threat Of Inevitable Solar Super Storms

April Flowers for redOrbit.com – Your Universe Online
University of Bristol’s professor of Aerospace Engineering Ashley Dale cautions that “solar super-storms” are going to cause “catastrophic” and “long-lasting” impacts if we continue to ignore the threat of such storms.
Dale is a member of the international task force SolarMAX, which was formed to identify the risks of a solar storm and how humanity could minimize them. He believes that a particularly violent solar storm is inevitable. Such a storm would wreak havoc on our communications systems and power supplies, effectively crippling transport, sanitation and medicine services. The study findings were reported in this month’s issue of Physics World.
“Without power, people would struggle to fuel their cars at petrol stations, get money from cash dispensers or pay online. Water and sewage systems would be affected too, meaning that health epidemics in urbanized areas would quickly take a grip, with diseases we thought we had left behind centuries ago soon returning,” Dale said.
Solar storms are the result of violent eruptions on the surface of the Sun. These storms are accompanied by coronal mass ejections (CMEs), which are the most energetic events in our solar system. CMEs involve huge bubbles of plasma and magnetic fields bursting from the surface of the Sun into space.
Most CMEs follow behind a massive release of energy in the form of gamma rays, x-rays, protons and electrons, known as a solar flare.
When a CME of sufficient magnitude tears into the Earth’s magnetic field and rips it apart, it is considered to be a solar super-storm. The Carrington Event of 1859 was the largest solar super-storm on record, named after astronomer Richard Carrington who spotted the preceding solar flare.
The Carrington Event released about 10^22 kilojoules (kJ) of energy. By comparison, this is equivalent to 10 billion Hiroshima bombs exploding simultaneously. The CME released a trillion kilograms of charged particles that shot towards Earth at approximately 1800 miles per second. Luckily, our electronic infrastructure at the time consisted of about 1.2 million miles of telegraph wire, so the effect was relatively small.
NASA agrees with Dale that these types of events are inevitable. In fact, NASA predicts that the Earth is in the path of a Carrington-level event approximately every 150 years. This puts us about five years overdue, and NASA says there is a 12 percent chance that one will occur in the next decade.
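The relationship between those two figures is worth a back-of-envelope check. A minimal sketch, assuming storms arrive at a constant average rate of one per 150 years (a simplification; NASA’s published estimate was not derived this way):

```python
from math import exp

# If Carrington-level storms arrived as a constant-rate Poisson process
# averaging one event per 150 years, the probability of seeing at least
# one in the next decade would be:
rate = 1.0 / 150                  # assumed events per year
p_decade = 1.0 - exp(-10 * rate)  # P(at least one event in 10 years)
print(f"{p_decade:.1%}")          # ~6.4%

# The ~12 percent figure quoted above reportedly comes from analyses that
# fit heavy-tailed distributions to historical storm magnitudes, roughly
# doubling this naive constant-rate estimate.
```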
Because of the potential damage such a CME could cause to our electronic infrastructure now, SolarMAX’s 40 international members were tasked with identifying the best ways of limiting the potential damage of a solar super-storm. They met at the International Space University in Strasbourg, France, last year.
According to a sub-group of SolarMAX, space weather forecasting is our best solution. An array of 16 lunchbox-sized cube satellites could be put into orbit around the Sun to provide us with up to a week’s notice of solar events, telling us when, where and at what magnitude they might occur. This would provide adequate time to switch off vulnerable power lines, re-orient satellites, ground airplanes and start national recovery programs.
In contrast, Dale says the best solution is to design spacecraft and aircraft so that sensitive, on-board instruments are better protected. Such designs would include redistributing the existing internal architecture of the craft, putting sensitive payloads into areas where they would be surrounded by non-sensitive bulk material such as polyethylene, aluminum and water.
“As a species, we have never been more vulnerable to the volatile mood of our nearest star, but it is well within our ability, skill and expertise as humans to protect ourselves,” Dale concluded.

Genetic Biomarker Discovery Could Lead To Development Of Blood Test For Suicide

redOrbit Staff & Wire Reports – Your Universe Online
Newly discovered chemical changes in the body could ultimately lead to a blood test to help predict whether or not a person is at risk of committing suicide, according to new research appearing in the online edition of The American Journal of Psychiatry.
The discovery, which was made by scientists from the Johns Hopkins University School of Medicine Department of Psychiatry and Behavioral Sciences and the Johns Hopkins Bloomberg School of Public Health Department of Mental Health, centers around an alteration that occurs in a single human gene linked to stress reactions.
Changes occurring in this particular gene, which plays a role in how the brain responds to stress hormones, have a significant impact in determining if an individual has an unremarkable response to the strains of day-to-day life, or if those pressures become the catalyst for suicidal thoughts and actions. If confirmed in larger studies, this finding could provide doctors with a simple blood test to reliably predict whether or not a person is at risk of killing themselves.
“We have found a gene that we think could be really important for consistently identifying a range of behaviors from suicidal thoughts to attempts to completions,” Zachary Kaminsky, the lead study author and an assistant professor of psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine, said in a statement. “With a test like ours, we may be able to stem suicide rates by identifying those people and intervening early enough to head off a catastrophe.”
Kaminsky and his colleagues conducted a series of experiments focused on a genetic mutation in the SKA2 gene, explained Bloomberg reporter Angela Zimm. SKA2 is expressed in the brain’s prefrontal cortex and is involved in inhibiting negative thoughts and controlling impulsive behaviors.
By analyzing brain samples of both healthy and mentally ill individuals, the authors discovered that those who had died as a result of suicide had reduced levels of SKA2, as well as elevated levels of methyl chemicals. The cause was attributed to an epigenetic modification that altered the way the SKA2 gene functioned without changing its underlying DNA sequence – an alteration that added methyl chemical groups to the gene.
In another portion of the study, Kaminsky’s team analyzed three different sets of blood samples, including one involving more than 300 participants in the Johns Hopkins Center for Prevention Research Study. That investigation revealed similar methylation increases at SKA2 in individuals with suicidal thoughts or attempts.
Using this data, the research team developed a model which successfully predicted which participants had tried to kill themselves or were experiencing suicidal thoughts. The model correctly identified 80 percent of those men and women, and predicted more severe suicide risk with 90 percent accuracy, the researchers said.
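The paper’s actual statistical model is not reproduced here, but the general recipe – fit a classifier on a methylation marker plus covariates, then cross-validate its accuracy – can be sketched with entirely synthetic data (every variable name and number below is hypothetical):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 300
# Synthetic inputs: methylation fraction at a hypothetical marker site,
# plus a standardized stress/anxiety questionnaire score.
methylation = rng.uniform(0.0, 1.0, n)
anxiety = rng.normal(0.0, 1.0, n)
# Synthetic binary labels loosely coupled to the inputs, for demo only.
logit = 3.0 * methylation + 0.8 * anxiety - 2.0
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([methylation, anxiety])
scores = cross_val_score(LogisticRegression(), X, y, cv=5, scoring="accuracy")
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

Cross-validation matters in such work: reporting accuracy on the same samples used to fit the model would overstate how well it generalizes to new patients.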
Furthermore, in the youngest data set, blood test results were 96 percent accurate in determining if a participant had attempted suicide. Kaminsky believes that a diagnostic test based on these findings could be used to predict potential suicide attempts in those who are ill, to help decide how intense potential intervention strategies should be, and to help doctors and hospitals determine if some patients should not receive medications linked to suicidal thoughts.
“Suicide is a major preventable public health problem, but we have been stymied in our prevention efforts because we have no consistent way to predict those who are at increased risk of killing themselves,” said Kaminsky.

Hummingbird Wings Generate Lift More Efficiently Than The Best Micro-Helicopter Blades

redOrbit Staff & Wire Reports – Your Universe Online
Hummingbird wings are more efficient than even the highest-quality helicopter blades when it comes to generating lift, according to new research appearing in the current issue of the Journal of the Royal Society Interface.
However, experiments conducted by Stanford University professor David Lentink indicate that the gap between nature and human engineering is closing. While the best hummingbird was found to be over 20 percent more efficient than the world’s most advanced micro-helicopters, BBC News science reporter Victoria Gill indicates that average hummingbirds and helicopters are just about equal.
The experiment involved pinning the wings of 12 different hummingbird species, all of which were obtained from an existing museum collection, to a special device designed to test the aerodynamics of helicopter blades. Lentink and his colleagues used cameras to visualize airflow around the wings, and measured the drag and lift forces they exerted at various angles and speeds using sensitive load cells.
Afterwards, the study authors repeated the experiment using the blades from a ProxDynamics Black Hornet, which is said to be the most sophisticated type of autonomous micro-helicopter on the market today. The Black Hornet is roughly the same size as a hummingbird, and is being used by the UK military during its ongoing campaign in Afghanistan. Nevertheless, the study revealed that the cutting-edge technology came up short against the miracle of nature.
“Even spinning like a helicopter, rather than flapping, the hummingbird wings excelled: If hummingbirds were able to spin their wings to hover, it would cost them roughly half as much energy as flapping,” the university explained in a statement. “The microcopter’s wings kept pace with the middle-of-the-pack hummingbird wings, but the topflight wings – those of Anna’s hummingbird, a species common throughout the West Coast – were still about 27 percent more efficient than engineered blades.”
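To make “efficiency” concrete: classical momentum theory puts a hard floor on the power any wing or rotor needs to hover, and efficiency comparisons like these ask how close a real wing comes to that floor. A rough sketch with textbook physics and assumed (not measured) hummingbird numbers:

```python
from math import pi, sqrt

# Ideal induced power for hover, from actuator-disk momentum theory:
# a disk of area A supporting weight T needs at least
#     P_ideal = T**1.5 / sqrt(2 * rho * A)
rho = 1.225                  # air density at sea level, kg/m^3
mass = 0.004                 # assumed hummingbird mass, kg (~4 g)
wing_length = 0.055          # assumed wing length, m (taken as disk radius)

T = mass * 9.81              # weight the wings must support, N
A = pi * wing_length ** 2    # swept disk area, m^2
P_ideal = T ** 1.5 / sqrt(2 * rho * A)

print(f"ideal induced power: {P_ideal * 1e3:.0f} mW")  # ~51 mW
# Real wings always require more than this ideal minimum; the better the
# wing, the smaller the gap.
```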

Lentink said he was not surprised by the results, as previous research had indicated hummingbird wings were incredibly efficient. However, he was impressed with the helicopter’s performance. He added that he believed the experiment indicates that the rotor power of a microcopter could still be improved by up to 27 percent.
“A helicopter is really the most efficient hovering device that we can build. The best hummingbirds are still better, but I think it’s amazing that we’re getting closer,” he said. “It’s not easy to match their performance, but if we build better wings with better shapes, we might approximate hummingbirds.”
Hummingbirds are the only birds capable of sustained hovering, Gill said, and the experiment indicates that they use less power to overcome drag than the manmade blades of a similarly-sized helicopter. However, the professor told BBC News it also demonstrates that if the wing designs of drones are improved, they can be built to hover as efficiently as – if not more efficiently than – even the top hummingbird species.
“This shows that if we design the wings well, we can build drones that hover as efficiently, if not more efficiently, as hummingbirds,” said Lentink. “Clearly we are not even close to hummingbirds in many other design metrics, such as wind gust tolerance, visual flight control through clutter, to name a few. But if we focus on aerodynamic efficiency, we are closer than we perhaps ever imagined possible.”
The experiment also revealed that the muscles of a hummingbird produce 130 watts of power per kilogram, far more than the 100 watts per kilogram average managed by other birds and most other vertebrates. However, the study authors are still unclear how a hummingbird manages to maintain its flight when faced with strong wind gusts, how it navigates through things like branches, or how it can change direction so quickly.
Lentink said he also wants to study the aspect ratios of hummingbird wings, which at 3.9 are much lower than most wings used in aviation – including those of the Black Hornet (4.7). The professor said he wants “to understand if aspect ratio is special, and whether the amount of variation has an effect on performance.”

Homeland Security Warns Of New ‘Backoff’ Point-Of-Sale Malware

redOrbit Staff & Wire Reports – Your Universe Online
The US Department of Homeland Security (DHS) is warning telecommuting employees and independent contractors about a new type of malware that can be used to infiltrate retailer computer systems.
In a lengthy report issued Thursday, the DHS said the hackers force their way into administrator-level or restricted-access accounts through remote desktop software such as those offered by Microsoft, Apple and Google. Once they gain access, they then deploy the point-of-sale (PoS) malware program known as Backoff to extract consumer payment information using an encrypted POST request.
The DHS describes Backoff as a family of PoS malware that has been linked to at least three separate cyberattacks, according to CIO Today’s Shirley Siluk. It was first detected in October 2013, and several new versions have been identified since then. Furthermore, some variants of the malware are said to be largely undetectable by standard antivirus programs.
“It’s completely new malware. Nobody has seen it before,” Karl Sigler, a threat intelligence manager at Chicago-based computer security firm Trustwave (which assisted in the preparation of the report), told Siluk. Despite the previous difficulties in detecting Backoff, however, Sigler and his colleagues believe that the public release of the report could help antivirus companies develop new ways to protect against the threat.
According to Charlie Osborne of ZDNet.com, the 10-page document says that Backoff could not only harm businesses, but could also allow sensitive information such as customer names, addresses and credit card numbers to fall into the wrong hands – allowing hackers to use the information to make fraudulent purchases or commit identity theft.
“For limiting the risk of compromise with this malware, organizations should educate employees and provide an approved method for remote access. Companies should also perform network scans to see if systems have specific ports enabled to provide the remote access services, then follow up to turn off the service,” Joe Schumacher, security consultant at security and risk management consulting company Neohapsis, told Osborne.
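A minimal sketch of the kind of port inventory Schumacher describes – checking which hosts answer on common remote-desktop ports – might look like the following (the addresses are documentation-range placeholders; a real audit would use a dedicated scanner such as nmap):

```python
import socket

# Ports commonly used by remote desktop services of the sort the DHS
# report says attackers are brute-forcing.
REMOTE_ACCESS_PORTS = {3389: "RDP", 5900: "VNC", 5800: "VNC-over-HTTP"}

def check_host(host, timeout=1.0):
    """Return the remote-access ports that accept a TCP connection."""
    open_ports = []
    for port, name in REMOTE_ACCESS_PORTS.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means connect succeeded
                open_ports.append((port, name))
    return open_ports

if __name__ == "__main__":
    for host in ("192.0.2.10", "192.0.2.11"):  # placeholder addresses
        print(host, check_host(host) or "no remote-access ports open")
```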
Nicole Perlroth of the New York Times added that the report “provides insight into what retailers are up against as hackers find ways into computer networks without tripping security systems,” and should also serve as “a reminder that a typical network is more a sprawl of loosely connected computers than a walled fortress, providing plenty of vulnerabilities – and easily duped humans – for determined hackers.”
Perlroth added that Backoff and its variants perform four different functions: they scrape the memory of in-store payment systems to obtain data from credit and debit cards (including account numbers, expiration dates and PINs); they log keystrokes, such as when customers manually enter PINs, and transmit them back to the source computer; they install a backdoor into in-store payment machines; and they continually alter the program in order to add new functions or to evade detection by cybersecurity experts.
The DHS report does not name the victims of the attacks, citing the agency’s policy of not commenting on ongoing investigations. However, the New York Times reported that two individuals with knowledge of the investigations said that over a dozen retailers had been affected, including Target, PF Chang’s, Neiman Marcus, Michaels, Sally Beauty Supply and Goodwill Industries International, which was attacked just last month.

Hubble Telescope Locates Most Distant Lensing Galaxy Ever Discovered

redOrbit Staff & Wire Reports – Your Universe Online
A massive new elliptical galaxy unexpectedly discovered using NASA’s Hubble Space Telescope is the most distant lensing galaxy ever discovered, officials from the US space agency confirmed in a statement Thursday.
Lensing galaxies essentially act as a cosmic magnifying glass, and are large enough that their gravity can bend and distort light in such a way that they can make it easier to detect more distant objects in space. Light from the newly discovered galaxy takes a reported 9.6 billion years to reach us – 200 million years more than the previous record holder, according to Rachel Feltman of The Washington Post.
Furthermore, the object it is magnifying is a tiny spiral galaxy located some 10.7 billion light years away, and thanks to the discovery of the new lensing galaxy, we can currently observe that galaxy as it experiences a tremendous surge in star formation, Feltman added. The discovery of lensing galaxies such as this one will help astronomers learn more about how early-universe galaxies grow and become filled with dark matter as they age.
“When you look more than 9 billion years ago in the early universe, you don’t expect to find this type of galaxy lensing at all,” explained lead researcher Kim-Vy Tran of Texas A&M University. “It’s very difficult to see an alignment between two galaxies in the early universe.”
“Imagine holding a magnifying glass close to you and then moving it much farther away,” Tran added. “When you look through a magnifying glass held at arm’s length, the chances that you will see an enlarged object are high. But if you move the magnifying glass across the room, your chances of seeing the magnifying glass nearly perfectly aligned with another object beyond it diminishes.”
According to research team members Kenneth Wong and Sherry Suyu of Academia Sinica Institute of Astronomy & Astrophysics (ASIAA), this lensing alignment allowed them to measure the total mass of the gigantic galaxy, including its dark matter, by gauging the intensity of the lensing effect on the light of the background galaxy.
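The team’s full lens model is not spelled out here, but the textbook point-lens relation illustrates why the geometry of a lensed image measures the lens’s total mass:

\[
\theta_E = \sqrt{\frac{4GM}{c^{2}}\,\frac{D_{ls}}{D_{l}\,D_{s}}}
\]

Here \(\theta_E\) is the angular Einstein radius read off the image, and \(D_l\), \(D_s\) and \(D_{ls}\) are angular-diameter distances to the lens, to the source, and between the two. With the distances fixed by the measured redshifts, the observed \(\theta_E\) can be inverted for the total enclosed mass \(M\), dark matter included.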
The closer galaxy weighs approximately 180 billion times as much as our sun, making it a truly massive galaxy for its age. It is also one of the brightest members of a distant galactic cluster known as IRC 0218, they added. While these cosmic magnifying glasses are not particularly rare, Wong said that it is unusual to find one so far away.
“There are hundreds of lens galaxies that we know about, but almost all of them are relatively nearby, in cosmic terms,” he said. “To find a lens as far away as this one is a very special discovery because we can learn about the dark-matter content of galaxies in the distant past. By comparing our analysis of this lens galaxy to the more nearby lenses, we can start to understand how that dark-matter content has evolved over time.”
The discovery also marks another milestone for Hubble, whose instruments continue to operate effectively some five years after it was last serviced by a space shuttle, noted William Harwood of CBS News. Even though one of its six stabilizing gyroscopes has failed, NASA officials remain optimistic that it will continue to operate through 2020, which would allow it to work alongside its successor, the $8 billion James Webb Space Telescope.
John Grunsfeld, director of space science operations for the US space agency and one of the spacewalkers who helped service Hubble back in 2009, said that the telescope was “doing great,” and that if NASA can keep it operational until the launch of the Webb telescope, it would provide a “tremendous” research opportunity.

Tree Nuts Can Provide Modest Decreases In Blood Fats And Sugars

April Flowers for redOrbit.com – Your Universe Online
The University of Toronto and St. Michael’s Hospital led an international team of scientists in conducting two meta-analyses involving tree nuts. Examples of tree nuts include: almonds, Brazil nuts, cashews, hazelnuts, macadamias, pecans, pine nuts, pistachios and walnuts.
The first analysis, published online in BMJ Open, examined the effects of tree nuts on metabolic syndrome (MetS) criteria. The findings revealed that the consumption of tree nuts resulted in a significant decrease in triglycerides and fasting blood glucose.
The researchers, including Dr. John Sievenpiper from the Clinical Nutrition and Risk Factor Modification Center at St. Michael’s, say that two of the five risk factors for MetS are reduced by eating tree nuts. These factors raise the risk for heart disease and other health problems such as diabetes and stroke.
The research team screened 2,000 articles published in peer-reviewed journals to find 49 randomized controlled trials with 2,000 participants. The participants consumed approximately 50 grams, or 1.5 servings, per day. Most North Americans, according to Sievenpiper, consume less than one serving a day.
A person with MetS presents at least three of the following risk factors: low levels of “good” cholesterol; high triglycerides; high blood pressure; high blood sugar; and extra weight around the waist.
Triglycerides are a type of blood fat, while fasting glucose is a measure of blood sugar. Sievenpiper and his colleagues found the largest reductions in both markers when tree nuts replaced refined carbohydrates rather than saturated fats. Even though nuts are high in calories and have a high (unsaturated) fat content, the team found no adverse impact on the other factors for MetS.
“We found that tree nut consumption of about two ounces per day was found to decrease triglycerides significantly by ~0.06 mmol/L and to decrease fasting blood glucose significantly by ~0.08 mmol/L over an average follow-up of eight weeks,” stated Cyril Kendall, Ph.D., of the University of Toronto.
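For readers more used to the mg/dL units on US lab reports, those effect sizes convert with the standard clinical factors (the conversion is ours; the trial figures are as quoted):

```python
# Standard clinical unit conversions from mmol/L to mg/dL.
TG_MGDL_PER_MMOLL = 88.5   # triglycerides (average molar mass ~885 g/mol)
GLU_MGDL_PER_MMOLL = 18.0  # glucose (molar mass ~180 g/mol)

print(f"triglycerides:   -{0.06 * TG_MGDL_PER_MMOLL:.1f} mg/dL")   # ~5.3
print(f"fasting glucose: -{0.08 * GLU_MGDL_PER_MMOLL:.1f} mg/dL")  # ~1.4
```

Small absolute changes, in other words – consistent with the “modest decreases” of the headline.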
The second paper, published online in PLOS ONE, found that tree nut consumption also aids in lowering and stabilizing blood sugar levels in Type 2 diabetes sufferers when compared to those on a control diet.
The researchers examined 12 clinical trials with 450 participants. The data revealed that eating approximately two servings of tree nuts a day improved the two key markers of blood sugar: the HbA1c test, which measures blood sugar levels over three months, and the fasting glucose test, where patients are not allowed to eat or drink anything but water for eight hours before their blood glucose levels are tested.
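To put the HbA1c marker in more familiar terms, the ADAG study’s linear relation (Nathan et al., Diabetes Care, 2008) maps an HbA1c percentage to an estimated average glucose level – a standard published formula, not something from the tree nut trials themselves:

```python
def estimated_average_glucose(hba1c_percent):
    """Estimated average glucose in mg/dL (ADAG relation, Nathan et al. 2008)."""
    return 28.7 * hba1c_percent - 46.7

for a1c in (6.0, 7.0, 8.0):
    print(f"HbA1c {a1c}% -> ~{estimated_average_glucose(a1c):.0f} mg/dL")
# 6.0% -> ~126 mg/dL, 7.0% -> ~154 mg/dL, 8.0% -> ~183 mg/dL
```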
Again, Sievenpiper said that the best results were observed when tree nuts replaced refined carbohydrates rather than saturated fats.
In these trials, the participants consumed 56 grams, or two servings, of tree nuts a day. Again, no adverse effects, such as weight gain, were seen. “Tree nuts are another way people can maintain healthy blood sugar levels in the context of a healthy dietary pattern,” he said.
According to Dr. Kendall, “Both of our analyses indicate that daily tree nut consumption has an overall metabolic benefit and can improve risk factors for metabolic syndrome, and glycemic control in individuals with type 2 diabetes.”
“With MetS and diabetes on the rise worldwide, this is yet another reason to include tree nuts in your diet every day,” states Maureen Ternus, M.S., R.D., Executive Director of the International Tree Nut Council Nutrition Research & Education Foundation (INC NREF). INC NREF provided funding for the studies. “In 2003, FDA (in its qualified health claim for nuts and heart disease) recommended that people eat 1.5 ounces of nuts per day—well above current consumption levels. We need to encourage people—especially those at risk for MetS and those with diabetes—to get their handful of nuts every day.”

Kids With Autism And Sensory Processing Disorders Show Differences In Brain Wiring

University of California – San Francisco

UC San Francisco study builds on its groundbreaking research showing children with sensory processing disorders have measurable brain differences

Researchers at UC San Francisco have found that children with sensory processing disorders have decreased structural brain connections in specific sensory regions different than those in autism, further establishing SPD as a clinically important neurodevelopmental disorder.

The research, published in the journal PLOS ONE, is the first study to compare structural connectivity in the brains of children with an autism diagnosis versus those with an SPD diagnosis, and with a group of typically developing boys. This new research follows UC San Francisco’s groundbreaking study published in 2013 that was the first to find that boys affected with SPD have quantifiable regional differences in brain structure when compared to typically developing boys. This work showed a biological basis for the disease but prompted the question of how these differences compared with other neurodevelopmental disorders.

“With more than 1 percent of children in the U.S. diagnosed with an autism spectrum disorder, and reports of 5 to 16 percent of children having sensory processing difficulties, it’s essential we define the neural underpinnings of these conditions, and identify the areas they overlap and where they are very distinct,” said senior author Pratik Mukherjee, MD, PhD, a professor of radiology and biomedical imaging and bioengineering at UCSF.

SPD can be hard to pinpoint, as more than 90 percent of children with autism also are reported to have atypical sensory behaviors, and SPD has not been listed in the Diagnostic and Statistical Manual used by psychiatrists and psychologists.

“One of the most striking new findings is that the children with SPD show even greater brain disconnection than the kids with a full autism diagnosis in some sensory-based tracts,” said Elysa Marco, MD, a cognitive and behavioral child neurologist at UCSF Benioff Children’s Hospital San Francisco and the study’s corresponding author. “However, the children with autism, but not those with SPD, showed impairment in brain connections essential to the processing of facial emotion and memory.”

Children with SPD struggle with how to process stimulation, which can cause a wide range of symptoms including hypersensitivity to sound, sight and touch, poor fine motor skills and easy distractibility. Some SPD children cannot tolerate the sound of a vacuum, while others can’t hold a pencil or struggle with emotional regulation. Furthermore, a sound that is an irritant one day can be tolerated the next. The disease can be baffling for parents and has been a source of much controversy for clinicians who debate whether it constitutes its own disorder, according to the researchers.

“These kids, however, often don’t get supportive services at school or in the community because SPD is not yet a recognized condition,” said Marco. “We are starting to catch up with what parents already knew; sensory challenges are real and can be measured both in the lab and the real world. Our next challenge is to find the reason why children have SPD and move these findings from the lab to the clinic.”

In the study, researchers used an advanced form of MRI called diffusion tensor imaging (DTI), which measures the microscopic movement of water molecules within the brain in order to give information about the brain’s white matter tracts. The brain’s white matter forms the “wiring” that links different areas of the brain and is therefore essential for perceiving, thinking and action. DTI shows the direction of the white matter fibers and the integrity of the white matter, thereby mapping the structural connections between brain regions.
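The article does not name the specific DTI metric the team compared, but the standard single-number summary of white matter integrity in such studies is fractional anisotropy (FA), computed from the three eigenvalues of the diffusion tensor. A minimal sketch:

```python
import numpy as np

def fractional_anisotropy(evals):
    """FA from the three eigenvalues of a diffusion tensor.

    FA runs from 0 (water diffuses equally in all directions) to 1
    (diffusion confined to a single axis, as along a tight fiber bundle);
    lower FA along a tract is commonly read as weaker connectivity.
    """
    l1, l2, l3 = evals
    mean = (l1 + l2 + l3) / 3.0
    num = np.sqrt((l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2)
    den = np.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
    return float(np.sqrt(1.5) * num / den)

# Directional, axon-like diffusion vs. nearly isotropic diffusion:
print(fractional_anisotropy((1.7e-3, 0.3e-3, 0.3e-3)))  # ~0.80
print(fractional_anisotropy((1.0e-3, 0.9e-3, 0.9e-3)))  # ~0.06
```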

The study examined the structural connectivity of specific white matter tracts in 16 boys with SPD and 15 boys with autism between the ages of 8 and 12, and compared them with 23 typically developing boys of the same age range.

The researchers found that both the SPD and autism groups showed decreased connectivity in multiple parieto-occipital tracts, the areas that handle basic sensory information in the back area of the brain. However, only the autism cohort showed impairment in the inferior fronto-occipital fasciculi (IFOF), inferior longitudinal fasciculi (ILF), fusiform-amygdala and the fusiform-hippocampus tracts – critical tracts for social-emotional processing.

“One of the classic features of autism is decreased eye-to-eye gaze, and the decreased ability to read facial emotions,” said Marco. “The impairment in this specific brain connectivity not only differentiates the autism group from the SPD group, but also reflects the difficulties patients with autism have in the real world. In our work, the more these regions are disconnected, the more challenge they are having with social skills.”

Kids with isolated SPD showed less connectivity in the basic perception and integration tracts of the brain that serve as connections for the auditory, visual and somatosensory (tactile) systems involved in sensory processing.

“If we can start by measuring a child’s brain connectivity and seeing how it is playing out in a child’s functional ability, we can then use that measure as a metric for success in our interventions and see if the connectivities are changing based on our clinical interventions,” said Marco. “Larger studies to replicate this early work are clearly needed but we are encouraged that DTI can be a powerful clinical and research tool for understanding the basis for sensory neurodevelopmental differences.”

Dedicated Octopus Mother Keeps Watch Over Her Eggs For Over Four Years

redOrbit Staff & Wire Reports – Your Universe Online
A team of researchers has discovered the most dedicated mother in the entire animal kingdom: a deep-sea octopus that protected and tended to her eggs for a period of 4 1/2 years until her offspring finally hatched.
In research published Wednesday in the open-access journal PLOS ONE, the study authors explain that an octopus typically has one single reproductive period during its lifetime, and while females tend to keep watch over their eggs until they hatch, the process usually takes no more than one to three months for most shallow-water octopus species.
On the other hand, deep-water octopuses are another story, as experts know little about their egg-brooding practices. To learn more about these habits, Bruce Robison of the Monterey Bay Aquarium Research Institute (MBARI) and his colleagues used a remotely operated vehicle to monitor the Monterey Submarine Canyon off the coast of central California, where they discovered a deep-sea octopus on the seafloor at a depth of 4,600 feet in 2007.
“Over the next four and one-half years, the researchers dove at this same site 18 times,” MBARI explained in a statement. “Each time, they found the same octopus, which they could identify by her distinctive scars, in the same place.”
“As the years passed, her translucent eggs grew larger and the researchers could see young octopuses developing inside. Over the same period, the female gradually lost weight and her skin became loose and pale,” the Institute added. “The researchers never saw the female leave her eggs or eat anything. She did not even show interest in small crabs and shrimp that crawled or swam by, as long as they did not bother her eggs.”
The female octopus, Graneledone boreopacifica, spent the time clinging tightly to a vertical rock face near the canyon floor, keeping watch over her roughly 160 translucent eggs, Reuters reporter Will Dunham said. The creature progressively lost weight and its skin grew pale over the course of the 53-month observation period (May 2007 to September 2011).
“It’s extraordinary. It’s amazing. We’re still astonished ourselves by what we saw,” Robison told Dunham. “She was protecting her eggs from predators, and they are abundant. There are fish and crabs and all sorts of critters that would love to get in there and eat those eggs. So she was pushing them away when they approached her.”
“The first time that we… realized that she had gone up and laid a clutch of eggs, it was very exciting,” he added in an interview with BBC News. “Each time we went down it was more of a surprise, because we found her there again and again and again, past the point that anybody expected she’d persist. It got to be like a sports team we were rooting for. We wanted her to survive and to succeed.”
The creature’s amazing feat is a record egg-brooding period among all animal species, noted Fred Barbash of The Washington Post, and the 53-month span shatters the previous octopus record of just 14 months. The primary question remaining, the study authors wrote, is exactly how the mother was able to survive so long without food.
“The study team never witnessed the mother feeding, but they observed only about 18 hours of a 53-month brooding cycle. The scientists even offered her crab, but she didn’t take the bait,” said Amy West of National Geographic. “Robison surmised she might have occasionally eaten small crabs in defense of her eggs, a theory based on carcasses found close by. But one thing for sure is that Robison and his team found an invertebrate making the ultimate sacrifice to care for its young.”