New Mite Species Discovered In Caribbean Mesophotic Coral Ecosystem

Pensoft Publishers

During a recent survey of organisms collected from Bajo de Sico, a mesophotic coral reef ecosystem in Mona Passage off Puerto Rico, a pontarachnid mite species new to science was discovered. The new species was named after the famous Puerto Rican singer Jennifer Lopez. The study was published in the open-access journal ZooKeys.

“The reason behind the unusual choice of name for the new species,” explains the lead author Vladimir Pešić, Department of Biology, University of Montenegro, “is that J.Lo’s songs and videos kept the team in a continuous good mood when writing the manuscript and watching World Cup Soccer 2014.”

Pontarachnid mites represent a widely distributed but still poorly studied group of marine animals; nothing is known about their life cycle. The new mite species was collected from a depth of nearly 70 m, the greatest depth at which pontarachnid mites have been found to date.

Mesophotic coral ecosystems (MCEs) like Bajo de Sico, where the new species was found, are light-dependent habitats dominated by macroalgae, sponges and scleractinian corals, occurring on the insular and continental slopes of Caribbean islands at depths between 30 and 100 m. Even at the lower end of this depth range (70-100 m), there is enough light for photosynthesis to take place, enhancing the growth of several scleractinian coral species and algae.

The MCEs of Puerto Rico represent a potential biodiversity hotspot for marine arthropods.

Bothered By Hot Flashes? Acupuncture Might Be The Answer

The North American Menopause Society (NAMS)

New meta-analysis shows benefits of ancient Chinese method on today’s menopausal hot flashes

In the 2,500+ years that have passed since acupuncture was first used by the ancient Chinese, it has been applied to a number of physical, mental and emotional conditions, including nausea and vomiting, stroke rehabilitation, headaches, menstrual cramps, asthma, carpal tunnel syndrome, fibromyalgia and osteoarthritis, to name just a few. Now, a meta-analysis of randomized controlled trials being published this month in Menopause, the journal of The North American Menopause Society (NAMS), indicates that acupuncture can reduce both the severity and the frequency of hot flashes for women in natural menopause.

An extensive search of previous studies evaluating the effectiveness of acupuncture uncovered 104 relevant studies, of which 12, with a total of 869 participants, met the inclusion criteria for the current analysis. While the studies provided inconsistent findings on the effects of acupuncture on other menopause-related symptoms such as sleep problems, mood disturbances and sexual problems, they did conclude that acupuncture positively affected both the frequency and the severity of hot flashes.

Women experiencing natural menopause and aged between 40 and 60 years were included in the analysis, which evaluated the effects of various forms of acupuncture, including traditional Chinese medicine acupuncture (TCMA), acupressure, electroacupuncture, laser acupuncture and ear acupuncture.

Interestingly, the effects on hot flash frequency and severity did not appear to be linked to the number of treatment doses, the number of sessions or the duration of treatment. However, the findings showed that sham acupuncture could induce a treatment effect comparable to that of true acupuncture for the reduction of hot flash frequency. The effects on hot flashes were shown to be maintained for as long as three months.

Although the study stopped short of explaining the exact mechanism underlying the effects of acupuncture on hot flashes, one proposed theory holds that low concentrations of estrogen reduce levels of β-endorphin in the hypothalamus, and that these lower levels trigger the release of calcitonin gene-related peptide (CGRP), which affects thermoregulation; acupuncture may act on this pathway.

“More than anything, this review indicates that there is still much to be learned relative to the causes and treatments of menopausal hot flashes,” says NAMS executive director Margery Gass, MD. “The review suggests that acupuncture may be an effective alternative for reducing hot flashes, especially for those women seeking non-pharmacologic therapies.”

A recent review indicated that approximately half of women experiencing menopause-associated symptoms use complementary and alternative medicine therapy, instead of pharmacologic therapies, for managing their menopausal symptoms.

The article, “Effects of acupuncture on menopause-related symptoms and quality of life in women on natural menopause: a meta-analysis of randomized controlled trials”, will be published in the February 2015 print edition of Menopause. The meta-analysis was supported by grants from the Ministry of Science and Technology of Taiwan.


Babies’ Brains Rehearse Speech Mechanics Months Before Their First Words

Molly McElroy, University of Washington

Infants can tell the difference between the sounds of all languages until about 8 months of age, when their brains start to focus only on the sounds they hear around them. It’s been unclear how this transition occurs, but social interactions and caregivers’ use of an exaggerated “parentese” style of speech seem to help.

University of Washington research in 7- and 11-month-old infants shows that speech sounds stimulate areas of the brain that coordinate and plan motor movements for speech.

The study, published July 14 in the Proceedings of the National Academy of Sciences, suggests that baby brains start laying down the groundwork of how to form words long before they actually begin to speak, and this may affect the developmental transition.

“Most babies babble by 7 months, but don’t utter their first words until after their first birthdays,” said lead author Patricia Kuhl, who is the co-director of the UW’s Institute for Learning and Brain Sciences. “Finding activation in motor areas of the brain when infants are simply listening is significant, because it means the baby brain is engaged in trying to talk back right from the start and suggests that 7-month-olds’ brains are already trying to figure out how to make the right movements that will produce words.”

Kuhl and her research team believe this practice at motor planning contributes to the transition when infants become more sensitive to their native language.

The results emphasize the importance of talking to kids during social interactions even if they aren’t talking back yet.

“Hearing us talk exercises the action areas of infants’ brains, going beyond what we thought happens when we talk to them,” Kuhl said. “Infants’ brains are preparing them to act on the world by practicing how to speak before they actually say a word.”

In the experiment, infants sat in a brain scanner that measures brain activation through a noninvasive technique called magnetoencephalography. Nicknamed MEG, the brain scanner resembles an egg-shaped vintage hair dryer and is completely safe for infants. The Institute for Learning and Brain Sciences was the first in the world to use such a tool to study babies while they engaged in a task.

The 57 babies (one group aged 7 months, the other aged 11 to 12 months) each listened to a series of native and foreign-language syllables, such as “da” and “ta,” as researchers recorded their brain responses. They heard sounds in both English and Spanish.

The researchers observed brain activity in an auditory area of the brain called the superior temporal gyrus, as well as in Broca’s area and the cerebellum, cortical regions responsible for planning the motor movements required for producing speech.

This pattern of brain activation occurred for sounds in the 7-month-olds’ native language (English) as well as in a non-native language (Spanish), showing that at this early age infants are responding to all speech sounds, whether or not they have heard the sounds before.

In the older infants, brain activation was different. By 11-12 months, infants’ brains showed increased motor activation for non-native speech sounds relative to native speech, which the researchers interpret as a sign that it takes more effort for the baby brain to predict which movements create non-native speech. This reflects an effect of experience between 7 and 11 months, and suggests that activation in motor brain areas contributes to the transition in early speech perception.

The study has social implications, suggesting that the slow and exaggerated parentese speech – “Hiiiii! How are youuuuu?” – may actually prompt infants to try to synthesize utterances themselves and imitate what they heard, uttering something like “Ahhh bah bah baaah.”

“Parentese is very exaggerated, and when infants hear it, their brains may find it easier to model the motor movements necessary to speak,” Kuhl said.

Labeling People With Pre-diabetes Is Unhelpful And Unnecessary

University College London

Labeling people with moderately high blood sugar as pre-diabetic is a drastically premature measure with no medical value and huge financial and social costs, say researchers from UCL and the Mayo Clinic, Minnesota.

The analysis, published in the BMJ, considered whether a diagnosis of pre-diabetes carried any health benefits such as improved diabetes prevention. The authors showed that treatments to reduce blood sugar only delayed the onset of type 2 diabetes by a few years, and found no evidence of long-term health benefits.

Type 2 diabetes is typically diagnosed with a blood test that measures levels of hemoglobin A1c, which indicates average blood sugar level over the last three months. People with an A1c over 6.5% can be diagnosed with diabetes but the latest guidelines from the American Diabetes Association (ADA) define anyone with an A1c between 5.7% and 6.4% as having pre-diabetes.
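
As a concrete illustration of those cutoffs, here is a minimal sketch in Python. The threshold values are the ADA’s as described above; the function name and labels are purely illustrative, not a clinical tool.

    def classify_a1c(a1c_percent):
        # Bucket a hemoglobin A1c reading (in percent) using the ADA
        # cutoffs described above. Illustrative sketch only -- not a
        # diagnostic tool.
        if a1c_percent >= 6.5:
            return "diabetes range"
        elif a1c_percent >= 5.7:
            return "pre-diabetes range (ADA definition)"
        else:
            return "normal range"

    print(classify_a1c(6.0))  # prints: pre-diabetes range (ADA definition)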

If the ADA guidelines were adopted worldwide, a third of the UK adult population and more than half of adults in China would be diagnosed with pre-diabetes. The latest study questions the logic of putting a label on such huge sections of the population, as it could create significant burdens on healthcare systems without conferring any health benefits. Previous research has shown that type 2 diabetes treatments can do more harm than good for people with A1c levels around 6.5%, let alone people below this level.

Some 3.2 million people in the UK are currently diagnosed with type 2 diabetes, but approximately 16 million would fall into the ADA’s pre-diabetes category. A condition known as impaired glucose tolerance (IGT) affects around 3.7 million adults in the UK (8%). People with IGT are at high risk of diabetes, but the test for it is more time-consuming than a simple A1c blood test. There is evidence to suggest that interventions can delay the progression of IGT into diabetes, but the ADA category of pre-diabetes also includes another 12 million people who are at a much lower risk of progressing to diabetes, and for whom any benefit from treatment is unknown.

The World Health Organization (WHO) has stated that “use of ‘pre-diabetes’ is discouraged to avoid any stigma associated with the word diabetes and the fact that many people do not progress to diabetes as the term implies.” Guidance from the UK National Institute for Health and Care Excellence (NICE) broadly aligns with the WHO statement, looking to “move away from describing ‘pre-diabetes’ as a separate condition”. Among official authoritative organizations, then, the ADA is largely alone in using the term. Yet it has caught on in the global scientific literature, and because of ethnic differences in A1c levels it may be an even less valid category in other countries and demographic groups.

“Pre-diabetes is an artificial category with virtually zero clinical relevance,” says lead author John S Yudkin, Emeritus Professor of Medicine at UCL. “There is no proven benefit of giving diabetes treatment drugs to people in this category before they develop diabetes, particularly since many of them would not go on to develop diabetes anyway.

“Sensibly, the WHO and NICE and the International Diabetes Federation do not recognise pre-diabetes at present but I am concerned about the rising influence of the term. It has been used in many scientific papers across the world, and has been applied to a third of adults in the UK and half of those in China. We need to stop looking at this as a clinical problem with pharmaceutical solutions and focus on improving public health. The whole population would benefit from a more healthy diet and more physical activity, so it makes no sense to single out so many people and tell them that they have a disease.”

Previous studies have tested the effectiveness of giving people with IGT a drug called metformin, which is used to lower blood sugar in people with diabetes. The drug reduced the risk of developing diabetes by 31% over 2.8 years, probably by delaying its onset rather than by completely halting its development. But people who go on to develop diabetes are often treated with metformin anyway and there is no evidence of long-term benefits to starting the treatment early.

“The ADA recommends treating pre-diabetes with metformin, but the majority of people would receive absolutely no benefit,” explains Professor Yudkin. “There are significant financial, social and emotional costs involved with labelling and treating people in this way. And a range of newer and more expensive drugs are being explored as treatments for ‘pre-diabetes.’ The main beneficiaries of such recommendations would be the drug manufacturers, whose available market suddenly leaps to include significant swathes of the population. This is particularly true in emerging economies such as China and India, where regulating the healthcare market is a significant challenge.”

“Healthy diet and physical activity remain the best ways to prevent and to tackle diabetes,” says co-author Victor Montori, Professor of Medicine at the Mayo Clinic, Rochester, Minnesota, USA. “Unlike drugs they are associated with incredibly positive effects in other aspects of life. We need to keep making efforts to increase the overall health of the population, by measures involving public policy rather than by labelling large sub-sections of the population as having an illness. This is not a problem to be solved at the bedside or in the doctor’s surgery, but rather by communities committed to the health of their citizens.”

Link Between Parkinson’s Disease And Creativity: Study

Tel Aviv University

New TAU study confirms creative energy in Parkinson’s sufferers is greater than in healthy individuals

Prof. Rivka Inzelberg of Tel Aviv University’s Sackler Faculty of Medicine and the Sagol Neuroscience Center at Sheba Medical Center, Tel Hashomer, documented the exceptional creativity of Parkinson’s patients two years ago in a review for Behavioral Neuroscience. Since then, she has conducted the first empirical study to verify a link between Parkinson’s disease and artistic inclination.

That empirical study, now published in the Annals of Neurology, definitively demonstrates that Parkinson’s patients are more creative than their healthy peers, and that those patients taking higher doses of medication are more artistic than their less-medicated counterparts.

“It began with my observation that Parkinson’s patients have a special interest in art and have creative hobbies incompatible with their physical limitations,” said Prof. Inzelberg. “In my last paper, I reviewed case studies from around the world and found them to be consistent. In my present research, we conducted the first comprehensive study to measure the creative thinking of Parkinson’s patients. This was not a simple task, because how does one measure, or quantify, creativity? We had to think creatively ourselves.”

Measuring artistic creativity

Prof. Inzelberg and a team of researchers from TAU, the Sheba Medical Center, and Bar-Ilan University conducted a full battery of tests on 27 Parkinson’s patients treated with anti-Parkinson’s drugs and 27 age- and education-matched healthy controls. Some of the tests were well-known and others newly adapted for the purpose of the study. The tests included the Verbal Fluency exam, in which a person is asked to mention as many different words beginning with a certain letter and in a certain category (fruit, for example) as possible.

The participants were then asked to undergo a more challenging Remote Association Test, in which they had to name a fourth word (following three given words) within a fixed context. The groups also took the Tel Aviv University Creativity Test, which tested their interpretation of abstract images and assessed the imagination inherent in answers to questions like “What can you do with sandals?” The final exam was a version of the Test for a Novel Metaphor, adapted specifically for the study.

Throughout the testing, Parkinson’s patients offered more original answers and more thoughtful interpretations than their healthy counterparts.

In order to rule out the possibility that the creative process evident in the hobbies of patients was linked to obsessive compulsions like gambling and hoarding, to which many Parkinson’s patients fall prey, participants were also asked to fill out an extensive questionnaire. An analysis indicated no correlation between compulsive behavior and elevated creativity.

Express yourself

The conclusions from the second round of testing — in which the Parkinson’s participants were split into higher- and lower-medicated groups — also demonstrated a clear link between medication and creativity. Parkinson’s patients suffer from a lack of dopamine, which is associated with tremors and poor coordination. As such, they are usually treated with either synthetic precursors of dopamine or dopamine receptor agonists.

According to Prof. Inzelberg, the results are hardly surprising, because dopamine and artistry have long been connected. “We know that Van Gogh had psychotic spells, in which high levels of dopamine are secreted in the brain, and he was able to paint masterpieces during these spells — so we know there is a strong relationship between creativity and dopamine,” said Prof. Inzelberg.

Prof. Inzelberg hopes her research will be instrumental in spreading awareness. Parkinson’s patients often feel isolated by their physical limitations, so artistic work could provide a welcome outlet of expression. “After my first paper, I helped organize exhibits of patients’ paintings in Herzliya and Raanana and received feedback about similar exhibits in Canada and France,” said Prof. Inzelberg. “These exhibits were useful in raising funds for Parkinson’s research, providing occupational therapy for patients — and, most importantly, offering an opportunity for patients to fully express themselves.”

Prof. Inzelberg is currently researching additional forms of creativity in Parkinson’s patients.


How to effectively manage your time with fibromyalgia

Time management can be a challenge for everyone, let alone for people who suffer from chronic pain. When you’re a fibromyalgia sufferer, though, managing your time effectively becomes even more important, given the limitations imposed by your condition. If you weren’t suffering from fibromyalgia, you might be able to get by with the standard “To Do List” or a Day-Timer. People with FMS, however, need to be both more flexible and more deliberate with their scheduling.

This is because a “normal” person can use time like a commodity: a staple that, used effectively, is plentiful enough to accomplish what we want. For a person with FMS, however, time is a luxury. It can’t be spent as if there were an abundance of it; you simply don’t have as much of it as you need, so the time that you do have must be used effectively. The first and most important step is to make sure that you write everything down.

First, writing things down makes them feel official and makes it more likely that you’ll prioritize them appropriately and get them done. Start a Task Notebook in which you write down every task that you think of. Then keep a Daily Task list of ten items that you want to accomplish that day. Only two of these ten tasks should be difficult; the rest should be things that you can feel confident about getting done. Finally, you’ll need to start an agenda as well, so that you can schedule and keep track of the details of how you’ll get these things done.

This is just the amount of writing and record-keeping that everyone, including people who are completely healthy, needs to do to stay organized. In addition, many people with FMS suffer from a sort of “brain fog” and so have difficulty remembering things. Once a thing is written down, you aren’t chancing that it’ll get lost in the fog. Whether or not it’s a major priority, it’s best to get it written down so it won’t be forgotten.

Your Task Notebook also provides the major payoff for all that writing: a place to cross tasks off, which is a very satisfying feeling.

Whatever style of notebook you choose, it should be something that you can keep close as you go through the day. When you think of something you want to get done, the last thing you want is to have to go searching for your notebook.

When you complete an item, cross it off in your agenda. This includes things that you delegated to someone else: they still got done, and you were an integral part of making that happen, so you can cross them off. Anything still on the list at the end of the day gets moved to the page for the next day.

Besides writing everything down, there are several tips to follow for effective time management:

Celebrate Your Capabilities

Make a record of the things that you can do, and concentrate your efforts on them. Avoid dwelling on the things that you can’t do; your symptoms will make sure to remind you of your limitations. Making a list of the things that you are capable of will put you in the right mindset to stay motivated for the rest of your time management regimen.

Accept Your Limitations

Conversely, you also need to be aware of the things that you can’t do. There will, no doubt, be tasks that you would very much like to be a part of, but trying to go beyond your abilities will set you up for failure and the bad feelings that come with it. If you’re unable to convert your goodwill and passion for a particular task into physical results, it will cause you a great deal of stress. Not only might you let yourself down, but you set yourself up for the guilt you can feel over letting others down.

Routine

Much of this is going to seem very regimented, and for that reason, might put some people off. However, FMS sufferers can use regimented routines to keep themselves on track. Having a plan for every day is an important way to make sure that days don’t get away from you before you’ve had a chance to accomplish something with them.

Determine Your Prime Times

This will also affect your schedule. Almost everyone is either a “morning person” or a “night owl,” which simply means that everyone has times of day when they are more productive. For folks who suffer from FMS, however, this means more than just feeling alert or not.

There are going to be times of day when your symptoms are worse or better, and only experience will tell you when those times are. Once you’ve learned what times those are, you will be able to work with them to improve your schedule and make it more likely that you can accomplish the things you intend. At times of day when pain and other symptoms are overwhelming, it will require more effort on your part to get things done; but there will also be times of day that you know from experience will be better for certain tasks.

Be Flexible

The temptation, once everything is written down, is never to let yourself get off schedule. However, the reality of coping with fibromyalgia, or any other disabling condition, is that there will be days when you simply cannot accomplish what you’d hoped to do. It’s not procrastination to recognize when you’re just not physically up to performing a particular task. Keep track of tasks that you are unable to do, and then move on to another task.

Rules

It’s also helpful to make yourself some rules to go by. These rules might be designed to manage symptoms effectively, codifying limitations you know you have even when they don’t currently seem to be a problem, or they might be rules that you use to deal with certain situations. You might have rules that tell you when to rest or when it’s time to get off the computer. Either way, rules are an effective aid to scheduling.

Learn to Say “No”

You need to be able to let go of guilt about things you can’t do. Many people overextend themselves, but for people with FMS it is especially important to appreciate the limitations caused by your symptoms. Oddly, the syndrome seems to be most common in people with type A personalities, which can make it even more difficult for them to say “no.”

It’s also important to recognize that you don’t need to feel guilty, provide an excuse or get defensive. Just be matter-of-fact about it. Other people may not understand what it’s like to live with pain, and there’s no point in letting it affect you or your decisions.

Learn to Delegate

Like saying no, delegating is a skill that can be difficult to put into practice. However, it’s important that you distribute your work instead of just getting discouraged because you can’t do everything anymore. If you want to assuage any guilt you might feel about it, barter chores for favors that you can do for others. For example, if baking is your thing, it’s easy enough to trade baking for labor: bake cookies for a neighbor’s kid in exchange for taking the garbage out for you.

Establish Your Priorities

You need to accept that you can’t “do it all.” Take comfort in the fact that nobody else can either, including people who don’t suffer from anything more serious than chronic procrastination. Realize, too, that you might not be as productive as those people.

Try to use as much of the limited amount of energy that you have to do things that will give you joy. So go back to the list that you made at the beginning to look again at what capabilities you want to celebrate. Tasks that involve those capabilities and skills will not only have a better chance of getting done without frustration, but they will also provide you with a greater sense of satisfaction when they’re completed.

As we’ve already pointed out, when you are suffering from FMS, time is no longer a commodity but a luxury, and you need to think carefully about how you plan to use it. You are no longer in a situation where you can simply try to avoid procrastination and kick yourself about it when you don’t. You need to use your time effectively, in ways that give you a lift and bring you joy.

Further Reading

“Managing Time/Energy with Fibromyalgia & Chronic Fatigue Syndrome,” by Adrienne Dellwo. About.com. http://chronicfatigue.about.com/b/2013/10/21/managing-timeenergy-with-fibromyalgia-chronic-fatigue-syndrome.htm.

“Routine, Reminders, Rules.” Treating Chronic Fatigue Syndrome and Fibromyalgia: An Integrated Approach. http://www.treatcfsfm.org/submnu-Routine,-Reminders,-Rules-103.html.

“Time Management for Those with Chronic Fatigue Syndrome, Fibromyalgia, & Other Disabling Conditions,” by Pamela Rice Hahn. http://www.chronic-illness.org/blog/time-management-for-those-with-disabilities.

The History of Mobile Phone Technology

Smartphones and feature phones are as common now as traditional landline phones were for decades. These handheld devices are so popular that many homes now only use mobile phones, increasingly pushing landline devices into the obsolete category. But while the popularity of mobile connectivity is vast today, it is still a very young technology when compared to its landline counterparts, which have been in existence since the mid-1800s.

To be clear, the history of the mobile phone focuses on devices that connect wirelessly to the public switched telephone network. And while it is a fairly young technology, its history can in fact be traced back more than a hundred years. Yet, the first mobile phones were barely portable compared to today’s definition of the term mobile technology.

Before the first truly mobile phones existed, there were some precursors to this technology that deserve mention, as they undoubtedly led to the rise of the mobile phone as we know it today.

EARLY ERA

The first claim of a wireless telephone came in 1908, when Prof. Albert Jahnke and the Oakland Transcontinental Aerial Telephone and Power Company said they had developed such a device. They were quickly accused of fraud, however, and while the charge was later dropped, the device never went into production.

Ten years later, in 1918, the German railroad system tested its own wireless telephone system on military trains between Berlin and Zossen. By 1924, public trials of telephone connections on trains between Berlin and Hamburg had begun. A year later, in 1925, Zugtelephonie A.G. was founded to supply train telephony equipment, and the first telephone systems were approved for use in postal and other trains by 1926.

By World War II, radio telephony was being implemented for military use, with hand-held radio transceivers available from the 1940s. The first mobile telephones for automobiles also came out in the 1940s. These early devices, however, were bulky and heavy and consumed a lot of power, and the network supporting them could carry only a few simultaneous conversations.

THE CAR PHONE

The first truly mobile phone service arrived on June 17, 1946, from Bell Labs, which developed mobile phones that allowed users to place and receive calls from their automobiles. Shortly thereafter, AT&T offered the first Mobile Telephone Service, but the technology was primitive, offering only limited coverage and a few available channels in urban regions.

The later development of cellular technology would catapult mobile telephony into a new era, allowing for widespread adoption of mobile phones. This era was predicted by Arthur C. Clarke in a 1959 essay, in which he envisioned a world where every person could make calls with their very own personal transceiver. In the essay, Clarke wrote: “The time will come when we will be able to call a person anywhere on Earth merely by dialing a number.” His vision also included a means of global positioning that would ensure that “no one need ever again be lost.” He later predicted that such a device would arrive in the 1980s.

Russian engineer Leonid Kupriyanovich developed a number of experimental mobile phone models between 1957 and 1961; according to Soviet media sources of the time, the last of the designs weighed less than three ounces and could fit easily in the palm of one’s hand.

The first fully automated vehicular mobile phone system was unveiled in Sweden in 1956. Named MTA (Mobiltelefonisystem A), it allowed calls to be made and received using a rotary dial. These early car phones were primitive and consisted of bulky vacuum tubes and relays, weighing as much as 90 pounds. In 1962, MTB was unveiled. This system used push-button calling and improved operational reliability with transistors and DTMF signaling.

Other car phone systems were implemented around this time as well. The USSR introduced its first mobile phone for motorists in 1958, and the first US car phones were introduced in 1959.

In 1965, a mobile automatic phone was unveiled by the Bulgarian firm Radioelektronika, which presented the device at the Inforga-65 international exhibition in Moscow. The phone was based on the system developed by Kupriyanovich; it worked with a base station, and one base station could provide service for up to 15 customers.

AT&T introduced the first major improvements to mobile telephony in 1965, giving its nearly two-decade-old MTS system a new and improved name: Improved Mobile Telephone Service (IMTS). The new system utilized additional radio channels, allowing more simultaneous calls in a given geographical region; introduced customer dialing, eliminating the need for an operator to place each call manually; and reduced the size and weight of the user equipment. Despite the capacity improvement, demand outpaced supply, and AT&T signed an agreement with state regulators to limit its service to just 40,000 customers. In New York City, 2,000 customers had to share just 12 radio channels and waited, on average, 30 minutes to place a call.

Some improvements came in the late 1960s, when independent telephone companies introduced Radio Common Carrier (RCC) service, which paired UHF and VHF frequencies near those used by IMTS to improve capacity. This service was provided until the 1980s, when cellular AMPS systems made RCC equipment obsolete.

Another expanding technology in phone mobility was the emergence of the satellite phone, which was first developed in 1979 for use on the high seas. That technology is now also useful on land in areas that are out of reach from landline, cellular and marine VHF radio stations. An updated Iridium satellite constellation system was set up beginning in 1998, and while the company behind the system went bankrupt, satellite telephony under that system is still available today.

THE ADVENT OF HANDHELD AND CELLPHONES

Prior to 1973, mobile telephony was limited to phones installed in cars, trains and other vehicles. Motorola was the first company to produce a handheld mobile phone.

The first mobile phone call from a handheld device was made on April 3, 1973, by Martin Cooper, a research executive at Motorola, to Joel S. Engel of Bell Labs. That first handheld phone was 10 inches long and weighed about 2.5 pounds; the prototype offered 30 minutes of talk time on a single charge and took 10 hours to recharge. John F. Mitchell, Motorola’s chief of portable communications, successfully pushed Cooper and the Motorola team to develop wireless communication products that would become smaller and lighter and could be used anywhere. The Motorola team was also instrumental in the development and design of the cellular phone.

The first automatic analog cellular system was deployed in Tokyo in 1979, later spreading throughout Japan, and a similar system was available throughout the Nordic countries by 1981. In North America, the first analog cellular system, the Advanced Mobile Phone System (AMPS), was rolled out in October 1983; it was deployed in Israel in 1986 and Australia in 1987. While AMPS was much more advanced than the earlier technology, it was unencrypted and vulnerable to eavesdropping, was susceptible to cell phone cloning, and required significant amounts of wireless spectrum to support.

NEXT GENERATION TECHNOLOGY

On March 6, 1983, Motorola’s DynaTAC phone launched on the first US 1G (first generation) network, operated by Ameritech. The phone cost $100 million to develop and took a decade to reach the market; talk time was limited to 30 minutes, and the phone took 10 hours to charge. Despite the obvious issues, the system was in high demand, with waiting lists in the thousands.

The 2G network emerged in the 1990s with two systems competing for supremacy: the European GSM standard and the US-developed CDMA standard. The 2G network differed from the previous generation by using digital instead of analog transmissions and also fast out-of-band signaling. Mobile phone usage exploded in the 1990s with the onset of 2G connectivity. The advent of prepaid mobile phones also emerged during this era.

The 2G network also saw the emergence of a new variant of communication: SMS text messaging. The technology was at first available only on GSM networks but eventually spread to all digital networks. The first machine-generated SMS message was sent in the UK on December 3, 1992, followed in 1993 by the first person-to-person SMS, sent in Finland. The emergence of prepaid services helped move telephony into the hands of young people, and eventually to people of all ages.

2G also introduced the ability to access media content on mobile phones: the first downloadable content sold to mobile phones came in 1998, in the form of ringtones. The first advertising on mobile phones followed shortly after, in 2000, with an SMS-based headline service.

The first mobile payments were also trialed in 1998, in Finland and Sweden, where mobile phones were used to pay at a Coca-Cola vending machine and for car parking. The first commercial payment system to mimic credit cards was launched in the Philippines in 1999, the same year the first mobile Internet service was launched in Japan.

As the 2G revolution became more widespread and people of all ages began to use phones in their daily lives, it became evident that demand for data was growing, and with it the demand for faster data transfers. Since 2G was not capable of handling this new era, 3G technology was unveiled in Japan in May 2001 to take on the task. The main technological difference between 2G and 3G was the use of packet switching in 3G rather than the circuit switching of 2G, which helped immensely with data transfer. With this, 3G offered data speeds of up to 2 Mbps.

With the advent of 3G, competition in the market heated up with numerous companies vying for 3G space. The 2G CDMA networks were capable of transitioning to 3G to help ease some demand. With the high connection speeds of 3G technology, a transformation of the industry was at hand. For the first time, media streaming of radio and television content to 3G handsets became possible, with RealNetworks and Disney being among the early pioneers of 3G media streaming.

In the mid-2000s, 3G technology evolved into an enhanced high-speed network, marketed under newly coined labels such as 3.5G, 3G+ and turbo 3G, with data transfer speeds climbing as high as 14 Mbps.

With the ever-increasing speeds of 3G, it was clear that the days of the traditional mobile phone/feature phone would be coming to an end. In 2007, Apple released its flagship iPhone, touted as one of the first mobile phones to use a multi-touch interface. With the concept of multi-touch displays, traditional keypads and keyboards would soon become obsolete and onscreen keypads would be the new norm.

By 2009, it was clear that, at some point, 3G networks would be overwhelmed by the growth of bandwidth-intensive applications. The industry soon focused on implementing 4G technologies, with the promise of delivering speed improvements up to 10-fold over existing 3G networks. The first 4G technologies were introduced in the US (WiMAX) and Scandinavia (LTE).

One of the main ways 4G differed from 3G was in the elimination of circuit switching in favor of an all-IP network. With this, 4G ushered in an era in which voice calls are treated like any other streaming audio, carried over internet, LAN or WAN networks via VoIP.

SMARTPHONES

With the onset of the 3G and 4G markets, smartphones have begun to quickly outpace feature phones in sales. While feature phones still command a large percentage of the market, smartphones with multi-touch displays and high-speed broadband connections are becoming the mainstay of modern mobility.

In the 1950s, mobile phones allowed a user to make a simple phone call on a bulky and heavy device that wasn’t truly mobile. Today, mobile technology allows users to connect nearly every aspect of their life via their smartphone, most notably with the endless barrage of mobile applications offering everything from gaming to tracking fitness and listening to music to watching movies and TV.

Today, the smartphone market is overrun with numerous companies and operating systems. Apple, Samsung, HTC and LG are some of the more notable smartphone makers vying for dominance in the market, with Apple leading the way with its iPhones. As for operating systems, Google’s Android OS is way ahead of the curve, outpacing Apple’s iOS, Microsoft’s Windows Phone OS and others in the field.

Image Caption: (left) The evolution of mobile phones (left to right: Motorola 8900X-2, Nokia 2146 orange 5.1, Nokia 3210, Nokia 3510, Nokia 6210, Ericsson T39, HTC Typhoon) Credit: Anders/Wikipedia (public domain)

Image Caption: (right) A generic smart phone model. Credit: maxkabakov/Thinkstock.com

Element Commonly Used In Apple iPads Causing Allergic Reactions

Brett Smith for redOrbit.com – Your Universe Online

While the late Steve Jobs was a notorious stickler for design details, one detail of the iPad may have gotten by the mercurial tech guru: the element nickel, which is known to cause allergic reactions in many people.

According to a new report in the journal Pediatrics published on Monday, an 11-year-old boy was recently treated at a San Diego hospital for an allergic reaction to an Apple iPad. The report is just one in a series of cases linked to nickel in tech gadgets.

Report author Dr. Sharon Jacob, a dermatologist at Rady Children’s Hospital in California, told The Associated Press that allergic reactions to nickel aren’t life-threatening, but they can be extremely uncomfortable and may require treatment if skin reactions become infected.

The allergic reaction in the Pediatrics case initially produced scaly patches on the boy’s skin, but later led to a different kind of reaction that erupted all over his body and didn’t respond to the usual treatment. The child’s condition was eventually determined to be a reaction to nickel and was traced to a family iPad purchased in 2010.

“He used the iPad daily,” Jacob said, adding that placing a protective case around the device led to an improvement in the child’s symptoms.

Apple spokesman Chris Gaither told the AP that the company had no comment when asked whether nickel is used in all iPad models.

According to an advisory about cellphones posted online by the Toronto-based Nickel Institute, the risk of a reaction to nickel comes from contact with the outer surfaces of tech devices “over prolonged periods of time.”

“The length of time required to elicit an allergic reaction will vary from 5 or 10 minutes to never, depending on the sensitivity of the individual,” the advisory said.

From zippers to eyeglasses, nickel is found in many everyday items, and Jacob said nickel allergies are becoming either more common or better recognized. She pointed to national data indicating that about 1 in 4 children who get skin tests for allergies have nickel allergies, up from about 17 percent a decade ago.

Recent research has lent support to the so-called “hygiene hypothesis,” which holds that children are more allergic than ever because they grow up in ultra-sanitary conditions. A report published back in June found that infants who are exposed to rodent and pet dander, roach allergens and household bacteria within their first year of life are less likely to develop allergies.

3D-Printed Anatomy May Replace Cadavers For Medical Training

Alan McStravick for redOrbit.com – Your Universe Online

Three-dimensional printing may soon replace human cadavers in the study of gross human anatomy in the world’s medical schools.

The practice of learning on medical cadavers is a rite of passage in medical education, instrumental in teaching students the complex inner workings of the muscular, skeletal and circulatory systems of the human body. For their supply of study cadavers, medical schools rely on individuals who generously bequeath their bodies to the advancement of science.

Despite the obvious benefits of cadaver use in the education process, some medical schools are unable or unwilling to engage in the practice, either because they lack the facilities for the cost-prohibitive process of properly storing bodies or because the school is in a region that frowns on the practice for cultural reasons. Only a couple of years ago, these issues were insurmountable. Today, however, the steady march of technology offers an alternative for the education of future physicians.

Experts from Australia’s Monash University have developed a new training kit that consists of anatomical body parts created by the relatively new technique of 3D printing. This innovation will likely find excellent practical use in the regions described above, as well as in medical schools currently employing human cadavers in their courses of study.

Called the ‘3D Printed Anatomy Series,’ it is believed to be the first commercially available resource of its kind. Absolutely no human tissue is used in its production, yet the kit contains all the major parts of the body required to teach anatomy of the limbs, chest, abdomen, head and neck.

Aside from taking the place of the cadaver for intricate and intensive study, Professor Paul McMenamin, director of the university’s Center for Human Anatomy Education, sees the new kit as a learning aid for both medical trainees and other health professionals. Additionally, use of the kit could contribute significantly to the development of new surgical treatments.

“For centuries, cadavers bequeathed to medical schools have been used to teach students about human anatomy, a practice that continues today. However, many medical schools report either a shortage of cadavers, or find their handling and storage too expensive as a result of strict regulations governing where cadavers can be dissected,” McMenamin explained.

Continuing, he stated, “Without the ability to look inside the body and see the muscles, tendons, ligaments and blood vessels, it’s incredibly hard for students to understand human anatomy. We believe our version, which looks just like the real thing, will make a huge difference.”

The ‘3D Printed Anatomy Series’, created with the use of both CT and surface laser scanning, reproduces body structures in either a plaster-like powder or in plastic. Plastic printing allows a higher resolution than plaster, and it also allows accurate reproduction of the colors encountered in an actual cadaver.

“Radiographic imaging, such as CT, is a really sophisticated means of capturing information in very thin layers, almost like the pages of a book,” McMenamin noted. “By taking this data and making a 3D rendered model, we can then color that model and convert that to a file format that the 3D printer uses to recreate, layer by layer, a three-dimensional body part to scale.”

The Monash University team, currently in negotiations with potential commercial partners, believes the ‘3D Printed Anatomy Series’ will go on sale later this year. The team is excited by what they believe to be several benefits their new product offers over the traditional cadaver.

“Even when cadavers are available, they’re often in short supply, are expensive and they can smell a bit unpleasant because of the embalming process,” McMenamin stated. “As a result, some people don’t feel comfortable working with them.”

McMenamin concluded, “Our 3D printed series can be produced quickly and easily, and unlike cadavers they won’t deteriorate – so they are a cost-effective option too.”

McMenamin and his team have produced and published further details on their new anatomical kit in the online version of the journal Anatomical Sciences Education.


A High-Fat Meal Can Decrease Metabolism After A Stressful Event

Brett Smith for redOrbit.com – Your Universe Online

Feeling stressed? If you are, it could cause you to gain weight down the road.

According to a new study in the journal Biological Psychiatry, going through one or more stressful events the day prior to eating a single high-fat meal can decrease the body’s metabolism and possibly lead to weight gain.

In the study, a team of scientists surveyed 58 women on the previous day’s stresses before giving them a meal with 930 calories and 60 grams of fat. The scientists then assessed their metabolic rate via their respiration and measured blood sugar, insulin, triglycerides and the stress-related hormone cortisol.

On average, women who reported more than one stressor through the previous day burned over 100 fewer calories than non-stressed women within the seven hours that passed after consuming the high-fat meal, a change that could lead to weight gain of almost 11 pounds a year.
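
The arithmetic behind that projection is easy to check. Below is a rough back-of-the-envelope sketch in Python; it assumes the conventional approximation of about 3,500 calories per pound of body fat, an illustrative assumption of ours rather than a figure reported by the study.

    # Back-of-the-envelope check of the "almost 11 pounds a year" projection.
    daily_shortfall_kcal = 104        # illustrative stand-in for "over 100 fewer calories" per day
    kcal_per_pound_fat = 3500         # conventional approximation (assumption, not from the study)
    pounds_per_year = daily_shortfall_kcal * 365 / kcal_per_pound_fat
    print(round(pounds_per_year, 1))  # prints: 10.8, i.e. almost 11 pounds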

The stressed women also had greater levels of insulin, which plays a role in the storage of fat, and less fat oxidation, the transformation of large fat molecules into smaller molecules that can be utilized as fuel.

“This means that, over time, stressors could lead to weight gain,” said study leader Jan Kiecolt-Glaser, professor of psychiatry and psychology at The Ohio State University. “We know from other data that we’re more likely to eat the wrong foods when we’re stressed, and our data say that when we eat the wrong foods, weight gain becomes more likely because we are burning fewer calories.”

Conducted primarily at Ohio State’s Clinical Research Center, the study included women with an average age of 53. During their initial visit, women were surveyed and given three standardized meals to establish a dietary standard for 24 hours. Volunteers were told to fast for 12 hours before reporting for the next visit.

Upon the subsequent admission, women were surveyed about the previous day’s stress. The researchers found that 31 respondents reported at least one previous-day stressor at one visit, and 21 cited stressors at both visits; stressors included arguments, trouble with their children and work-related pressure. Six women didn’t report any stressor at either visit.

During the second appointment, some participants were given a meal of eggs, turkey sausage, biscuits and gravy, which they were told to eat within 20 minutes.

“This is not an extraordinary meal compared to what many of us would grab when we’re in a hurry and out getting some food,” Kiecolt-Glaser said.

For a control comparison to the meal high in saturated fat, other participants were given a meal high in monounsaturated fat, which is linked to a range of health benefits.

“We suspected that the saturated fat would have a worse impact on metabolism in women, but in our findings, both high-fat meals consistently showed the same results in terms of how stressors could affect their energy expenditure,” said study author Martha Belury, professor of human nutrition at Ohio State.

The researchers found that insulin levels in stressed women shot up right after the high-fat meal was eaten and then, after another 90 minutes, dropped to the levels seen in non-stressed women. They also saw a link between a history of depression in some volunteers and a quicker jump in triglycerides after the meal.

“With depression, we found there was an additional layer. In women who had stress the day before and a history of depression, triglycerides after the meal peaked the highest,” Kiecolt-Glaser said. “The double whammy of past depression as well as daily stressors was a really bad combination.”

While stress and depression may be difficult factors to manage, the study team said their findings point to a need to keep healthy food options nearby during mentally strenuous periods.

“We know we can’t always avoid stressors in our life, but one thing we can do to prepare for that is to have healthy food choices in our refrigerators and cabinets so that when those stressors come up, we can reach for something healthy rather than going to a very convenient but high-fat choice,” Belury said.


Evidence Of Super-fast Deep Earthquake Discovered By Scripps Scientists

University of California – San Diego

Rare high-speed rupture off Russia provides clues about similar phenomena on shallow fault zones near Earth’s surface

As scientists learn more about earthquakes that rupture at fault zones near the planet’s surface—and the mechanisms that trigger them—an even more intriguing earthquake mystery lies deeper in the planet.

Scientists at Scripps Institution of Oceanography at UC San Diego have discovered the first evidence that deep earthquakes, those breaking at more than 400 kilometers (250 miles) below Earth’s surface, can rupture much faster than ordinary earthquakes. The finding gives seismologists new clues about the forces behind deep earthquakes as well as fast-breaking earthquakes that strike near the surface.

Seismologists have documented a handful of these events, in which an earthquake’s rupture travels faster than the shear waves of seismic energy that it radiates. These “supershear” earthquakes have rupture speeds of four kilometers per second (an astonishing 9,000 miles per hour) or more.

In a National Science Foundation-funded study reported in the July 11, 2014, issue of the journal Science, Scripps geophysicists Zhongwen Zhan and Peter Shearer, along with their colleagues at Caltech, discovered the first deep supershear earthquake while examining the aftershocks of a magnitude 8.3 earthquake that struck on May 24, 2013, in the Sea of Okhotsk off the Russian mainland.

Details of a magnitude 6.7 aftershock of the event captured Zhan’s attention. Analyzing data from the IRIS (Incorporated Research Institutions for Seismology) consortium, which coordinates a global network of seismological instruments, Zhan noted that most seismometers around the world yielded similar records, all suggesting an anomalously short duration for a magnitude 6.7 earthquake.

Data from one seismometer, however, stationed closest to the event in Russia’s Kamchatka Peninsula, told a different story with intriguing details.

After closely analyzing the data, Zhan not only found that the aftershock ruptured extremely deeply at 640 kilometers (400 miles) below the earth’s surface, but its rupture velocity was extraordinary—about eight kilometers per second (five miles per second), nearly 50 percent faster than the shear wave velocity at that depth.

“For a 6.7 earthquake you would expect a duration of seven to eight seconds, but this one lasted just two seconds,” said Shearer, a geophysics professor in the Cecil H. and Ida M. Green Institute of Geophysics and Planetary Physics (IGPP) at Scripps. “This is the first definitive example of supershear rupture for a deep earthquake since previously supershear ruptures have been documented only for shallow earthquakes.”

“This finding will help us understand why deep earthquakes happen,” said Zhan. “One quarter of earthquakes occur at large depths, and some of these can be pretty big, but we still don’t understand why they happen. So this earthquake provides a new observation for deep earthquakes and high-rupture speeds.”

Zhan also believes the new information will be useful in examining ultra-fast earthquakes and their potential for impacting fault zones near the earth’s surface. Although not of supershear caliber, California’s destructive 1994 Northridge earthquake had a comparable size and geometry to that of the 6.7 Sea of Okhotsk aftershock.

“If a shallow earthquake such as Northridge goes supershear, it could cause even more shaking and possibly more damage,” said Zhan.

Creating Insecticides To Target Specific Pests Without Harming Beneficial Species

Johns Hopkins Medicine

Development raises possibility of more species-specific insecticides

Using spider toxins to study the proteins that let nerve cells send out electrical signals, Johns Hopkins researchers say they have stumbled upon a biological tactic that may offer a new way to protect crops from insect plagues in a safe and environmentally responsible way.

Their finding—that naturally occurring insect toxins can be lethal for one species and harmless for a closely related one—suggests that insecticides can be designed to target specific pests without harming beneficial species like bees. A summary of the research, led by Frank Bosmans, Ph.D., an assistant professor of physiology at the Johns Hopkins University School of Medicine, will be published July 11 in the journal Nature Communications.

“Most insecticides used today take a carpet-bombing approach, killing indiscriminately and sometimes even hurting humans and other animals,” says Bosmans. “The more specific a toxin’s target, the less dangerous it is for everything else.”

Their finding began with the mistaken inclusion of a protein, called Dc1a, in a shipment sent by the team’s Australian collaborators. The protein was extracted from the venom of the desert bush spider Diguetia canities, which lives in the deserts of the southwestern United States and Mexico and is harmless to humans.

When Bosmans’ Australian collaborators tested the impact of Dc1a on proteins from American cockroaches, the proteins reacted very weakly, so they hadn’t planned on sending Dc1a to Bosmans for further study. But it was accidentally included with other spider venom proteins for Bosmans’ group to test, says Bosmans, so his laboratory did so.

The Bosmans lab studies proteins called sodium channels, which are found in the outer envelope of nerve cells throughout the body. Stimuli, like the acute pressure of hitting your finger with a hammer, are communicated to the proteins, causing them to open their pores so that sodium flows in. The positive charge of sodium causes an electrical signal to be sent down the nerve, eventually reaching the spinal cord and brain so the body can react.

“Sodium channels are the fastest ion channels in the human body and are needed to experience nearly every sensation, so mutations in them can lead to severe disorders of the nerves, muscles and heart,” Bosmans says. That makes them a critical target for scientific study.

To understand the channels better, Bosmans and his team insert the protein’s gene into frog eggs, which are large and easy to study. They can then use electrodes to monitor the flow of sodium into the cells. Adding spider toxins that interfere with the function of the channels sheds light on the channels’ activity, since different toxins inhibit different parts of the protein, causing different effects. In addition to testing human sodium channels, the team sometimes works with sodium channels from insects.

Because his laboratory recently acquired the gene for the German cockroach sodium channel, Bosmans’ team tested Dc1a on the protein and saw a startling increase in the channels’ activity. “Sodium poured into the cells. In a bug, that would cause massive seizures, much like being electrocuted,” says Bosmans. “Luckily, the toxin doesn’t act on human sodium channels.”

Curious about the difference between the two cockroach species’ channels, they first identified the region of the channel that the toxin targets, but it turned out to be exactly the same in the two bugs. Digging deeper, they found a region nearby that differed by just two amino acids, the basic building blocks of the proteins. When mutations were made in the German version so that its amino acids were the same as the American version’s, the German cockroach sodium channel reacted like the American one.
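
That comparison step lends itself to a simple illustration. The snippet below is a toy sketch of scanning two otherwise-identical channel regions for differing residues; the sequences are invented stand-ins, not the real cockroach sodium channel sequences:

```python
# Toy illustration of the comparison described above: find the positions where
# two nearly identical channel regions differ. Sequences are invented examples.

german_region = "LQVATFKGWMDIMY"
american_region = "LQVATFEGWMDVMY"

differences = [
    (position, german, american)
    for position, (german, american) in enumerate(zip(german_region, american_region))
    if german != american
]
print(differences)  # [(6, 'K', 'E'), (11, 'I', 'V')] -- just two differing residues
```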

The team’s next step is to test the toxin on other insect species to determine its full range. Now that they know how important this region of sodium channels is, Bosmans says, researchers will know to look for mutations there as they try to find the mechanism for various human disorders. It may also be possible to create drugs that block access to the site in overactive sodium channels.

Pornography Found To Have Addiction-Like Effect On Brains Of Hypersexual People

redOrbit Staff & Wire Reports – Your Universe Online

When viewing pornographic material, people with compulsive sexual behavior experience brain activity similar to that experienced by drug addicts, according to a University of Cambridge study appearing in the July 11 edition of the journal PLOS ONE.

The results of the study lend evidence that the condition, which is also known as hypersexuality or sex addiction, should be viewed as a legitimate mental health disorder and considered for inclusion in the American Psychiatric Association’s handbook of mental disorders (the DSM-5), explained Live Science staff writer Jillian Rose Lim.

While the researchers themselves advise that their findings do not necessarily indicate that pornography itself is addictive, there are observable differences between the brains of patients who have been diagnosed with compulsive sexual behavior and those who have not: the study indicates that members of the former category experienced heightened activity in the same regions of the brain affected when addicts use drugs.

According to BBC News, the Cambridge researchers used functional magnetic resonance imaging (fMRI) to scan the brains of 19 male volunteers, all of whom were described as being obsessed with sexual thoughts and behaviors, while they viewed pornographic videos.

When compared to the brains of healthy individuals, the study authors found that so-called sex addicts had higher levels of activity in three parts of the brain: the ventral striatum, dorsal anterior cingulate and the amygdala, said James Gallagher, health editor of the BBC website. Furthermore, the patients were found to have started watching pornography at earlier ages and in higher proportions than the healthy volunteers.

“The patients in our trial were all people who had substantial difficulties controlling their sexual behavior and this was having significant consequences for them, affecting their lives and relationships,” corresponding author Dr. Valerie Voon explained in a statement. “In many ways, they show similarities in their behavior to patients with drug addictions. We wanted to see if these similarities were reflected in brain activity, too.”

“This is the first study to look at people suffering from these disorders and look at their brain activity, but I don’t think we understand enough right now to say it is clearly an addiction,” she added in an interview with Gallagher. “We don’t know if some of these effects are predispositions, meaning that if you have greater activity in these areas are you more likely to develop these behaviors or if it is an effect of the pornography itself – it’s very difficult to tell.”

UCLA psychologist Rory Reid, who was not involved in the study, told Lim that compulsive sexual behavior is defined as having an excessive preoccupation with sexual intercourse, and using it as a method of coping with stress or difficult experiences.

He called the Cambridge study a significant step forward in associating hypersexuality with addiction, noting that the MRI scans demonstrate that the brains of these individuals “confirm high sexual desire in regions we might expect.” However, he also pointed out that the study does not confirm whether or not sex is addictive, and if these particular individuals have such an addiction.

Additional evidence is required to answer those questions, Reid added. Likewise, Dr. Voon and her colleagues told Live Science that their findings must be replicated, and that further research will be required before compulsive sexual behavior can definitively be identified as an addiction or psychological disorder.

Organic Crops Found To Contain More Nutrients, Less Pesticide Residue

redOrbit Staff & Wire Reports – Your Universe Online

An in-depth comparison of the nutritional quality and safety of different food production methods has revealed that organic crops contain more antioxidants, as well as fewer and less frequent pesticide residues.

Lead investigator Carlo Leifert, a professor at the Newcastle University School of Agriculture in the UK, and an international team of colleagues examined 343 peer-reviewed publications that compared the nutritional quality and safety of both organic and conventionally grown fruits, vegetables and grains.

The study authors used meta-analysis techniques to quantify differences between the two types of food production. Most of the papers they reviewed looked at crops that had been grown in the same region using the same soil, reducing the number of variables in terms of nutritional content and safety parameters.

Leifert’s team discovered there were multiple nutritional benefits associated with the methods used to produce crops. For instance, they said that a plant grown on a conventionally-managed field would typically be exposed to high levels of synthetic nitrogen, and would convert the additional resources into sugar and starch production.

When that occurs, the harvested portion of the plant is likely to contain lower amounts of antioxidants and other beneficial nutrients. In all, organic crops typically had between 18 percent and 69 percent higher antioxidant concentrations, and people who consume this type of food exclusively would receive an extra 20 percent to 40 percent of these nutrients – roughly equal to two additional daily servings of fruits or veggies.
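
The serving arithmetic is simple to check. In the sketch below, the five-servings-a-day baseline is an assumption chosen purely for illustration, not a figure from the study:

```python
# Rough check of the serving-equivalence claim above.

baseline_servings = 5.0             # assumed typical daily fruit/vegetable servings
extra_low, extra_high = 0.20, 0.40  # reported extra antioxidant intake from organic food

print(f"Low estimate:  {baseline_servings * extra_low:.1f} extra serving-equivalents")
print(f"High estimate: {baseline_servings * extra_high:.1f} extra serving-equivalents")
# Prints 1.0 and 2.0, bracketing the "two additional daily servings" cited above.
```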

[ Video: Major Study Documents Nutritional And Food Safety Benefits Of Organic Farming ]

In addition to identifying nutritional and food safety benefits of organic farming, Leifert’s team also found that the quality and reliability of comparison studies on the topic had vastly improved over the past few years, leading to the discovery of significant differences in both categories that had gone undetected by earlier studies.

For instance, the meta-analysis included the results of a 2010 Washington State University project which compared conventional and organic strawberries grown in California. In a statement, the author of that study, John Reganold, said it was “impressive” and that the “major nutritional findings” were “similar to those reported” in his paper.

The study also found that, without being exposed to chemical pesticides, organic plants and crops tended to produce a greater amount of phenols and polyphenols in order to defend themselves from pest attacks. When consumed by people, phenols and polyphenols can help prevent coronary heart disease, stroke, some forms of cancer and other ailments that are at least partially the result of oxidative damage, the study authors explained.

Furthermore, pesticide residues were reportedly up to four times more likely in conventional foods than in organic ones, as organic methods do not allow synthetic pesticides to be applied to crops. Crops harvested from organically managed fields were not entirely free of pesticides: they did occasionally carry residues, but at levels 10 to 100 times lower than those found on the corresponding conventionally grown fruits, vegetables or grains.

“In a surprising finding, the team concluded that conventional crops had roughly twice as much cadmium, a toxic heavy metal contaminant, as organic crops,” WSU explained. “The leading explanation is that certain fertilizers approved for use only on conventional farms somehow make cadmium more available to plant roots. A doubling of cadmium from food could push some individuals over safe daily intake levels.”

For the most part, however, WSU researcher and study co-author Charles Benbrook said that the study tells “a powerful story of how organic plant-based foods are nutritionally superior and deliver bona fide health benefits.” He added that he and his colleagues “learned valuable lessons from earlier reviews on this topic, and we benefited from the team’s remarkable breadth of scientific skills and experience.”

In a separate statement, Dr. Jessica Shade, Director of Science Programs for The Organic Center (TOC), said: “This is a ground-breaking study. This important research should help greatly to dispel consumer confusion about the benefits of organic. The nutritional differences between conventional and organic crops have always been a much debated topic. This significant study reevaluates the issue from a more inclusive, statistically accurate standpoint and strongly shows that organic fruits and vegetables have definite health benefits to conventionally grown products.”

The study results are scheduled to be published in the British Journal of Nutrition (available free of charge) on July 15, 2014.

10 Causes Of Fibromyalgia That Get Overlooked

Fibromyalgia (FMS) is a neurological condition characterized by a variety of symptoms, but most pertinently, by widespread pain in “tender points” throughout the body. The causes of the syndrome are largely a mystery, which makes both diagnosis and treatment a challenge. Nonetheless, the syndrome is widespread, affecting more than 10 million people in the United States alone.

Because conventional medicine hasn’t discovered how fibromyalgia works, treatment involves a certain amount of trial and error. Even diagnosing the syndrome can be a challenge, and once it is diagnosed, your health care provider will likely combine drugs, natural remedies and other measures to give you some relief. Below we list some causes of the syndrome that both you and your doctor might overlook.

1. Gluten intolerance

Severe intolerance of gluten is known as celiac disease, which is actually an autoimmune disorder. Gluten is the protein in wheat and certain other grains, and intolerance is almost completely determined by genetics: you are far more likely to have celiac disease if it runs in your family.

While the disease is obviously related to what you eat, it doesn’t necessarily present as a digestive issue. Rather, because it is autoimmune in nature, it often presents as a neurological condition. Patients might have pain, cognitive impairment, sleep disturbances, behavioral issues, fatigue and depression. These are much the same symptoms as fibromyalgia, so the two conditions can easily be confused.

In fact, gluten intolerance has been linked to 55 different diseases, so your symptoms may well stem from a dietary issue.

2. Candida overgrowth

Candida is a natural and helpful part of the flora of our intestines, where it usually lives in very small amounts. However, when you get too much sugar in your diet, this yeast overgrows. In North America, excess sugar can be very difficult to avoid: not only do you have to be aware of the sugar you add, but many foods already contain a great deal of added sugar.

When there’s too much candida, this normally beneficial yeast breaks down the wall of the intestines and penetrates the bloodstream, releasing toxic by-products into your body and causing a host of unpleasant symptoms such as brain fog, fatigue, digestive issues and pain. These symptoms will no doubt sound very familiar to anyone suffering from fibromyalgia. Again, your suffering might simply be a result of diet.

3. Thyroid

Symptoms of hypothyroidism and fibromyalgia can be very similar. Many of the neurological complaints that come with fibromyalgia also occur in hypothyroidism, including brain fog, sleep issues and depression. The major difference, of course, is that fibromyalgia sufferers also have pain, while hypothyroid patients do not.

Nonetheless, it’s important that your doctor make certain your thyroid is working properly, so that hypothyroidism can be ruled out. Thyroid function is checked through a blood test, which will often come back normal for a person with fibromyalgia. What that might mean is that the hormone levels the test measures are just a little outside the norm. Therefore, your doctor will need to check for “optimal” levels rather than merely standard ones.

4. Vitamin deficiencies

Fibromyalgia sufferers may also be deficient in various vitamins and minerals. The most common deficiencies among them are magnesium, vitamin D and B12, and all of these levels can be read from a standard blood test. Any of these nutrients can easily be supplemented with pills bought in a grocery or drugstore.

5. Small Intestine Bacterial Overgrowth (SIBO) and Leaky gut

Our bodies play host to more bacteria than we have cells. The vast majority of these bacteria are beneficial and, indeed, essential to the proper functioning of our bodies. When the bacterial population is thrown out of balance, whether by antibiotics or by too much sugar in our diet, we have difficulty absorbing nutrients.

An imbalance in intestinal flora or bacteria can lead to SIBO or a condition called leaky gut. Both can be caused by too much gluten, and these conditions can lead to gluten and other food intolerances. These can, in turn, cause you to suffer from fibromyalgia. Along with problems caused by candida overgrowth, this shows the importance of dealing with digestive issues in anyone with fibromyalgia.

6. Mycotoxins

Fibromyalgia can be brought on by environmental factors as well as issues within the body itself. Mycotoxins are toxic substances released by molds in the environment. These toxins can also cause you to begin suffering fibromyalgia symptoms. Elevated levels can be determined from a simple blood test.

If it’s found that this is the issue in your case, the solution might be as simple as moving from your office to a less moldy area, and getting treated for the toxins you’ve already been exposed to. Of course, if the mold is in your home, it might be a more complicated process to remove it from your environment.

7. Mercury toxicity

Many people have silver fillings in their teeth. These fillings are actually about half mercury, and every time you chew, grind your teeth or have a cleaning, they release some mercury into your body. There are also sources of mercury in other places in the environment. Mercury can be found in fertilizers, pesticides, car exhaust, drinking water, fabric softeners, fish, paint pigments, floor waxes, polishes, batteries, mascara, body powder and air conditioning filters.

Mercury poisoning can cause fibromyalgia along with a variety of other diseases and syndromes, including Alzheimer’s, asthma, arthritis, ALS, Crohn’s disease, diabetes, acrodynia, emphysema, eczema and autism. Unfortunately, testing for mercury in the body isn’t as simple as a blood test, since mercury is locked in tissues rather than floating in the blood. However, if you have silver fillings, it’s worth investigating.

8. Adrenal fatigue

Adrenal fatigue is a slowing of the adrenal glands to the point where it causes you difficulty. It is often the result of long-term stress, but it can also arise from long-term infections. It produces a feeling of chronic exhaustion, so patients often self-medicate with coffee, colas or other stimulants to get themselves going.

Fibromyalgia and adrenal fatigue often go together, and people who suffer from FMS will often also have at least a certain amount of adrenal gland malfunction. It isn’t clear whether the prolonged stress of FMS pain has caused the adrenal fatigue, or whether the fatigue caused the FMS in the first place, but what is clear is that the two syndromes often go together. Measures to support adrenal function can help with some aspects of FMS.

9. MTHFR mutations

Another overlooked cause is mutation of a particular gene called MTHFR. This gene regulates the production of an enzyme that processes folate so that the body can use it. Dysfunction in the gene disrupts your normal process of methylation, which affects many essential bodily functions and can thereby lead to FMS.

However, this mutation can be worked around to some extent. Dietary supplements can be taken to substitute for some of the nutrients that you aren’t getting because of problems with your MTHFR gene.

You can find out the condition of your MTHFR gene through a genetic test that any conventional lab can provide.

10. Glutathione deficiency

Glutathione (pronounced “gloota thigh own”) is an essential part of the body’s immune system. It is the primary agent responsible for detoxification and is also the most important antioxidant in cells. It is a small protein produced naturally by the cells, and is the basis of a healthy immune response and the ability to control pain. Unfortunately, it is common for FMS sufferers to be deficient in this protein.

This protein cannot simply be supplemented in pill form, but needs to be produced in the cells themselves. Therefore, it is necessary to increase the glutathione precursors in the diet. This can be done either through the food you eat, or by consuming supplements of the necessary nutrients.

As you can see, a wide variety of things may be causing your fibromyalgia. Because there is no simple diagnostic test for FMS, it will be incumbent on you and your doctor to investigate the different options. Note, too, that many of these conditions come down to diet: either too little of an important nutrient or too much of a harmful substance. Either way, regulating your diet is as important as anything your doctor can do for you. Diet is not the only concern, but it plays an important part in dealing with FMS.

Further Reading

“What Primary Physicians Should Know about Environmental Causes of Illness.” By William J. Rea, MD. Virtual Mentor, American Medical Association, Vol. 11, No. 6, pp. 473-476. http://virtualmentor.ama-assn.org/2009/06/oped1-0906.html

“Glutathione (GSH).” By Jimmy Gutman, MD. American Healthcare Foundation. http://www.americanhealthcarefoundation.org/fibromyalgia-md/GSH.cfm

“10 Causes Of Fibromyalgia Your Doctor Doesn’t Know About.” By Dr. Amy Myers. http://www.mindbodygreen.com/0-10103/10-causes-of-fibromyalgia-your-doctor-doesnt-know-about.html

“Adrenal Function in Fibromyalgia.” AdrenalFatigue.org. http://www.adrenalfatigue.org/fibromyalgia

MIT Develops New System That Lets Users Decide What Data Is Shared

Brett Smith for redOrbit.com – Your Universe Online

While issues of personal data security have mostly revolved around actions of the US federal government, vast amounts of personal data are also being collected by corporate entities such as Amazon and Netflix.

In pursuit of protecting individual privacy, researchers at MIT have developed a system called openPDS that allows individual users to decide what data they want to share and what data stays unavailable.

According to a report published in the journal PLOS ONE, openPDS would store personal data from a person’s devices in a single specific location, such as an encrypted server or a personally owned computer. Any company or study team that wants to use this data would have to query the person’s database, which returns only the minimum information needed to answer the question.

“The example I like to use is personalized music,” said study author Yves-Alexandre de Montjoye, a graduate student in media arts at MIT. “Pandora, for example, comes down to this thing that they call the music genome, which contains a summary of your musical tastes. To recommend a song, all you need is the last 10 songs you listened to — just to make sure you don’t keep recommending the same one again — and this music genome. You don’t need the list of all the songs you’ve been listening to.”
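
The architecture is easy to picture in miniature. The sketch below is a minimal illustration of that idea, with every name hypothetical; it is not the actual openPDS API, only the pattern of answering approved questions while keeping the raw data at home:

```python
# Minimal sketch of the openPDS idea: third parties submit questions and get
# back computed answers, never the raw data store. All names are hypothetical.

class PersonalDataStore:
    def __init__(self):
        self._listening_history = []  # raw data never leaves this object

    def log_song(self, song_id, genre):
        self._listening_history.append((song_id, genre))

    def answer(self, query):
        """Return the minimal answer to an approved query."""
        if query == "last_10_songs":
            return [song for song, _ in self._listening_history[-10:]]
        if query == "favorite_genre":
            genres = [genre for _, genre in self._listening_history]
            return max(set(genres), key=genres.count) if genres else None
        raise PermissionError(f"Query not approved by user: {query!r}")

pds = PersonalDataStore()
pds.log_song("song_a", "salsa")
pds.log_song("song_b", "pop")
pds.log_song("song_c", "pop")
print(pds.answer("favorite_genre"))  # 'pop' -- a summary, not the full history
```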

The MIT researchers said one of the biggest benefits of the system is that it would require applications to specify what information they need, instead of the current arrangement that simply informs a user that their data is being accessed and used.

“When you install an application, it tells you ‘this application has access to your fine-grained GPS location,’ or it ‘has access to your SD card’,” de Montjoye said in a statement. “You as a user have absolutely no way of knowing what that means. The permissions don’t tell you anything.”

Because they can, most applications collect much more information than they actually need. The thinking is: what may have seemed like superfluous information in the past could be highly valuable in the future.

The openPDS system still stores all potentially useful data, but it stays with the user rather than the application maker or service provider, such as Netflix. A developer who discovers that a previously unwanted bit of data is beneficial must ask the user for access to it. If the inquiry seems needlessly intrusive, the user can simply reject it.

The openPDS developers conceded that an entity could game the system in its current state by requesting seemingly innocuous bits of data and piecing together a person’s identity. In order to safeguard against this type of privacy invasion, users would have to enact measures on a case-by-case basis. However, the novel openPDS system is a work in progress and a step up from the current situation, the MIT developers said.

“If we manage to get people to have access to most of their data, and if we can get the overall state of the art to move from anonymization to interactive systems, that would be such a huge win,” de Montjoye said.

Foods Rich In Antioxidants May Promote Cancer Growth

Rebekah Eliason for redOrbit.com – Your Universe Online

Around the globe, health-conscious people have for decades sought out antioxidant supplements and eaten diets rich in antioxidants in an attempt to live long, healthy lives. Surprisingly, recent clinical trials have dashed the hopes of those taking antioxidant supplements to reduce the risk of cancer.

In almost all antioxidant trials, there have been no protective effects against cancer. Among several of the trials, researchers found a link between the supplements and an increased risk of certain types of cancer. One specific trial found that smokers who take extra beta-carotene increased their risk of lung cancer instead of lowering the risk.

In a brief paper, the study authors propose reasons why antioxidant supplements might not reduce the development of cancer and why they may actually be harmful.

The team’s conclusions are based on recent advances in understanding the system in our cells that naturally balances oxidizing and anti-oxidizing compounds. These chemical compounds take part in reduction-oxidation (“redox”) reactions, which are essential to cellular chemistry.

Within cells, essential oxidants such as hydrogen peroxide are manufactured. It is understood that in large quantities oxidants are toxic, but cells naturally create their own antioxidants to neutralize them and keep the balance. Many people decided to give the body a boost in this process by taking lots of antioxidants to counter the toxic effects of hydrogen peroxide and other “reactive oxygen species,” or ROS, as they are called by scientists. Since cancer cells are known to generate higher ROS levels to feed their growth, people were all the more anxious to help the body neutralize the toxic species.

Study leaders David Tuveson, MD, PhD, Cold Spring Harbor Laboratory Professor and Director of Research for the Lustgarten Foundation, and Navdeep S. Chandel, PhD, suggest that taking pills or eating antioxidant foods are ineffective because they do not reach the critical site in cells where the tumor-promoting ROS are produced. Instead, supplements and dietary antioxidants will accumulate in other areas of the cell, “leaving tumor-promoting ROS relatively unperturbed,” the researchers say.

In cancer cells, quantities of both ROS and natural antioxidants are higher; the elevated antioxidants are the cancer cells’ own defense, keeping their high oxidant levels in check so growth can continue. In fact, say Tuveson and Chandel, therapies that raise the levels of oxidants in cells may be beneficial, whereas those that act as antioxidants may further stimulate the cancer cells. Radiation therapy actually kills cancer cells by dramatically raising oxidant levels, and many chemotherapeutic drugs work the same way, killing tumor cells through oxidation.

The authors then suggest, in an interesting paradox, that “genetic or pharmacologic inhibition of antioxidant proteins,” a concept that has been successfully tested in rodent models of lung and pancreatic cancer, might be a helpful therapeutic approach in humans. The key, they say, is to identify the antioxidant proteins and pathways used only by cancer cells and not by healthy cells; if antioxidant production were impeded in healthy cells, the delicate redox balance of normal cellular function would be destroyed.

In further research, the authors propose to profile antioxidant pathways in tumor and adjacent normal cells in hopes of identifying possible therapeutic targets.

This study was published July 10, 2014 in The New England Journal of Medicine.

Hot Days May Increase Odds Of Developing Kidney Stones

Rebekah Eliason for redOrbit.com – Your Universe Online

According to a new study, as daily temperatures rise, more patients seek treatment for kidney stones. Analyzing roughly 60,000 patient records from several US cities with varying climates, the research team found a correlation between hot days and kidney stone presentation.

“We found that as daily temperatures rise, there is a rapid increase in the probability of patients presenting over the next 20 days with kidney stones,” said study leader Gregory E. Tasian, MD, MSc, MSCE, a pediatric urologist and epidemiologist at The Children’s Hospital of Philadelphia (CHOP), who is on the staff of the Hospital’s Kidney Stone Center as well as the Hospital’s Center for Pediatric Clinical Effectiveness (CPCE).

The research team analyzed medical records of both adults and children with kidney stones. Over 60,000 records between the years of 2005 and 2011 were studied along with weather reports in each patient’s area. The team reviewed patient data from Atlanta, Chicago, Dallas, Los Angeles and Philadelphia.

As mean daily temperatures rose above 50 F (10 C), the team discovered, the risk of kidney stones increased in every city except Los Angeles. The delay between a high daily temperature and kidney stone presentation was short, with presentations peaking within three days of exposure to hot days.
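
The lag analysis can be illustrated in miniature. The sketch below uses invented numbers to show the basic idea of counting presentations in the days that follow hot days; the study itself relied on far richer statistical models:

```python
# Toy sketch of a lagged association: for each hot day, count kidney stone
# presentations over the following days and find the peak lag. Data invented.

hot_day_indices = [3, 10]  # days with mean temperature above 10 C (hypothetical)
daily_presentations = [2, 2, 2, 2, 3, 6, 4, 2, 2, 2, 2, 3, 7, 4, 2, 2]

max_lag = 5
lag_totals = {
    lag: sum(
        daily_presentations[day + lag]
        for day in hot_day_indices
        if day + lag < len(daily_presentations)
    )
    for lag in range(max_lag + 1)
}
peak_lag = max(lag_totals, key=lag_totals.get)
print(lag_totals)
print(f"Presentations peak {peak_lag} days after a hot day")  # lag 2 in this toy data
```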

“These findings point to potential public health effects associated with global climate change,” said Tasian.

“However,” he cautions, “although 11 percent of the U.S. population has had kidney stones, most people have not. It is likely that higher temperatures increase the risk of kidney stones in those people predisposed to stone formation.”

Because higher temperatures lead to dehydration, hot days leave people’s urine with higher concentrations of calcium and the other minerals from which kidney stones grow.

Approximately half a million patients in the US seek help in the emergency room for this painful condition each year. Over the past three decades, the number of patients with kidney stones has markedly increased. The condition is more common in adults, but the number of children developing kidney stones has also risen dramatically over the last 25 years. The factors driving the rise are unknown, though suggested explanations include changes in diet and fluid intake. If stones do not pass on their own, surgery becomes necessary.

Additionally, the team discovered that in three cities — Atlanta, Chicago and Philadelphia — extremely low outdoor temperatures also increased the number of kidney stones. According to the authors, this may be a result of people staying indoors in the warmth, changing their diets and reducing their physical activity.

Tasian added that while the five US cities have climates representative of those found throughout the world, future studies should explore how generalizable the current findings are.

Other studies should analyze how risk patterns vary in different populations, including among children, who were represented by only a small sample in the current study.

In addition to analyzing kidney stones, this study was intended to see how patterns of global warming affect health. “Kidney stone prevalence has already been on the rise over the last 30 years, and we can expect this trend to continue, both in greater numbers and over a broader geographic area, as daily temperatures increase,” concluded Tasian. “With some experts predicting that extreme temperatures will become the norm in 30 years, children will bear the brunt of climate change.”

This study was published in Environmental Health Perspectives.

New Evidence Refutes The ‘Birds Evolving From Dinosaurs’ Theory

Gerard LeBlond for redOrbit.com – Your Universe Online

It has long been held that modern birds evolved from dinosaurs millions of years ago. However, evidence from a new study published in the Journal of Ornithology challenges this common belief.

After re-examining a bird-like fossil from China, researchers Stephen Czerkas of The Dinosaur Museum in Blanding, Utah, and Alan Feduccia of the University of North Carolina concluded that it was not a dinosaur, as first thought, but a tree-climbing animal that could glide.

The fossil in question is a Scansoriopteryx — meaning “climbing wing” — that was found in Inner Mongolia. It had been classified as a coelurosaurian theropod dinosaur, part of the group from which modern birds were believed to have evolved. The new evidence emerged when the researchers used advanced 3D microscopy, high-resolution photography and low-angle lighting to reveal bone structures not previously visible. The pelvis, forelimbs, hind limbs and tail were confirmed, and elongated tendons on the tail vertebrae, similar to those of a Velociraptor, were discovered.

The duo say the skeleton lacks the features essential to classify it as a dinosaur, leading them to suggest that birds did not evolve from dinosaurs. Instead, Scansoriopteryx should be classified as an early bird whose ancestors were tree-climbing archosaurs that lived before the dinosaurs existed.

The new findings reveal many non-dinosaurian traits alongside distinct birdlike features: elongated forelimbs, wing and hind-limb feathers, wing membranes, birdlike perching feet and claws for tree climbing. This evidence suggests Scansoriopteryx was an early form of bird that developed the ability to glide from tree to tree.

In the early 1900s, some researchers predicted that birds’ ancestors were small tree-dwelling archosaurs with the ability to fly, or at least glide. The new evidence stands in direct contrast to the previously favored “ground up” view of bird evolution, and a “trees down” view is now being applied instead.

“The identification of Scansoriopteryx as a non-dinosaurian bird enables a reevaluation in the understanding of the relationship between dinosaurs and birds. Scientists finally have the key to unlock the doors that separate dinosaurs from birds,” explained Czerkas.

“Instead of regarding birds as deriving from dinosaurs, Scansoriopteryx reinstates the validity of regarding them as a separate class uniquely avian and non-dinosaurian,” Feduccia added.

People Increasingly Drawn To YouTube For Skin Cancer Information

Brett Smith for redOrbit.com – Your Universe Online

While YouTube might be a popular destination for viral videos and sports highlight reels, it’s also useful for raising awareness of skin cancer and prevention techniques, according to a report in the Dermatology Online Journal.

“No matter what field you’re in, social media is the future of how we communicate around the world,” said study author Chante Karimkhani, a doctoral candidate at the University of Colorado’s Cancer Center.

The researchers investigated YouTube search terms related to dermatology, such as “sun protection,” “skin cancer,” “skin cancer awareness,” and “skin conditions.” The results included 100 videos with a collective 47 million views. The video clips were shared more than 100,000 times and drove over 6,300 subscriptions to distinct YouTube channels.

Of the videos returned for “skin cancer,” for instance, 25 percent were academic and another 25 percent were what the scientists described as “complementary and alternative medicine videos.” Overall, just 35 percent of videos across all dermatology search terms were uploaded by or included a biomedical expert.

The CU researchers said these results point to a new opportunity to communicate research directly to the public.

“It used to be that researchers and journals depended on independent media to interpret their findings for the public. It could be a little like a game of telephone,” Karimkhani said. “Now through social media, journals can have their own presence – their own mouthpiece directly to the public that may include patients or health care providers or even other researchers.”

The research also pointed to the tanning industry’s presence on YouTube, suggesting that its viewers could be a receptive audience for dermatology information as well.

The scientists said that as more academic institutions, scientists and journals acknowledge the promise and take on the challenge of social media, information straight from these reputable and well-meaning sources could shift the popular conversation.

The need for greater education was on full display last month, when dermatologists conducted more than 710 skin cancer screenings at the Aspen Ideas Festival (AIF) and found 190 precancerous lesions, 80 atypical moles and 89 potential non-melanoma skin cancers.

Mount Sinai Health System in New York City, which sent the dermatologists, said the festival made for an ideal screening location, given its high altitude, thinner atmosphere and higher levels of UV radiation.

“Skin cancer is the most common form of cancer in the world, but also one of the most preventable and treatable forms,” said Dr. Mark Lebwohl, chairman of the department of dermatology at the Icahn School of Medicine at Mount Sinai. “When caught early the cure rate is nearly 100 percent. In addition, our department is working on innovations that we hope will identify skin cancers earlier and treat them more effectively.”

“A lot of people said they haven’t been screened in years, which was surprising because some had a family history of melanoma,” said Dr. Rita Linker, who participated in the screenings and found one potential melanoma. “I was also surprised about the amount of sun damage among young people.”

Researchers Develop Novel Method To Observe Mysterious Photosynthesis Process

Brett Smith for redOrbit.com – Your Universe Online

In the pursuit of renewable energy sources, scientists have long been trying to understand the exact mechanism behind photosynthesis, and now a large team has captured detailed “snapshots” of the process using a powerful X-ray laser, according to a report in Nature Communications.

“An effective method of solar-based water-splitting is essential for artificial photosynthesis to succeed but developing such a method has proven elusive,” said study author Vittal Yachandra, a chemist with the Lawrence Berkeley National Laboratory.

“The water-splitting process is known to be divided into four steps,” noted study author Henry Chapman, a professor at the University of Hamburg and a member of the Hamburg Center for Ultrafast Imaging (CUI). “But no one has actually seen these four steps.”

The researchers investigated photosystem II, a protein complex containing a manganese-calcium cluster that catalyzes the cycle yielding molecular oxygen when energized by solar photons. To observe the process, the team grew minuscule nano-crystals of the photosystem II complex using photosynthetic bacteria.

These crystals were then lit up with a powerful laser to spark the water-splitting process. The scientists used double light flashes to induce the transition from state S1 to state S3, as this transition was expected to show the most important dynamics. They then watched how the molecular structure of the photosystem II complex changed during the entire process.
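
The two-flash choice follows from the standard picture of the oxygen-evolving cycle, in which dark-adapted photosystem II sits mostly in state S1 and each saturating flash advances the cycle by one step. The sketch below is a toy rendering of that textbook cycle, not code from the study, and shows why two flashes land the system in S3:

```python
# Toy model of the S-state (Kok) cycle: each flash advances the state by one,
# and reaching S4 releases O2 and resets the cycle to S0.

def advance(state, flashes):
    """Advance an S-state (0-4 cycle) by a number of saturating light flashes."""
    for _ in range(flashes):
        state += 1
        if state == 4:        # S4 splits water, releases O2...
            print("O2 released")
            state = 0         # ...and the cycle resets to S0
    return state

print(advance(1, 2))  # two flashes: S1 -> S2 -> S3, as in the experiment
print(advance(1, 3))  # a third flash would reach S4 and release oxygen
```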

“We were surprised by the large conformational changes we could witness,” said team member Petra Fromme, a bio-physical chemist from Arizona State University. “Actually, the changes are so large that there is an overall structure change, which even changes the dimensions of the unit cell, the smallest building block in a crystal.”

The division of water during photosynthesis is a catalytic process, which means photosystem II enables the reaction without being spent. Catalysis plays a major role in many areas of chemistry.

“The technique we employed has a huge potential not only for photosynthesis, but for catalysis in general,” Fromme said. “If you would be able to observe all steps of a catalytic reaction, you would be able to optimize it.”

“Our study also proves that molecular movies of biochemical processes are possible with an X-ray free-electron laser,” Chapman said.

The research team triggered the reaction numerous times as they monitored it with precisely delayed X-ray flashes. The process produced a series of still frames that can be combined into a molecular movie.

“Such a movie can reveal the ultrafast dynamics of chemical reactions,” Chapman added. “But we still need to get to higher resolution first.”

The researchers said they are getting “tantalizingly close” to being able to engineer photosynthesis.

“This is a major step toward the goal of making a movie of the molecular machine responsible for photosynthesis, the process by which plants make the oxygen we breathe, from sunlight and water,” explained team member John Spence, an ASU professor of physics and scientific leader of the National Science Foundation funded BioXFEL Science and Technology Center.

Insecticides Linked To Farmland Bird Population Declines

Brett Smith for redOrbit.com – Your Universe Online

When they were introduced in the 1990s, a class of insecticides called neonicotinoids was thought to be highly targeted toward pests, but mounting evidence suggests they have had a negative effect on the larger ecosystem.

A new study in the journal Nature has found that use of neonicotinoids is linked to a decline in the populations of farmland birds across Europe.

For the study, scientists from Radboud University in the Netherlands and the Dutch Centre for Field Ornithology and Birdlife Netherlands (SOVON) analyzed long-term data for both farmland bird populations and chemical concentrations in surface water. They discovered that in locations where water held high concentrations of imidacloprid, a common neonicotinoid, bird populations declined by an average of 3.5 percent per year.

“In ten years it’s a 35 percent reduction in the local population, it’s really huge,” study author Hans de Kroon from Radboud University told Matt McGrath of BBC News. “It means the alarm bells are on straight away.”
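
Treated as a simple sum, ten years of 3.5 percent annual losses give the quoted 35 percent; compounded year on year, the decline works out closer to 30 percent, still a dramatic drop. A quick check of the arithmetic:

```python
# Compounding a 3.5 percent annual decline over ten years.

annual_decline = 0.035
remaining = (1 - annual_decline) ** 10
print(f"Remaining after 10 years: {remaining:.1%}")  # ~70.0%
print(f"Compounded decline: {1 - remaining:.1%}")    # ~30.0% (vs ~35% as a simple sum)
```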

The study team said the insecticide is probably coating seeds that the birds like to eat – as well as leaching into both water and soil around the sprayed areas. They added that neonicotinoids can persist in the environment for up to three years.

“They might be less able to produce their young, or grow their young,” said lead author Caspar Hallman from Radboud University referring to the disappearing farmland birds. “It might increase their mortality by food deprivation, we think this is the most likely mechanism.”

The Dutch researchers also looked to see if the declines in population had started before the widespread use of neonicotinoids and if land use played a role in the declines. Neither was found to be a significant factor.

“I think that the information we have been able to provide is exactly the information that was missing in most studies,” de Kroon said. “In that sense this could be the definite smoking gun, you now see the evidence getting more complete, around the effects of imidacloprid on the environment.”

Bayer, which makes imidacloprid, said the new study doesn’t show a “causal link” between the chemical and the drop in bird populations.

“Neonicotinoids have gone through an extensive risk assessment which has shown that they are safe to the environment when used responsibly according to the label instructions,” a spokesman told the BBC. “Birds living close to aquatic habitats – the species that one could expect to be affected most by concentrations of neonicotinoids in surface water – show no or negligible negative impact.”

“Neonicotinoids were always regarded as selective toxins. But our results suggest that they may affect the entire ecosystem,” de Kroon said. “This study shows how important it is to have good sets of field data, and to analyze them rigorously.”

“Thanks to our partnership with organizations such as Sovon, we can discover ecological effects that would otherwise be overlooked,” he added.

The new Dutch study adds to other research linking the use of neonicotinoids to significant declines in bee populations.

DARPA’s Restoring Active Memory Program Poised To Launch

Alan McStravick for redOrbit.com – Your Universe Online

Just over a year ago I reported on the announcement of the BRAIN (Brain Research through Advancing Innovative Neurotechnologies) Initiative, the scientific cousin to the wildly successful Human Genome Project.

In his 2013 State of the Union address, President Obama announced the new initiative, saying, “Every dollar we invested to map the human genome returned $140 to our economy – every dollar. Today our scientists are mapping the human brain to unlock the answers to Alzheimer’s. They’re developing drugs to regenerate damaged organs, devising new materials to make batteries 10 times more powerful. Now is not the time to gut these job-creating investments in science and innovation.”

A year after that ambitious announcement, redOrbit detailed some of the early work already commencing in the field of anxiety and depression, work aimed initially at helping members of our armed services who have returned from battle theaters in Iraq and Afghanistan suffering from post-traumatic stress disorder (PTSD) and traumatic brain injuries (TBI).

The Defense Advanced Research Projects Agency (DARPA) is the arm of the government tasked with identifying academic and private institutions whose research projects can further the BRAIN Initiative. This week DARPA announced it has awarded funding to two more universities to help launch its latest program, called Restoring Active Memory (RAM).

According to a statement by DARPA, the University of California, Los Angeles (UCLA) and the University of Pennsylvania (Penn) will serve as leaders of a multidisciplinary team tasked with developing and testing electronic interfaces that can sense memory deficits as a result of an injury. The devices are also intended and expected to help restore normal brain function to the affected area.

UCLA will receive $15 million and Penn $22.5 million. The full funding, spread over a four-year period, is contingent on both institutions consistently meeting a series of technical milestones. An additional $2.5 million grant has been earmarked for the Lawrence Livermore National Laboratory to develop an implantable neural device intended as a supplement to the UCLA-led effort.

“The start of the Restoring Active Memory program marks an exciting opportunity to reveal many new aspects of human memory and learn about the brain in ways that were never before possible,” said DARPA Program Manager Justin Sanchez. “Anyone who has witnessed the effects of memory loss in another person knows its toll and how few options are available to treat it. We’re going to apply the knowledge and understanding gained in RAM to develop new options for treatment through technology.”

As mentioned above, much of the initial benefit is aimed at returning servicemembers who have suffered some form of TBI. However, the injury affects more than those coming home from the war fronts: an estimated 1.7 million US civilians also suffer from TBIs. The condition frequently results in a reduced capacity to form or retain new memories following the injury, as well as difficulty retrieving memories formed before the incident.

Caretakers recognize the insidious nature of TBIs and frequently lament the fact that there are currently no effective therapies in existence to alleviate or mitigate the long-term consequences. DARPA hopes the RAM program will be the necessary driver for the future development of TBI focused technologies.

“We owe it to our servicemembers to accelerate research that can minimize the long-term impacts of their injuries,” Sanchez said. “Despite increasingly aggressive prevention efforts, traumatic brain injury remains a serious problem in military and civilian sectors. Through the Restoring Active Memory program, DARPA aims to better understand the underlying neurological basis of memory loss and speed the development of innovative therapies.”

The goal of the RAM program, specifically, is to develop and test wireless, fully implantable neural-interface medical devices that can be used as “neuroprosthetics.” These devices will serve as a sort of bridge across the TBI induced gaps that interfere with an individual’s ability to encode new memories and retrieve old ones.

The UCLA team will focus on a region of the brain known as the entorhinal area. Previous research done at the university showed human memory could be facilitated by stimulating this portion of the brain, known to be involved in both learning and memory. This region, considered by neuroscientists to be the front door to the hippocampus, is integral in helping situations encountered in one’s day-to-day become lasting memories.

On the other side of the country, the Penn team’s approach will be based on the understanding that memory is the result of complex interactions among multiple regions of the brain. This group of researchers will record the neural activity of patients who have already had electrodes implanted in multiple areas of the brain. The neural activity will be captured as the participants play computer-based memory games. The team will measure the biomarkers of successful memory function – patterns of activity that accompany the successful formation of new memories and the successful retrieval of old ones. This early work by the Penn team will lead to a better understanding of the brain and how brain stimulation therapy can possibly aid in restoring normal brain function following injury or the onset of a neuropsychological illness, like Alzheimer’s disease.

The RAM program, in addition to its human trials, will also support animal studies meant to advance the state-of-the-art quantitative models that currently account for the encoding and retrieval of complex memories and memory attributes.

DARPA’s neuroscience efforts in the BRAIN Initiative, of which RAM is a part, are informed by an independent panel on ethical, legal and social implications (ELSI). The panel provides oversight that supplements the institutional review boards governing human clinical studies and animal use on the lead institutions’ campuses.

Low Back Pain Not Caused By The Weather

Wiley

Australian researchers reveal that sudden, acute episodes of low back pain are not linked to weather conditions such as temperature, humidity, air pressure, wind direction and precipitation. Findings published in Arthritis Care & Research, a journal of the American College of Rheumatology (ACR), indicate that the risk of low back pain increases slightly with higher wind speed or wind gusts, but the increase was not clinically significant.

According to the World Health Organization (WHO), nearly everyone experiences low back pain at some point in life, making it the most prevalent musculoskeletal condition, affecting up to 33% of the world population at any given time. Many people with musculoskeletal (bone, muscle, ligament, tendon and nerve) pain report that their symptoms are influenced by the weather, and previous studies have shown that cold or humid weather, and changes in the weather, increase symptoms in patients with chronic pain conditions.

“Many patients believe that weather impacts their pain symptoms,” explains Dr. Daniel Steffens of the George Institute for Global Health at the University of Sydney, Australia. “However, there are few robust studies investigating weather and pain, specifically research that does not rely on patient recall of the weather.”

For the present case-crossover study, 993 patients seen at primary care clinics in Sydney were recruited between October 2011 and November 2012, and weather data from the Australian Bureau of Meteorology were sourced for the duration of the study period. Researchers compared the weather at the time patients first noticed back pain (the case window) with weather conditions one week and one month before the onset of pain (the control windows).
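
The logic of the design can be shown in miniature. The sketch below uses invented temperatures to illustrate the case-crossover comparison, in which each patient serves as his or her own control; the actual analysis used formal statistical models rather than a simple comparison of means:

```python
# Toy sketch of a case-crossover contrast: compare each patient's weather at
# pain onset (case window) with the same patient's weather in earlier control
# windows. All temperatures are invented.

patients = [
    # (temperature at onset, [temperatures in the two control windows]), deg C
    (22.0, [21.5, 22.5]),
    (18.0, [18.5, 17.5]),
    (25.0, [24.0, 26.0]),
]

case_mean = sum(case for case, _ in patients) / len(patients)
control_mean = sum(sum(ctrls) / len(ctrls) for _, ctrls in patients) / len(patients)
print(f"Mean onset-window temperature:   {case_mean:.1f} C")
print(f"Mean control-window temperature: {control_mean:.1f} C")
# Nearly identical means, mirroring the study's null finding for temperature.
```
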
Results showed no association between back pain and temperature, humidity, air pressure, wind direction or precipitation. Higher wind speed and wind gusts did slightly increase the chances of low back pain, but the amount of increase was not clinically important.

“Our findings refute previously held beliefs that certain common weather conditions increase risk of lower back pain,” concludes Dr. Steffens. “Further investigation of the influence of weather parameters on symptoms associated with specific diseases such as fibromyalgia, rheumatoid arthritis, and osteoarthritis are needed.”

Australian Milking Zebu Cattle

The Australian Milking Zebu (AMZ) is a breed of dairy cattle that originated in Australia in the 1950s. It was developed by the Commonwealth Scientific and Industrial Research Organisation (CSIRO) by crossbreeding Sahiwal and Red Sindhi cattle from Pakistan with Jersey cattle. A small amount of Illawarra, Guernsey and Friesian blood was introduced as well.

It was bred specifically for heat tolerance, resistance to cattle tick, and milk production. The AMZ is similar in color and markings to the Jersey, but with the loose skin of the Sahiwal and Red Sindhi. A cow produces approximately 744 gallons of milk per lactation.

Image Caption: Australian Milking Zebu Cattle. Credit: Wikipedia (public domain)

Smallpox Vials Discovered In A Federal Lab Closet

Brett Smith for redOrbit.com – Your Universe Online

Smallpox is a deadly virus that was all but wiped off the face of the Earth by the late 1970s, but on Tuesday the Centers for Disease Control and Prevention (CDC) announced that forgotten vials of smallpox had been found at a federal lab near Washington, DC. This is the second such announcement within the past month.

In a statement, the CDC said that its employees had stumbled upon “vials labeled ‘variola,’ commonly known as smallpox, in an unused portion of a storage room in a Food and Drug Administration (FDA) laboratory.”

The six vials appeared intact, sealed with melted glass and holding freeze-dried smallpox virus. There was no evidence that lab workers or the larger public are at risk, CDC spokesman Tom Skinner told Reuters.

“The vials appear to date from the 1950s,” the CDC statement continued. “Upon discovery, the vials were immediately secured in a CDC-registered select agent containment laboratory in Bethesda, [Maryland].”

The recent discovery follows an incident in which the CDC may have sent live anthrax specimens to a CDC lab that was ill-prepared to handle them, potentially exposing dozens of staff to the pathogen.

Skinner told Reuters that the discovered smallpox vials will be tested to see if the virus is alive and dangerous. After the tests, which may take up to 14 days, the samples will likely be destroyed, he added.

Smallpox has essentially been wiped out for the past 35 years, but the CDC keeps samples of the virus that causes it for research at its facility in Atlanta. Samples of the pathogen are also kept at the State Research Center of Virology and Biotechnology in Novosibirsk, Russia.

The CDC said it had notified the World Health Organization about the discovery and invited the international organization to observe the destruction of the old specimens.

In April, an investigation into historical epidemics suggested that children conceived during an epidemic can be more resistant to other pathogens later in life.

In the study, scientists from the Max Planck Institute for Demographic Research (MPIDR) in Germany found children who were conceived during the wave of measles that hit the Canadian province of Quebec in 1714 and 1715 died significantly less often from smallpox 15 years on compared to children who had been conceived before the measles epidemic.

“We have proved that parents can essentially prepare their children for future diseases,” said study author Kai Willführ, a bio-demographer at Max Planck. “The underlying mechanism is not purely genetic, nor is the children’s resistance restricted to single pathogens.”

Past research has referred to this as a “functional trans-generational effect.” Parents who encountered an increased disease load around conception gave their children protection against the encountered infection, as well as against different illnesses.

“The way children’s bodies fight diseases seems to be optimized for a world with high pathogen load if it was also high at conception,” Willführ said. “It was only during conception and pregnancy that measles could have given an advantage that parents passed on to the next generation.”


Newborn Night-Waking Has A Detrimental Health Impact On Parents

April Flowers for redOrbit.com – Your Universe Online
One of the hardest parts of new parenthood is never getting enough sleep. Once an hour, every night, the baby cries out for food, comfort, or to be cleaned. Most new parents report feeling more exhausted in the morning than they did the night before.
Professor Avi Sadeh led a team of researchers from Tel Aviv University’s (TAU) School of Psychological Sciences to conduct the first study to examine why interrupted sleep can be as physically detrimental as no sleep. The findings, reported in Sleep Medicine, reveal a causal link between interrupted sleep patterns and many negative effects, including compromised cognitive abilities, shortened attention spans, and negative moods. The team, which included Michal Kahn, Shimrit Fridenson, Reut Lerer, and Yair Ben-Haim, found that interrupted sleep is equivalent to no more than four consecutive hours of sleep.
“The sleep of many parents is often disrupted by external sources such as a crying baby demanding care during the night. Doctors on call, who may receive several phone calls a night, also experience disruptions,” said Prof. Sadeh. “These night wakings could be relatively short — only five to ten minutes — but they disrupt the natural sleep rhythm. The impact of such night wakings on an individual’s daytime alertness, mood, and cognitive abilities had never been studied. Our study is the first to demonstrate seriously deleterious cognitive and emotional effects.”
Prof. Sadeh, who directs a sleep clinic at TAU, advises exhausted and desperate parents on how to cope with their children’s persistent night wakings. “In the process of advising these parents, it struck me that the role of multiple night wakings had never been systematically assessed,” he said. “Many previous studies had shown an association, but none had established a causal link. Our study demonstrates that induced night wakings, in otherwise normal individuals, clearly lead to compromised attention and negative mood.”
Student volunteers at TAU’s School of Psychological Sciences participated in the study, where their sleep patterns were monitored at home using wristwatch-like devices. The devices detected both sleep and wake states. First, the students slept a full eight-hour night. The second night, they were awakened four times by phone calls. After each call, they had to complete a small computer task before returning to bed after 10-15 minutes of wakefulness. Following each night, the students were asked to complete certain computer tasks to assess alertness and attention. They also completed a questionnaire to assess their mood. Even after only one night of interruptions, the research team found a direct correlation between compromised attention, negative mood, and disrupted sleep.
“Our study shows the impact of only one disrupted night,” said Prof. Sadeh. “But we know that these effects accumulate and therefore the functional price new parents — who awaken three to ten times a night for months on end — pay for common infant sleep disturbance is enormous. Besides the physical effects of interrupted sleep, parents often develop feelings of anger toward their infants and then feel guilty about these negative feelings.”
“Sleep research has focused in the last 50 years on sleep deprivation, and practically ignored the impact of night-wakings, which is a pervasive phenomenon for people from many walks of life. I hope that our study will bring this to the attention of scientists and clinicians, who should recognize the price paid by individuals who have to endure frequent night-wakings.”
The team is continuing their research by investigating interventions for infant sleep disturbances to decrease the detrimental effects on parents.

Investigating The Impact Behavioral Factors Have On Life Expectancy

redOrbit Staff & Wire Reports – Your Universe Online

Eating healthy, being physically active, limiting alcohol consumption and avoiding cigarettes could add 10 years to your life, public health physicians from the University of Zurich report in what is being called the first study to ever investigate the impact of behavioral factors on life expectancy in numbers.

Writing in a recent edition of the journal Preventive Medicine, lead author Eva Martin-Diener of the university’s Institute of Social and Preventive Medicine (ISPM) and her colleagues examined the impact of the World Health Organization’s four behavioral risk factors for non-communicable diseases on a person’s life expectancy.

Each of the factors (smoking, alcohol, poor diet and inactivity) was analyzed both individually and in combination, allowing the researchers for the first time to depict the consequences of an unhealthy lifestyle in numbers. The study authors looked at over 16,000 individuals who participated in two Swiss population studies between 1977 and 1993, assessing smoking status, alcohol consumption, physical activity level and diet at baseline.

They discovered that a person who smokes, drinks a lot, tends to be physically inactive and has an unhealthy diet has a mortality rate that, in epidemiological terms, is 2.5 times higher than that of a person who is health-conscious, the authors said. In fact, a healthy lifestyle “can help you stay ten years younger,” Martin-Diener said in a statement.


She and her fellow investigators used information from the Swiss National Cohort (SNC) for their research, and focused their efforts on cardiovascular diseases and cancer, which are the primary causes of death in Switzerland. They then correlated data on tobacco use, fruit consumption, exercise and alcohol consumption from 16,721 individuals between the ages of 16 and 90, tracking corresponding deaths through the year 2008.

The effect of the four types of behavior was still visible when biological risk factors such as weight and blood pressure were accounted for, the study authors noted. Hazard ratios for all four risk factors combined were 2.41 (95% CI 1.99–2.93) in men and 2.46 (95% CI 1.88–3.22) in women.

Furthermore, for 65-year-old men, the probability for surviving the next 10 years was 86 percent for those with no other risk factors and 67 percent for those with all four. For women, the numbers were 90 percent and 77 percent, respectively. In 75-year-olds, the probabilities were 67 percent and 35 percent in men, and 74 percent and 47 percent in women.
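As a rough cross-check, these survival figures are consistent with the hazard ratios above under a simple proportional-hazards assumption, where survival with the risk factors is the baseline survival raised to the power of the hazard ratio (S1 = S0^HR). A short Python sketch using only the numbers quoted in this article; this is an illustration, not the authors’ actual method:

```python
# Rough proportional-hazards cross-check: S1 = S0 ** HR, where S0 is the
# 10-year survival probability with no risk factors and HR is the reported
# hazard ratio for all four factors combined. Illustrative only.

def survival_with_all_factors(s0, hazard_ratio):
    return s0 ** hazard_ratio

cases = [
    ("65-year-old men",   0.86, 2.41, "67% reported"),
    ("65-year-old women", 0.90, 2.46, "77% reported"),
    ("75-year-old men",   0.67, 2.41, "35% reported"),
    ("75-year-old women", 0.74, 2.46, "47% reported"),
]

for label, s0, hr, reported in cases:
    print(f"{label}: {survival_with_all_factors(s0, hr):.0%} vs {reported}")
# Prints roughly 70%, 77%, 38% and 48% -- close to the published figures.
```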

“The effect of each individual factor on life expectancy is relatively high,” said Martin-Diener. Of the four, however, smoking appeared to be the most harmful. Compared with non-smokers, tobacco users had a 57 percent increased risk of premature death, while each of the other three factors (unhealthy diet, lack of exercise and alcohol abuse) was associated with an elevated mortality risk of approximately 15 percent.

“We were very surprised by the 2.5 fold higher risk when all four risk factors are combined,” added fellow investigator and ISPM colleague Brian Martin. The research was financially sponsored by the Swiss Heart Foundation and the Swiss Cancer League.

Image 2 (below): Charts of the probability of surviving the next 10 years for 65- and 75-year-olds with differing health behaviour. Legend: e.g., box top right: a 75-year-old man who at the start of the study smoked, drank a lot and hardly ate any fruit has a 35 percent probability of surviving the next ten years; box bottom left: a 65-year-old woman with positive health behaviour in all four areas has a 90 percent probability of still being alive in ten years’ time. Credit: Preventive Medicine/UZH

Virulent Fungus Discovered In Recalled Chobani Yogurt Products

Brett Smith for redOrbit.com – Your Universe Online
Chobani yogurt recalled in September 2013 was found to contain the most virulent form of a fungus called Mucor circinelloides, according to a new research paper published in the journal mBio.
The fungus is associated with infections in immune-compromised people and has been known to survive in mice, which spread the fungus via feces for as long as 10 days after ingestion.
“Typically when people think about food-borne pathogens, they think about viruses or bacteria, they don’t think of fungi,” said study author Soo Chan Lee, a senior research associate at Duke University. “Our research suggests it may be time to think about fungal pathogens and develop good regulations to test them in manufacturing facilities.”
The study was based on a single case, a Texas couple who became ill after eating a casserole made with the recalled product. The couple said the casserole had been cooked for over 30 minutes at 350 degrees F. They both said they had diarrhea and the man said he also vomited after eating the casserole again.
The study tested the couple’s tub of plain yogurt, which had been stored in their fridge at 37 degrees F yet had half-inch to one-inch colonies growing on its surface. Using DNA barcoding technology, the team was able to identify the exact subspecies – the most virulent strain, Mucor circinelloides forma circinelloides (Mcc).
“There are three closely related species, and one of them we typically find infecting humans,” said study author Joseph Heitman, a professor of molecular genetics and microbiology at Duke. “There was some chance that this yogurt isolate would be the human pathogenic form, and we found that it was.”
To assess the virulence of the strain, the team injected spores from the fungus into mice via the tail vein and discovered that it produced a deadly infection in four out of five diabetic mice, chosen as a model for immune-compromised humans. However, when mice were given spores by mouth, serious weight loss resulted in only one out of five cases.
The researchers also found that the fungus survived passage through the gastrointestinal tract of the mice, suggesting it could opportunistically colonize an immune-compromised host.
“We still don’t know if the fungus is infecting the gastrointestinal tract, or if it is producing some sort of toxin that makes people sick,” Heitman said.
The scientists sequenced the complete genome of the fungus to check for clusters of genes that could create toxic molecules called secondary metabolites. Although they uncovered a number of candidates, the team could not determine whether the fungus makes any toxins that would explain the symptoms experienced by people who consumed the contaminated yogurt.
The team also tested 16 other samples of Chobani yogurt and did not find Mucor circinelloides in any of them. The scientists asked the FDA for more data on its analyses of the recalled product or access to the samples the agency had acquired. The agency denied their requests.

Sleep Studies Find Contradictory Results Pertaining To Lunar Cycles

Alan McStravick for redOrbit.com – Your Universe online

Ask any emergency room nurse or doctor and they will tell you that the effect of a full moon on human behavior cannot be denied. Besides anecdotal accounts of general human lunacy when the moon looms large in the night sky, there have been murmurs for some time in the scientific community that the Earth’s natural satellite also affects our sleep. All studies thus far have been inconclusive.

As reported in Time Magazine and other outlets last year, researchers from the University of Basel, the Swiss Federal Institute of Technology and the Switzerland Center for Sleep Medicine conducted a study, begun in 2000, that followed 33 volunteers over three years of intermittent sleep-lab sessions. Variables collected in the study included brain wave activity during sleep, participants’ melatonin levels, the time it took each individual to fall asleep, and how long they remained in deep sleep. The original purpose of the study was simply to learn more about general human sleep patterns as they pertain to age and gender.

What made last year’s revelation interesting was that the Swiss researchers revisited the collected data to examine whether the moon played a role in human sleep patterns. Their new findings showed that people averaged as much as 20 minutes less sleep on nights with a full moon, that falling asleep took five minutes longer than normal, and that once asleep people tended to enjoy 30 minutes more REM sleep. The Swiss study’s new results were published in the journal Current Biology.

In that journal paper, the researchers rather remarkably explained, “The aim of exploring the influence of different lunar phases on sleep regulation was never a priori [sic] hypothesized. We just thought of it after a drink in a local bar one evening at full moon.”

That interesting admission led researchers from the University of Gothenburg’s Sahlgrenska Academy to build on the original work reported last year. It must be pointed out that, with the Swiss exception, many studies have previously been undertaken with the goal of either proving or disproving any correlation between sleep and the lunar cycle. One large study at the Max Planck Institute analyzed data from more than 1,000 people over 26,000 nights of sleep, only to arrive at the conclusion that there was no correlation.

This is where Michael Smith and his fellow researchers at Gothenburg step in. The team analyzed data collected in a previous study that focused on a cohort of 47 healthy 18- to 30-year-olds. Their findings, also published in the journal Current Biology, support the theory that a correlation between the lunar cycle and sleep patterns does, in fact, exist. Their paper is entitled ‘Human sleep and cortical reactivity are influenced by lunar phase.’

“Our study generated findings similar to the Swiss project,” Smith says. “Subjects slept an average of 20 minutes less and had more trouble falling asleep during the full moon phase. However, the greatest impact on REM sleep appeared to be during the new moon.”

Much like the Swiss study, Smith and his team began their study looking not at the moon but in a completely different direction. “The purpose of our original study was to examine the way that noise disturbs sleep,” Smith noted. “Re-analysis of our data showed that sensitivity, measured as reactivity of the cerebral cortex, is greatest during the full moon.”

As their findings explain, greater cortical reactivity was found in both men and women. However, only men had a discernible amount of trouble falling asleep when the moon was full. Additionally, men slept less than the women studied. Skeptics in the scientific community are quick to note that the study might be fraught with error due to issues of both age and gender differences. They also point out that subtler factors, such as the physical condition of individual participants and exposure to light during the day, may not have been properly accounted for.

Fully aware of the points made by skeptics in the community, Smith believes the results of his study warrant more intensive future study.

“The rooms in our sleep laboratories do not have any windows,” he explains. “So the effect we found cannot be attributable to increased nocturnal light during full moon. Thus, there may be a built-in biological clock that is affected by the moon, similar to the one that regulates the circadian rhythm.”

“But all this is mere speculation – additionally, more highly controlled studies that target these mechanisms are needed before more definitive conclusions can be drawn,” Smith concludes.


Be Prepared! What To Do Before, During And After A Hurricane

Rayshell Clapper for redOrbit.com – Your Universe Online

From June 1 to November 30 of each year, the Atlantic hurricane season flexes its muscles. In an effort to help keep people alive and safe, the Centers for Disease Control and Prevention (CDC) keeps a Hurricane Preparedness website that is full of great information to help with hurricane health and safety. The CDC provides important tips to help before, during, and after a hurricane.

BEFORE

The first tip is to prepare for a hurricane. If you live in a place that could be hit, it is best to prepare now rather than wait for a hurricane to be imminent. Before a hurricane, the CDC identifies two steps: make a plan and get supplies. In making a plan, the CDC gives these readiness suggestions:

• Write down emergency phone numbers and keep them near every phone in the house, pin them to the refrigerator, and program them into cell phones.
• Buy a fire extinguisher and make sure every family member knows where it is and how to use it.
• Find out where the nearest shelter is located. Don’t forget to learn different routes to get there.
• Ensure everyone knows the sound of the warning sirens in the area and what to do when they go off.
• Stock up on emergency supplies.

The second step in preparing for a hurricane is getting supplies. According to another CDC hurricane preparedness webpage, these supplies include the following:

Food and Medicine

• Clean containers for water
• At least 5 gallons of water per person (which should be enough to last 3 to 5 days)
• A 3 to 5 day supply of food that doesn’t go bad (like canned food)
• Baby food or formula
• Prescription medicines

Safety Items

• First aid kit and instructions
• Fire extinguisher
• Battery-powered radio
• Flashlights
• Extra batteries
• Sleeping bags or extra blankets
• Supplies to make drinking water safe (like iodine tablets or chlorine bleach)

Personal Care Products

• Hand sanitizer
• Wet cleaning cloths (like baby wipes) in case you don’t have clean water
• Soap
• Toothpaste
• Tampons and pads
• Diapers

Make sure your supplies are stored together in a place that’s easy to reach.

Make an Emergency Car Kit

In case you need to leave quickly during a hurricane, always keep an emergency kit in your car, too. Make sure you include:

• Food that doesn’t go bad (like canned food)
• Flares
• Jumper cables (sometimes called booster cables)
• Maps
• Tools, like a roadside emergency kit
• First aid kit and instructions
• A fire extinguisher
• Sleeping bags
• Flashlight and extra batteries


DURING

Once you have prepared for a hurricane, you need to be ready to evacuate and also know what to do if ordered not to evacuate. If ordered to evacuate, take only what you really need, such as cell phones, medications, identification, and cash. A good idea is to pack a small box of these essentials before the evacuation notice comes, so it is easier to grab everything you will need; if there is something sentimental you want to take, keep it readily available as well. If you have time before evacuating, try to turn off the gas, electricity, and water, and unplug all appliances, but only if there is enough time. When evacuating, it is also important to have a car emergency kit. Always follow the roads that emergency workers recommend in order to stay safe. Do not veer off course even if there is traffic, and change routes only if instructed.

For those ordered not to evacuate, the CDC gives several tips to help people stay safe.

1. Keep listening to the radio, TV, or internet for updates.
2. Stay inside until an official message comes through that the hurricane is over.
3. Avoid windows.
4. Be careful.
5. Be ready to leave.

AFTER

After a hurricane, the CDC identifies several tips for safety. First of all, be safe inside: never use electrical devices that have gotten wet, use flashlights instead of candles, and make sure any lit candles are watched and kept away from anything that can catch fire. If unusual noises start, leave your home or building immediately, because the structure may be about to collapse. Additionally, the CDC says to prevent carbon monoxide poisoning by not using gas or coal-burning equipment inside, not running vehicles in a garage, not using a gas oven for heat, and leaving the house if the carbon monoxide detector starts beeping.

Beyond safety inside the house after a hurricane, you need to be safe outside as well. The first tip is to keep away from floodwater. Floodwater can be deeper than it looks, so even if driving, you should always go around it. Floodwater often carries germs as well, so be sure to wash up if you get wet. Use soap and water, alcohol-based wipes, or sanitizer. Beyond floodwaters, you should avoid power lines and dangerous materials. Finally, be sure to protect yourself from animals and pests. Wild or stray animals can be dangerous, especially after a storm. Any dead animal should be reported. Floods can also bring mosquitoes that carry disease.

Once the storm is over and the shock has worn off, the final tip from the CDC is to clean up the home if needed. It is important to wear safety gear both for cleaning up debris and for cleaning up mold and bacteria. Disinfect everything, especially toys for children. Finally, pace yourself during clean up because it might be a big job.

Hurricane season happens every year, so it is important to be prepared. Hurricanes are dangerous, so the more prepared you are, the more likely you will stay safe before, during, and after the hurricane.

STEM Education Initiative Website Launched By AAU

Association of American Universities

Press invited to reception, poster session on AAU undergraduate STEM teaching initiative July 21

The Association of American Universities (AAU), an association of leading public and private research universities, today launched the AAU STEM Initiative Hub, a website that will both support and widen the impact of the association’s initiative to improve the quality of undergraduate teaching and learning in science, technology, engineering and mathematics (STEM) fields at its member institutions.

AAU has partnered with HUBzero, a web-based platform for scientific collaboration developed and managed by Purdue University, to create the AAU STEM Initiative Hub. The new website provides an interactive tool for AAU universities to showcase innovative institutional efforts they have undertaken to implement key elements of the Framework for Systemic Change in Undergraduate STEM Teaching and Learning, such as encouraging more interactive teaching practices and influencing departmental cultures to support faculty members who want to improve the quality of their teaching.

The new website will make the university examples accessible not only to AAU universities but also to non-member universities, the broader higher education community, and others engaged in STEM educational transformation, as well as the general public.

“As institutions take steps to improve their use of evidence-based teaching practices, AAU hopes these examples will serve as a resource for all colleges and universities working to improve undergraduate teaching and learning in STEM,” said AAU President Hunter Rawlings.

The Hub will also profile efforts being advanced by AAU’s eight STEM Initiative Project Sites and provide a secure space for AAU STEM Network members to share information about successful strategies and challenges they are facing in improving STEM education. The Hub will help cultivate relationships among those leading reform efforts at AAU universities, providing a forum for ongoing interaction and exchange of information and ideas.

“Our goal is to support and link AAU institutions grappling with similar challenges and barriers in reforming and improving STEM teaching and learning for undergraduate students,” said Rawlings.

To further advance campus-based dialogues on systemic change in undergraduate STEM education, AAU will host an in-person workshop for the AAU STEM Network on July 21-23, 2014. All AAU member universities have been invited to participate in this conference. For a reception on the opening evening, July 21, each participating campus has been asked to present a poster that showcases its own undergraduate STEM education reform efforts relevant to the AAU initiative. The goal of the poster session at the reception, which will also be open to policymakers and the news media, is to provide an opportunity for attendees to learn about work occurring at major research universities to improve the quality of undergraduate teaching and learning in STEM fields.

Researchers Find Missing Piece Of How Birds Sense Light

Institute of Transformative Bio-Molecules (ITbM), Nagoya University

Professor Takashi Yoshimura and colleagues at the Institute of Transformative Bio-Molecules (WPI-ITbM) of Nagoya University have found the missing piece in how birds sense light by identifying a deep brain photoreceptor in Japanese quails that directly responds to light and controls seasonal breeding activity. Although it has been known for over 100 years that vertebrates other than mammals detect light deep inside their brains, the true nature of the key photoreceptor has remained a mystery until now. The study led by Professor Yoshimura reveals that nerve cells deep inside the brains of quails, called cerebrospinal fluid (CSF)-contacting neurons, respond directly to light. His studies also showed that these neurons are involved in detecting the arrival of spring and thus regulate breeding activity in birds. The study, published online on July 7, 2014 in Current Biology, is expected to contribute to improved animal production as well as a deeper understanding of the evolution of eyes and photoreceptors.

Many organisms apart from those living in the tropics use changes in the length of day (photoperiod) as their calendar to adapt to seasonal changes in the environment. In order to adapt, animals change their physiology and behavior, including growth, metabolism, immune function and reproductive activity. “The mechanism of seasonal reproduction, which is regulated by photoperiod, has been the focus of extensive studies,” says Professor Yoshimura, who led the study. “Small mammals and birds tend to breed during the spring and summer, when the climate is warm and there is sufficient food to feed their young offspring,” he continues. In order to breed during this particular season, the animals are actually sensing changes in the seasons based on changes in day length. “We chose quails as our targets, as they show rapid and robust photoperiodic responses. They are in the same pheasant family as roosters and exhibit similar characteristics. It is also worth noting that Toyohashi, near Nagoya, is the number one producer of quails in Japan,” explains Professor Yoshimura. The reproductive organs of quails remain small throughout the year and only develop during the short breeding season, growing to more than 100 times their usual size in just two weeks.

In most mammals, including humans, eyes are the exclusive photoreceptor organs. Rhodopsin and rhodopsin-family proteins in our eyes detect light, and without our eyes we are unable to detect it. On the other hand, vertebrates apart from mammals receive light directly inside their brains and sense the changes in day length. Birds, for example, are therefore able to detect light even when their eyes are blindfolded. Although this fact has been known for many years, the photoreceptor that undertakes this role had not been identified. “We had already revealed in previous studies reported in 2010 (PNAS) that a photoreceptive protein, Opsin-5, exists in the quail’s hypothalamus,” says Professor Yoshimura. This Opsin-5 protein was expressed in the CSF-contacting neurons, which protrude towards the third ventricle of the brain. “However, there was no direct evidence to show that the CSF-contacting neurons were detecting light directly, and we decided to look into this,” says Professor Yoshimura.

Yoshimura’s group used the patch-clamp technique on brain slices to investigate the light responses (action potentials) of the CSF-contacting neurons. They found that the cells were activated upon exposure to light. “Even when the activities of neurotransmitters were inhibited, the CSF-contacting neurons’ response to light did not diminish, suggesting that they were directly responding to the light,” says Professor Yoshimura excitedly. In addition, when the RNA interference method was used to inhibit the activity of the Opsin-5 protein expressed in the CSF-contacting neurons, the secretion of thyroid-stimulating hormone from the pars tuberalis of the pituitary gland was inhibited. Thyroid-stimulating hormone, the so-called “spring calling hormone,” stimulates another hormone that triggers spring breeding in birds. “We have been able to show that the CSF-contacting neurons directly respond to light and are the key photoreceptors that control breeding activity in animals, which is what many biologists have been looking for over 100 years,” elaborates Professor Yoshimura.

There have been many theories on the role of CSF-contacting neurons in response to light. “Our studies have revealed that these neurons are actually the photoreceptors working deep inside the bird’s brain. As eyes are generated as a protrusion of the third ventricle, CSF-contacting neurons expressing Opsin-5 can be considered an ancestral organ that shares the same origin as the visual cells of the eyes. Opsin-5 also exists in humans, and we believe that this research will contribute to learning how animals regulate their biological clocks and to finding effective bio-molecules that can control the sensing of seasons,” says Professor Yoshimura. Professor Yoshimura’s quest to clarify how animals measure the length of time continues.

Bee ‘Shouts’ Warn Intruders That A Food Source Will Be Defended

[ Watch the Video: Stingless Bees Fight Over Food Source ]

April Flowers for redOrbit.com – Your Universe Online

If you were foraging for food in a highly competitive environment and you found a very lucrative source, how would you communicate this prize to your teammates without giving it away to your competitors? This is the situation bees find themselves in quite often.

Scientists believe that many animals, faced with eavesdroppers, have developed “whispers” to avoid revealing the location or quality of their resources. In Brazil, however, some species of bees have learned to “shout.” These shouts warn would-be interlopers that a prime source of food will be fiercely defended. The UC San Diego-led study, published in Current Biology, reveals that this bold and risky communication strategy is remarkably successful.

“It’s a signal with honest aspects and the possibility of lies,” explains James Nieh, a professor of biology at UC San Diego. “It tells nestmates where to find good food and hints at a larger occupying force.”

Nieh collaborated with PhD student Elinor Lichtenberg from his laboratory, who is currently a postdoctoral researcher at Washington State University.

According to Lichtenberg, these counterintuitive bee shouts indicate that eavesdroppers are able to alter the evolution of animal signals in previously unthought-of ways.

“Our study provides a new way of looking at how eavesdroppers affect the evolution of animal communication signals,” she adds. “Until now, it was thought that eavesdroppers select against conspicuous signals, for example by more easily finding and eating prey that sings loudly. But our results show that eavesdroppers can help select for the same conspicuous signals that are easiest for intended recipients to detect and understand.”

Nieh’s research has focused on the evolution of communication strategies among bees. He says that “eavesdropping is part of the information web, the signals and cues that surround animals and play a key role in shaping ecosystems.”

In the case of bees and other pollinators, he says, “a network of signals and cues shapes pollination, informing animals about where and when food is available. Researchers have in general thought about eavesdropping as a force that makes signals less conspicuous, leading to the evolution of ‘whispers’ to counter spying. However, we show that eavesdropping can also lead to ‘shouts.’ In this stingless bee system, with aggressive colonies jockeying for limited resources, more conspicuous food-recruitment signals indicate a higher likelihood that a resource will be harder to wrest away.”

Stingless bees — including two species from the genus Trigona that are able to entice their nestmates to food sources with chemically distinct pheromones — were the focus of the new study. These species all compete with one another for similar food sources. For example, Trigona spinipes foragers will mark new food sources, which are then detected by Trigona hyalinata spies. The T. hyalinata bees will displace the T. spinipes if they can recruit enough nestmates.

Lichtenberg developed a controlled field study that allowed her to observe the eavesdropping behavior. She found that sites frequently visited by one species, signaled by heavier pheromone marking, were avoided by the eavesdropping species, while sites with weaker pheromone marking attracted it. By recruiting more of their nestmates or engaging in a battle with the previous claimants, the eavesdroppers could take over the heavily visited sites, but the risks and energy costs involved seem to make such takeovers not worth the trouble.

The research team used economic models to mimic the eavesdropping bees’ decision-making. They ran three model scenarios: T. hyalinata eavesdropping on T. spinipes; T. spinipes eavesdropping on T. hyalinata; and the non-aggressive Melipona rufiventris eavesdropping on T. spinipes. The team found that all three scenarios’ outcomes matched the eavesdropping behavior measured in this study and in previous work by the team.
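The article does not reproduce those models, but the underlying cost-benefit logic can be sketched as a toy expected-value calculation. The function and every number below are invented for illustration; the team’s actual economic models were fitted to field data:

```python
# Toy expected-value sketch of an eavesdropper's takeover decision,
# illustrating the cost-benefit logic described above. All quantities
# are invented; the study's actual economic models were fitted to data.

def expected_net_gain(resource_value, p_win, recruit_cost, fight_cost):
    """Expected payoff of attempting a takeover versus walking away (0)."""
    return p_win * resource_value - recruit_cost - (1 - p_win) * fight_cost

# Strongly marked site: better defended, so a lower chance of winning
print(expected_net_gain(resource_value=10, p_win=0.2,
                        recruit_cost=3, fight_cost=5))   # -5.0 -> avoid
# Weakly marked site: easier to take over
print(expected_net_gain(resource_value=10, p_win=0.7,
                        recruit_cost=3, fight_cost=5))   # 2.5 -> attempt
```

Under this toy logic, heavier pheromone marking lowers the expected payoff of a takeover, matching the avoidance behavior Lichtenberg observed.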

“Assembling such a group in the nest after having found a food source through eavesdropping uses time and energy the eavesdropper could otherwise spend looking for an unoccupied food source,” explains Lichtenberg. “If the eavesdropper brings too small a group to an occupied food source and cannot win access to it, she and the bees accompanying her have essentially wasted energy. For attacks between colonies of the same species, there is also a risk that the conflict will escalate to physical interactions in which large numbers of bees may die.”

“Our study is one of the first to investigate what drives the behavior of eavesdroppers collecting information from competitors within the same trophic level, which use the same food resources as the eavesdropper,” she adds. “Previous eavesdropping research has mainly focused on individuals seeking mates, predators looking for prey or prey trying to avoid being eaten. In those cases, eavesdroppers’ expected behavior is clear. This is not true for eavesdropping on competitors.”

Lichtenberg explains that the study findings are not just relevant to the evolution of communication strategies in the animal kingdom, but also suggest how such strategies might affect the ecology of plant communities.

“Such strategies affect not only the individuals directly involved, but also broader ecological interactions between the food-gatherers and their food,” Lichtenberg says. “This is particularly important for animals such as the bees I studied, because their movements determine plant pollination.”


Expectant Women Unhappy With Medical Checkups Often Turn To Internet For Pregnancy Advice

April Flowers for redOrbit.com – Your Universe Online

More people are turning to Internet sources for medical advice and information, and that includes pregnant women. A research team from Penn State found that pregnant women who are unhappy with their prenatal care often turn to online sites with their medical questions. The study findings were published in a recent issue of the Journal of Medical Internet Research.

“We found that first-time moms were upset that their first prenatal visit did not occur until eight weeks into pregnancy,” said Jennifer L. Kraschnewski, assistant professor of medicine and public health sciences, Penn State College of Medicine. “These women reported using Google and other search engines because they had a lot of questions at the beginning of pregnancy, before their first doctor’s appointment.”

The researchers found that many women, even following their first visit to the obstetrician, returned to search engines and social media to find answers to their questions. These women felt like the literature provided by their doctor’s office was insufficient. The team reports that the structure of prenatal care in the US has changed very little in the last 100 years, despite the rapid evolution of technology.

The original focus of the study was to gather information for the development of a smartphone app for women to use during pregnancy. The discovery that a majority of women were dissatisfied with the structure of their prenatal care was an incidental outcome of that research.

Four focus groups, made up of a total of 17 pregnant women over the age of 18 who owned smartphones, were conducted by the research team. Most of the participants agreed that they turned to technology to fill their knowledge gaps about pregnancy because they were dissatisfied with the structure of prenatal visits—saying the visits were not responsive to their individual needs. The women were not satisfied with the questionable accuracy of the online information, either.

The participants considered the information disseminated by doctors’ offices outdated. They would prefer to receive information in formats other than pamphlets, flyers and books; instead, they would rather watch videos and use social media, pregnancy-tracking apps and websites.

“This research is important because we don’t have a very good handle on what tools pregnant women are using and how they engage with technology,” said Kraschnewski, also an affiliate of the Penn State Institute for Diabetes and Obesity. “We have found that there is a real disconnect between what we’re providing in the office and what the patient wants.”

Kraschnewski stresses that regulation of medical information on the Internet is nearly non-existent. This could become problematic by alarming patients unnecessarily. According to a 2008 study, less than four percent of the millions of websites that surface when searching for common pregnancy terms were created or sponsored by doctors.

“Moving forward, in providing medical care we need to figure out how we can provide valid information to patients,” said Kraschnewski. “We need to find sound resources on the Internet or develop our own sources.”


ATM Skimmers Are Becoming Virtually Invisible

Peter Suciu for redOrbit.com – Your Universe Online

The automated teller machine (ATM) is colloquially known as the “money machine” – and for hackers this is increasingly true thanks to account-stealing skimmers, which security researchers warn are getting smaller and thinner. Krebs On Security reported on Monday that, much like other electronic gadgets, ATM skimmers are shrinking while gaining extended battery life.

The European ATM Security Team (EAST) issued a new report on these devices, warning that a new form of “mini-skimmer” has made its way to Europe. These skimmers are so small that they can fit inside a card slot, making them difficult to detect. Moreover, they can be used in conjunction with a tiny camera that gathers the personal identification number (PIN). Such cameras are increasingly disguised to resemble part of the ATM fascia or otherwise blend in and look like part of the machine.

This problem likely won’t go away – at least until there is a major upgrade to all ATMs.

“ATM skimmers are still a problem in Europe, even though virtually all cash machines there only accept cards that include so-called ‘chip & PIN’ technology,” Brian Krebs wrote on his Krebs on Security blog. “Chip & PIN, often called EMV (short for Europay, MasterCard and Visa), is designed to make cards far more expensive and complicated for thieves to duplicate.

“Unfortunately, the United States is the last of the G-20 nations that has yet to transition to chip & PIN, which means most ATM cards issued in Europe have a magnetic stripe on them for backwards compatibility when customers travel to this country,” Krebs added. “Naturally, ATM hackers in Europe will ship the stolen card data over to thieves here in the U.S., who then can encode the stolen card data onto fresh (chipless) cards and pull cash out of the machines here and in Latin America.”

“In countries where the ATM EMV rollout has been completed most losses have migrated away from Europe and are mainly seen in the USA, Asia-Pacific, and Latin America,” the EAST report noted. “From the perspective of European card issuers the Asia-Pacific region seems to be eclipsing Latin America for such losses.”

There is some good news: some former ATM skimmer designers are now working to fight the criminal technology. Last year a convicted Romanian hacker named Valentin Bonata, who is serving a five-year sentence, developed a card reader that could prevent skimming. Bonata’s technology involves a rotating card reader that he claims would prevent skimmers from being able to lock on to the magnetic data strip.

However, EAST and Krebs noted there is also a simple, low-tech method of protecting yourself from an ATM skimmer: covering the PIN pad while entering the digits. Krebs has recommended this as a handy way to foil skimmer scams, but found that most users don’t take this precaution even when using a walk-up ATM. Beyond fears of skimmers, Krebs added, “It’s a good idea to visit only ATMs that are in well-lit and public areas, and to be aware of your surroundings as you approach the cash machine. If you visit a cash machine that looks strange, tampered with, or out of place, then try to find another ATM.”


Important Clue To Cancer Provided By DNA Origami Nano-tool

Karolinska Institutet
Researchers at Karolinska Institutet in Sweden have headed a study that has provided new knowledge about the EphA2 receptor, which is significant in several forms of cancer. This is important knowledge in itself – but just as important is how this study, which is published in the highly respected journal Nature Methods today, was conducted. The researchers used the method of DNA origami, in which a DNA molecule is shaped into a nanostructure, and used these structures to test theories about cell signalling.
It was previously known that the EphA2 receptor played a part in several forms of cancer, such as breast cancer. The ligand, i.e., the protein that communicates with the receptor, is known as an ephrin molecule. Researchers have had a hypothesis that the distance between different ligands – in this case the distance between ephrin molecules – affects the level of activity in the communicating receptor of the adjacent cells.
The Swedish researchers set out to test this hypothesis. They used DNA building blocks to form a stable rod, which was then used as a very accurate ruler for the distance between molecules.
“We use DNA as the construction material for a tool that we can experiment with”, says Björn Högberg, principal investigator at the Department of Neuroscience. “The genetic code of the DNA in these structures is less important in this case.”
The researchers attached proteins, or ephrins, to the DNA rod at various intervals, for example 40 or 100 nanometers apart. The DNA rods were then placed in a solution containing breast cancer cells. In the next step, the researchers looked at how active EphA2 was in these cancer cells.
It turned out that if the ephrin molecules were placed close together on the DNA rod, the receptor in question became more active in the cancer cells, and the cells also became less invasive with respect to the surrounding cells, which could be an indication that they became less prone to metastasis. This was true even though the amount of protein was the same throughout the experiments, i.e., the number of attached molecules remained the same.
“For the very first time, we have been able to prove this hypothesis: the activity of EphA2 is influenced by how closely spaced the ligands are on the surrounding cells,” says Björn Högberg. “This is an important result in itself, but the point of our study is also that we have developed a method for examining how cells react to nearby cells in a controlled environment, using our custom DNA nano-calipers.”
The researchers describe the cell communication as a form of Braille, where the cells somehow sense the protein patterns of nearby cells, and where the important thing is not only the amount of proteins, but to a great extent the distance between them as well. This study found that a cluster of proteins would communicate more actively than sparsely spaced proteins, even if the concentration was the same.
“This is a model that can help us learn more about the importance of the spatial organization of proteins in the cell membrane to how cells communicate with each other, something that will hopefully pave the way for a brand new approach to pharmaceuticals in the long term,” says Ana Teixeira, a principal investigator at the Department of Cell and Molecular Biology. “Today, the function of the pharmaceuticals is often to completely block proteins or receptors, but it is possible that we should rather look at the proteins in their biological context, where the clustering and placement of various proteins are relevant factors for the effect of a drug. This is probably an area where there is important knowledge to obtain, and this is a way of doing it.”
The study was financed by the Swedish Research Council, the Strategic Research Program in Stem Cell Research and Regenerative Medicine (StratRegen) at Karolinska Institutet, Vinnova and Carl Bennet AB.

New Tool Tells You How Much Alcohol Is In Your Favorite Cocktail

Alan McStravick for redOrbit.com – Your Universe online

Last month, redOrbit reported on the breakdown of alcohol consumption on a state-by-state and city-by-city level and discovered some very interesting and unexpected statistics. That study was based on data collected from the personal breathalyzer product put out by San Francisco-based BACTrack.

That study was only possible because each participant, individuals who had purchased the BACTrack product and linked it to their geo-enabled cellphone, had already consumed alcoholic beverages. A new alcohol calculation tool has recently been unveiled by the National Institute of Alcohol Abuse and Alcoholism (NIAAA), part of the National Institutes of Health (NIH), which aims to educate drinkers on the alcoholic content and strengths of their preferred beverages before imbibing.

India’s Zee News, reporting on the new effort by the NIAAA, pointed out the calculator is “supposed to be used as an educational guide only, and consumers should be aware that bars and restaurants will have slightly different recipes for making the cocktails that the calculator presents.” This is important to keep in mind for the frequent or even casual drinker who repeatedly visits the same watering hole. Chances are, if your bartender recognizes you, they pour with a heavier hand.
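The arithmetic behind such a calculator is straightforward: sum the pure alcohol contributed by each ingredient (volume times alcohol by volume) and divide by the 0.6 fluid ounces of ethanol (about 14 grams) that define a US standard drink. Here is a minimal Python sketch; the recipe, volumes and ABVs are illustrative, not the NIAAA’s own figures:

```python
# Minimal sketch of cocktail-strength arithmetic, assuming the US
# standard drink of 0.6 fl oz (about 14 g) of pure ethanol. The recipe
# below is illustrative, not one of NIAAA's calculator entries.

STANDARD_DRINK_OZ = 0.6  # fl oz of pure ethanol per US standard drink

def standard_drinks(ingredients):
    """ingredients: list of (volume_fl_oz, abv_percent) tuples."""
    pure_alcohol_oz = sum(vol * abv / 100 for vol, abv in ingredients)
    return pure_alcohol_oz / STANDARD_DRINK_OZ

# Example: a margarita-style cocktail
margarita = [(1.5, 40),   # tequila
             (1.0, 30),   # orange liqueur
             (0.5, 0)]    # lime juice
print(f"{standard_drinks(margarita):.1f} standard drinks")  # 1.5
```

By this arithmetic, a single generously poured cocktail can easily exceed the one-standard-drink-per-hour pacing described in the NIAAA guidance that follows.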

According to the NIAAA’s own site, everyone going out for happy hour or a night on the town is urged to practice restraint, limiting themselves to no more than one standard drink per hour. “Note that it takes about two hours for the adult body to completely break down a single drink,” the site cautions. “Stay within low-risk levels: For men, no more than four standard drinks on any day (and 14 per week), and for women, no more than three on any day (and 7 per week).”

Why the disparity between consumption models for men and women? According to drinksmarter.org, the physical differences between men and women are significant enough that alcohol affects women far faster than it does men. For instance, on average, healthy women have approximately 10 percent more body fat than men. Alcohol cannot be absorbed by body fat, which means its effects are considerably more concentrated as they surge through a typically smaller system.

While arguments could certainly be made regarding the deleterious health effects of alcohol consumption, such as liver damage, weight gain, long-term brain impairment and depression, among many others, perhaps one very important benefit of better understanding the judgment-impairing beverage you are about to consume would be a decrease in the number of drinkers who choose to drive drunk or even mildly impaired. According to MADD.org, statistics for the most recent period available showed a national increase in drunk driving deaths on American roads and highways of 4.6 percent between 2011 and 2012.

The NIAAA developed their alcohol content calculation tool, I’m certain, with the best of intentions. However, after toying with it for just a few moments, it felt more like a novelty than a tool that will effectively educate drinkers to modify their habits and moderate their overall intake. Perhaps information gleaned from the tool will help those who were previously unaware of just how much they were imbibing.


Bitcoin Raid Nets French Police $272,000 In Illegal Virtual Money

Peter Suciu for redOrbit.com – Your Universe Online

On Monday it was announced that French law enforcement had dismantled an illegal Bitcoin exchange and seized 388 virtual currency units, according to the Reuters news agency. The seized Bitcoins had a reported value of 200,000 euros, or $272,700.

This was reportedly the first such law enforcement operation conducted in Europe to investigate illegal selling of the virtual currency.

Two people in the Riviera coastal cities of Cannes and Nice were placed under formal investigation on Friday. They were both detained on suspicion of operating a website that illegally sold and even lent Bitcoins to users. Yahoo News reported that a home believed to belong to one of the suspects was raided, and authorities found a portfolio of Bitcoins valued at 9,000 euros each. In addition, credit cards and computer hardware were also reportedly seized as part of the investigation.

“It’s the first time in Europe that a judicial action has resulted in the closure of an illegal exchange for virtual currency,” Olivier Caracotch, prosecutor in the southwestern town of Foix where the investigation started, told Reuters on Monday. “It’s also the first time in France that Bitcoins have been seized as part of a judicial procedure.”

Reuters reported that the police in France were tipped off to the existence of the platform by a retired policeman. He had alerted financial investigators after he had purchased Bitcoins on the site. The two suspects were also reportedly being investigated on potential charges of illegal banking, money laundering and operating an illegal gambling website.

This is far from the only controversy for the virtual currency.

The MtGox Bitcoin exchange in Japan was recently liquidated, and earlier this year Russian authorities ruled that the use of the Bitcoin currency was illegal. The Russian Prosecutor General’s Office released a statement in February that stated that the digital currency “cannot be used by citizens and legal entities.”

In December, the value of Bitcoins plummeted dramatically on news that China would no longer accept the digital currency. In February a known Bitcoin flaw also resulted in some $2.6 million being stolen from the new Silk Road site. Approximately 4,400 Bitcoins were taken from the site’s escrow account by hackers.

Yahoo News also reported that a new blog associated with the Islamic State in Iraq and Syria (ISIS) had claimed that it would use Bitcoin to further its efforts. The group, which is now engaged in a military campaign in Iraq, announced that Bitcoin would “enable jihad on a large scale” and allow the sending of “millions of dollars” to jihadists.

Yahoo News cited a blog post titled “Bitcoin and the Charity of Violent Physical Struggle,” in which the author argued that such donations would even be “untrackable” by Western governments.

There is also the ongoing issue of who is actually the creator of Bitcoin – a key fact of the virtual currency that very much remains a mystery. Researchers debated this issue in April and pointed to Nick Szabo, a blogger and “alleged” former law professor at George Washington University.

Despite the controversies, however, the currency still apparently has its supporters. According to a May report, Canada led the world in Bitcoin accessibility and security.

NUS Researchers Discover Novel Protein Complex With Potential To Combat Gastric Cancer Caused By Bacterial Infection

NUS

New study by the Cancer Science Institute of Singapore at NUS shows that the protein is produced as an immune response and requires the tumour suppressor RUNX3

A team of scientists from the Cancer Science Institute of Singapore (CSI Singapore) at the National University of Singapore (NUS) discovered that a protein named IL23A is part of our stomach’s defence against the bacterial infection that leads to gastric cancer. This finding could potentially be used to combat the deadly disease.

The research group, led by Professor Yoshiaki Ito, Senior Principal Investigator at CSI Singapore, also showed that the production of IL23A by stomach cells requires the tumour suppressor gene, RUNX3, which is frequently silenced in gastric cancer. The novel study was first published online in the leading journal Cell Reports on 4 July 2014.

Every year, some 740,000 people die from gastric cancer globally. In Singapore, stomach cancer was the fourth leading cause of cancer death among men and fifth among women from 2008 to 2012. This highlights the urgent need to understand the causes of this deadly disease. It is now known that a major trigger for the development of gastric cancer is infection by the bacterium Helicobacter pylori. A drawn-out battle against this bacterium often causes persistent inflammation of the stomach, which is a dangerous cancer-causing condition. As such, understanding and strengthening the natural defence mounted by our stomach cells is a logical and crucial step in preventing gastric cancer.

In this study, the researchers found that a protein named IL23A normally produced only by white blood cells is released by stomach cells when exposed to H. pylori. This demonstrates that IL23A is part of our stomach’s defence against bacterial infection. At the same time, they found that production of IL23A by stomach cells requires RUNX3, a tumour suppressor gene that is frequently silenced in gastric cancer. The involvement of RUNX3 appears particularly important during inflammation and infection caused by the bacterium. Their observation makes an important connection: when stomach cells lose RUNX3, they become defective in their ability to respond to infection by H. pylori, making the stomach vulnerable to this carcinogenic bacterium.

Prof Ito said, “We have been studying the frequent inactivation of RUNX3 in gastric cancer for some time. Our latest findings show that an important consequence of losing RUNX3 is a compromised immunity and this will enable us to devise better protection against pathogens.”

The study was supported by the Translational and Clinical Research Flagship Programme and the National Research Foundation of Singapore.

In the next phase of research, the team will focus on determining the function of the novel IL23A complex they discovered, as part of their overall strategy to combat H. pylori-induced gastric cancer. NUS has concurrently applied for patent protection of this work to facilitate industry involvement, so that the research can be effectively used to develop novel treatments for gastric cancer.

Obesity And Large Waist Size Are Risk Factors For COPD

Canadian Medical Association Journal

Obesity, especially excessive belly fat, is a risk factor for chronic obstructive pulmonary disease (COPD), according to an article in the Canadian Medical Association Journal (CMAJ).

Excessive belly fat and low physical activity are linked to progression of the disease in people who already have COPD, but it was previously unknown whether these modifiable factors are also linked to new cases.

A team of researchers in Germany and the United States looked at the relationship of waist and hip circumference, body mass index (BMI) and physical activity levels to new cases of COPD in a large group of men and women in the US. They examined data on 113,279 people between the ages of 50 and 70 years who did not have COPD, cancer or heart disease at the beginning of the study (1995). During the 10-year follow-up period, COPD developed in 3,648 people. People with a large waist circumference (110 cm or more in women and 118 cm or more in men) had a 72% increased risk of COPD.

“We observed a stronger positive relation with abdominal body fat than with total body fat and COPD,” writes Dr. Gundula Behrens, Department of Epidemiology and Preventive Medicine, University of Regensburg, Regensburg, Germany, with coauthors. “In particular, overweight as measured by BMI emerged as a significant predictor of increased risk of COPD only among those with a large waist circumference.”

A large waist was a robust predictor of COPD in smokers as well as in people who had never smoked.

Pollution, smoking and toxic particles in workplace dust are thought to cause COPD through chronic inflammation and impaired ability to heal injury to the lungs. “Increased local, abdominal and overall fat depots increase local and systemic inflammation, thus potentially stimulating COPD-related processes in the lung,” write the authors.

People with a large hip circumference who were physically active at least five times a week were 29% less likely to develop COPD. Exercise can reduce inflammation and oxidative stress and enhance healing.

Underweight people had a 56% increased risk of COPD. Possible reasons include malnutrition and reduced muscle mass leading to increased COPD susceptibility and progression through inflammatory processes and impaired lung repair capacity.

“Our findings suggest that next to smoking cessation and the prevention of smoking initiation, meeting guidelines for body weight, body shape and physical activity level may represent important individual and public health opportunities to decrease the risk of COPD. Physicians should encourage their patients to adhere to these guidelines as a means of preventing chronic diseases in general and possibly COPD in particular,” conclude the authors.

Babies Born To Healthy Mothers Are Similar In Size Worldwide

University of Oxford

Poor nutrition and health, not race or ethnicity, cause most of the current wide disparities in fetal growth and newborn size

Babies’ growth in the womb and their size at birth, especially their length, are strikingly similar the world over – when babies are born to healthy, well-educated and well-nourished mothers.

That’s the finding of a landmark international study, INTERGROWTH-21st, led by Oxford University researchers, which involved almost 60,000 pregnancies in eight defined urban areas in Brazil, China, India, Italy, Kenya, Oman, the UK and USA.

Worldwide there are wide disparities in the average size of babies at birth. This has significant consequences for future health, as small-for-gestational-age babies, already undernourished at birth, often face severe short- and long-term health consequences.

It has previously been suggested that ‘race’ and ‘ethnicity’ are largely responsible for differences in the size of babies born in different populations and countries. These new results show that race and ethnicity are not the primary factors. What matters more is the educational, health and nutritional status of the mothers, and care provided during pregnancy.

The researchers carried out ultrasound scans from early pregnancy to delivery to measure babies’ bone growth in the womb, using identical methods in all countries and the same ultrasound machines provided by Philips Healthcare. They also measured the length and head circumference of all babies at birth.

They have demonstrated that if mothers’ educational, health and nutritional status and care during pregnancy are equally good, babies will have equal chances of healthy growth in the womb and future good health.

The researchers report their findings in The Lancet Diabetes & Endocrinology. The study was funded by the Bill & Melinda Gates Foundation.

“Currently we are not all equal at birth. But we can be,” said the lead author Professor Jose Villar of the Nuffield Department of Obstetrics & Gynaecology, University of Oxford. “We can create a similar start for all by making sure mothers are well educated and nourished, by treating infection and by providing adequate antenatal care.

“Don’t tell us nothing can be done. Don’t say that women in some parts of the world have small children because they are predestined to do so. It’s simply not true.”

Key points

  • The study involved almost 60,000 pregnancies in eight defined urban areas in Brazil, China, India, Italy, Kenya, Oman, the UK and USA.
  • Babies’ bone growth in the womb and their length and head circumference at birth are strikingly similar the world over – when babies are born to educated, healthy and well-nourished mothers.
  • Overall, no more than 4% of the total difference in fetal growth and birth size could be attributed to differences between the eight populations in the study.
  • Improving the education, health and nutrition of mothers everywhere will boost the health of their babies throughout life within the next generation.
  • Results are in complete agreement with the previous WHO study using the same methodology from birth to 5 years of age.

In 2010, an estimated 32.4 million babies were born already undernourished in low- and middle-income countries, which represents 27% of all live births globally. This is closely associated with illness and death in infancy and childhood. Small size at birth has an impact on adult health too, with increased risks of diabetes, high blood pressure and cardiovascular disease. Smaller babies also result in substantial costs for health services and a significant economic burden on societies as a whole.

Part of the problem in starting to improve pregnancy outcomes is that fetal growth and newborn size are currently evaluated in clinics around the world using at least 100 different growth charts. In other words, there are no international standards at present for the fetus and newborn, while such standards do exist for infants and children.

“This is very confusing for doctors and mothers and makes no biological sense. How can a fetus or a newborn be judged small in one clinic or hospital and treated accordingly, only for the mother to go to another city or country and be told that her baby is growing normally?” said Professor Stephen Kennedy, University of Oxford, one of the senior authors of the paper.

The final aim of the INTERGROWTH-21st study is to construct international standards describing optimal growth of a baby in the womb and as a newborn – standards to reflect how a baby should grow when mothers have adequate health, nutrition and socioeconomic status.

The researchers adopted the same approach taken by the WHO’s Multicentre Growth Reference Study of healthy infants and children, which established international growth standards from birth to 5 years of age that are now used in more than 140 countries worldwide.

The INTERGROWTH-21st results fit perfectly with the existing WHO standards for infants. The mean length at birth of the newborns in the INTERGROWTH-21st study was 49.4 ± 1.9 cm, compared with 49.5 ± 1.9 cm in the WHO infant study.

From now on international standards can be used worldwide to make judgements on growth and size from conception to 5 years. “Just think, if your cholesterol or your blood pressure are high, they are high regardless of where you live. Why should the same not apply to growth?” said Professor Villar.

Professor Ruyan Pang, from Peking University, China, one of the study’s lead investigators, said: “The INTERGROWTH-21st results fit perfectly with the existing WHO Infant and Child Growth Standards. Having international standards of optimal growth from conception to 5 years of age that everyone in the world can use means it will now be possible to evaluate improvements in health and nutrition using the same yardstick.”

Professor Zulfiqar Bhutta, from The Aga Khan University, Karachi, Pakistan and the Hospital for Sick Children, Toronto, Canada, who is the Chair of the Steering Committee of this global research team, says: “The fact that when mothers are in good health, babies grow in the womb in very similar ways the world over is a tremendously positive message of hope for all women and their families. But there is a challenge as well. There are implications in terms of the way we think about public health: This is about the health and life chances of future citizens everywhere on the planet. All those who are responsible for health care will have to think about providing the best possible maternal and child health.”

Do Kids’ TV Shows Glorify The Consumption Of Unhealthy Food And Drinks?

redOrbit Staff & Wire Reports – Your Universe Online

Unhealthy foods and beverages are regularly portrayed in a positive light on children’s television programming in the UK, according to new research currently appearing online in the journal Archives of Disease in Childhood.

As part of their research, experts from The Children’s Ark at University Hospital Limerick, National Children’s Research Centre, the Centre for Interventions in Infection, Inflammation & Immunity at Graduate Entry Medical School, the University of Limerick, and Dalhousie University looked at how different types of food and drinks were portrayed in weekday children’s television programs aired on two UK television stations in 2010.

While legislation governing the broadcast of TV advertisements for products high in sugar and fat has been implemented in both England and Ireland, those regulations do not apply to the content of the actual programs, according to the study authors. So the investigators assessed the frequency and type of food and drink product portrayals in children’s programs aired on the commercial-free BBC and RTE TV stations.

In all, the authors recorded 1,155 food and beverage mentions in 82.5 hours of child-specific television broadcasting on those two stations, meaning such mentions accounted for 4.8 percent of the total televised material. Each cue lasted an average of 13.2 seconds, and just under 40 percent of the content was American in origin.

The authors then analyzed the mentions based on type of product, product placement, product use, motivation, outcome and characters involved. They found that sweet snacks were the most frequently mentioned type of food (13.3 percent), followed by candy/confectionery (11.4 percent). Tea and coffee was the most frequently mentioned beverage category (13.5 percent), followed by sugar-sweetened beverages (13 percent).

Nearly half of all food cues were for unhealthy foods (47.5 percent), while one-quarter of all drink mentions were for sweetened beverages, the study authors said. Nearly all of the mentions involved a major program character; 95 percent of those characters were “goodies” or heroes, and 90 percent were not overweight.

The cue was presented positively 32.6 percent of the time, negatively 19.8 percent of the time, and neutrally 47.5 percent of the time, they added. The motivation associated with each mention was social or celebratory 25.2 percent of the time, related to hunger or thirst 25 percent of the time, and health-related just two percent of the time.

The researchers also directly compared the content of the UK and Irish programs across 27.5 recorded hours of shows, and found that the BBC tended to include more food and drink cues than RTE (2.3 hours versus 45.6 minutes).

“Comparison of UK and Irish placements showed both to portray high levels of unhealthy food cues. However, placements for sugar-sweetened beverages were relatively low on both channels,” the study authors wrote.

“This study provides further evidence of the prominence of unhealthy foods in children’s programming,” they added. “These data may provide guidance for healthcare professionals, regulators and program makers in planning for a healthier portrayal of food and beverage in children’s television.”

While it is generally accepted that there is a link between the advertising of unhealthy food and drinks to young children and the consumption of those products by the same kids, the study authors note that the impact of unhealthy food and drink content in TV shows aimed at youngsters remains unclear.

However, they note that based on their observations, it appears as though “eating and drinking are common activities within children specific programming, with unhealthy foods and beverages especially common and frequently associated with positive motivating factors, and seldom seen with negative outcomes.”

New Study Explains Why High-Protein Diets Can Help People Lose Weight

redOrbit Staff & Wire Reports – Your Universe Online

Eating foods that are high in protein can be a more effective way to lose weight than the more traditional calorie-counting method, according to research presented this week at the Annual Meeting of the Society for Experimental Biology in Manchester, England.

While the nutritional values of food are typically given in standard energy units such as kilojoules or kilocalories, University of Sydney nutritional ecologist Professor David Raubenheimer and his colleagues argue that this method is too simplistic, and that more emphasis needs to be placed on the role of macronutrients.

Those macronutrients (carbohydrates, fats and proteins) typically interact in order to regulate appetite and energy intake, the researchers explained Thursday in a statement. In apes and monkeys, the study authors found that the correct balance of nutrients was more important than the overall energy intake of the creatures.

“Foods are complex mixtures of nutrients and these do not act independently but interact with one another. The appetite systems for different nutrients compete in their influence on feeding,” said Raubenheimer.

Eating nutritionally balanced foods prevents competition among those appetite systems, meaning that when one nutrient requirement is fulfilled, the others are as well. However, many types of foods do not have the correct balance, possessing a protein-to-carbohydrate ratio that is too high or too low. As a result, to obtain the right amount of muscle-building protein, an animal would have to consume too many or too few fats or carbohydrates.

As part of their research, Raubenheimer and his fellow investigators monitored baboons living near human settlements. They found that, despite eating different food combinations each day, the creatures maintained a consistent balance in which one-fifth of their overall energy needs came from protein. Their total energy intake, however, varied greatly – on occasion by as much as five-fold.

The professor said the findings suggest that baboons place a higher value on maintaining the right balance of macronutrients than on total energy intake. Previous research had demonstrated that other primates also tend to forage to maintain a balanced diet, but found that they prioritized protein over fat or carbohydrates when seasonal unavailability of some food types made it impossible to consume regular levels of all three.

However, gorillas have the opposite approach, tending to consume significantly higher amounts of protein in order to maintain their target carbohydrate consumption levels, Raubenheimer said. He added that this demonstrates “that there is diversity even among closely related primates. It also demonstrates that an energy-only approach is not adequate to understand primate foraging or for making conservation decisions.”

Humans typically behave like spider monkeys and orangutans, tending to prioritize protein over the other types of nutrients. As a result, people who eat a low-protein diet will over-eat fats and carbohydrates in order to reach their protein consumption targets. The study authors believe this could help explain the rapid rise in obesity levels in the Western world over the past six decades, a period during which protein consumption has declined.
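
The arithmetic behind this “protein leverage” effect is simple. Below is a minimal sketch in Python; the 300 kcal daily protein target and the diet compositions are purely illustrative assumptions, not figures from Raubenheimer’s study.

    # Minimal sketch of the "protein leverage" idea: if appetite keeps a
    # person eating until a fixed protein target is met, total energy
    # intake rises as the protein share of the diet falls.
    # NOTE: the 300 kcal target and the diet fractions below are
    # hypothetical, illustrative values, not data from the study.

    PROTEIN_TARGET_KCAL = 300  # assumed daily protein appetite, in kcal

    def total_intake(protein_fraction: float) -> float:
        """Total energy (kcal) eaten before the protein target is met."""
        return PROTEIN_TARGET_KCAL / protein_fraction

    for fraction in (0.15, 0.125, 0.10):
        print(f"{fraction:.1%} protein diet -> {total_intake(fraction):,.0f} kcal total")

    # Output:
    # 15.0% protein diet -> 2,000 kcal total
    # 12.5% protein diet -> 2,400 kcal total
    # 10.0% protein diet -> 3,000 kcal total

On these assumed numbers, diluting dietary protein from 15 percent to 10 percent would drive a 50 percent rise in total energy intake, consistent with the mechanism the researchers describe.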

“We can use this information to help manage and prevent obesity, through ensuring that the diets we eat have a sufficient level of protein to satisfy our appetite,” said Raubenheimer. While these findings could explain why high-protein diets can help people lose weight, the professor cautions that people need to get the right balance of fats and carbohydrates as well – otherwise, such imbalances could result in “other health problems.”

“A simple rule for healthy eating is to avoid processed foods – the closer to real foods the better,” he added. “Whilst it is clear that humans are generalist feeders, no human population has until recently encountered ‘ultra-processed foods’ – made from industrially extracted sugars, starches and salt. Our bodies and appetites are not adapted to biscuits, cakes, pizzas & sugary drinks and we eat too much of them at our peril.”

Bang Away, Doctors Tell Headbangers, But Beware Of Bleeding On The Brain

Alan McStravick for redOrbit.com – Your Universe Online

It’s official: English heavy metal band Motorhead has been declared one of the most hardcore rock and roll acts on Earth, according to neurosurgical researcher Dr. Ariyan Pirayesh Islamian of Germany’s Hannover Medical School.

The connection between Dr. Islamian’s expertise in hard rock and neurosurgical research might not at first be evident. But it was a fateful visit by a 50-year-old patient in January 2013 that began to tie the two worlds together.

The study centers on an unidentified German man who finally relented and visited the Medical Center at Hannover after a persistent, two-week-long headache that affected his entire head. A CT scan revealed a subdural hematoma, also known as ‘bleeding on the brain.’

After completing a full history of the patient, who had no previous head injuries or substance abuse issues, the doctors finally determined the cause was directly related to his penchant for wildly and violently thrusting his head back and forth to music with a hard and driving beat. This act is more popularly known as headbanging. And as it turned out, this man had just attended a Motorhead concert with his son where, it was determined, he had engaged in quite a lot of headbanging.

As reported by the Huffington Post, doctors at Hannover had to drill a hole into the patient’s skull to remove a clot, then drained the site over a period of six days. The patient was later released and has since made a full recovery. In a follow-up scan, the doctors noted the presence of a benign cyst in the man’s brain that likely increased his vulnerability to the traumatic brain injury. Despite the potential severity of this case, doctors say fellow headbangers can go as wild as they like.

“There are probably other higher risk events going on at rock concerts than headbanging,” Dr. Colin Shieff, neurosurgeon and trustee of the British brain injury advocacy group Headway told the Associated Press. “Most people who go to music festivals and jump up and down while shaking their heads don’t end up in the hands of a neurosurgeon.”

Echoing Dr. Shieff’s sentiment, Dr. Islamian told the CBC, “We are not smartasses who advise against headbanging.” Continuing he said, “Our purpose was not only to entertain the readership with a quite comical case report on complications of headbanging that confirms the reputation of Motorhead as undoubtedly one of the hardest rock ‘n’ roll bands on Earth, but to sensitize the medical world for a certain subgroup of fans that may be endangered when indulging themselves in excessive headbanging.”

After reviewing the available medical literature, the study group reported finding only three prior cases of subdural hematoma attributed to headbanging, underscoring the rarity of the complication.

Dr. Islamian believes that even though subdural hematomas are relatively rare, the incidence rate among headbangers may be higher than reported, because such bleeds are often clinically silent or cause only mild, temporary headaches.

The study authors highlight the potential dangers surrounding headbanging in a Case Report published in The Lancet.

Most People Would Rather Do Anything At All Than Sit In Silence

Alan McStravick for redOrbit.com – Your Universe Online

What good is sitting alone in your room? If a new psychological study by researchers from the University of Virginia (UVa) and Harvard University is to be believed, it’s not good at all. At the very least, it certainly isn’t enjoyable.

Interestingly, the team found participants much preferred the presence of a distraction, even going so far as to hurt themselves rather than sit quietly alone with only their thoughts. The findings of the study will be published today in the journal Science.

Across 11 related studies, UVa psychologist and lead author Timothy Wilson, along with fellow Harvard and UVa colleagues, tested subjects across a very broad age range and in multiple environments. Across almost every variable, it was determined that people don’t like to simply ponder, think or daydream. Instead, participants preferred external activities such as listening to music or using a smartphone. Still others opted to give themselves mild electric shocks rather than sit in silence for a short time.

“Those of us who enjoy some down time to just think likely find the results of this study surprising – I certainly do – but our study participants consistently demonstrated that they would rather have something to do than to have nothing other than their thoughts for even a fairly brief period of time,” Wilson said.

In the studies, participants were asked to sit and do nothing for periods ranging from six to 15 minutes. The early studies used college-age students, who reported that the experience was highly unenjoyable and that it was hard to concentrate. The team then cast a wider net, conducting subsequent studies with participants ranging in age from 18 to 77. Both groups produced essentially the same findings.

“That was surprising – that even older people did not show any particular fondness for being alone thinking,” Wilson said.

Many readers might assume this avoidance of undisturbed alone time is a symptom of our gadget-friendly, attention-span-sapped modern society. However, Wilson believes the ready use of smartphones might actually be a response to what he sees as an innate desire by many to always have something to do.

Wilson’s assertion is backed up by previous broad surveys showing that people really prefer not to disengage from the world, as reflected in the extended amounts of time people spend watching television, socializing or reading. In those surveys, very few people allocated any time to simply relaxing or thinking.

It might seem like a no-brainer that someone sitting in a room in a laboratory might feel self-conscious during a period meant to be dedicated to silent mindfulness. However, the results held even when participants were instructed to sit and think in their own homes.

“We found that about a third admitted that they had ‘cheated’ at home by engaging in some activity, such as listening to music or using a cell phone, or leaving their chair,” Wilson said. “And they didn’t enjoy this experience any more at home than at the lab.”

The team also conducted an experiment in which participants were randomly assigned either to sit alone with their thoughts or to spend the same amount of time doing something, provided they did not communicate with anyone during the assigned period. The group that was allowed to read or listen to music reported higher satisfaction with the time than their thinking counterparts, and even reported a far easier time maintaining concentration.

Building further on their study, the team next asked whether participants would rather sit with their thoughts in silence or engage in an activity far less pleasant than listening to music or reading a magazine.

Given the option of administering a mild electric shock to themselves by pushing a button or sitting and thinking for a short period, many participants opted for the electric shock.

The gender breakdown for this portion of the study was interesting. According to the researchers, 12 of the 18 men in the study chose to give themselves at least one electric shock during the 15-minute “thinking” period, while six of the 24 female participants opted for the shock. Prior to the study, each participant had received a sample of the shock and reported that they would pay to avoid being shocked again.

“What is striking,” the investigators write, “is that simply being alone with their own thoughts for 15 minutes was apparently so aversive that it drove many participants to self-administer an electric shock that they had earlier said they would pay to avoid.”

As an explanation for why two-thirds of male participants, compared with one-fourth of female participants, chose to self-administer a shock, Wilson and his team noted that men tend to seek “sensations” more than women.

The team next intends to explore exactly why people find it difficult to be alone with their own thoughts. As Wilson explains, everyone enjoys daydreaming or fantasizing at times, but that enjoyment most likely results from it having occurred spontaneously.

“The mind is designed to engage with the world,” he said. “Even when we are by ourselves, our focus usually is on the outside world. And without training in meditation or thought-control techniques, which still are difficult, most people would prefer to engage in external activities.”

Unique Leg Bone Structure Helps Keep Giraffes On Their Feet

Alan McStravick and Gerard LeBlond for redOrbit.com – Your Universe Online

As one of the largest purveyors of children’s games and toys, Toys ‘Я’ Us saw the value in choosing one of the animal kingdom’s more uniquely designed creatures, the giraffe, as its spokesanimal. Noted for its long legs and long neck, the giraffe is the tallest land-dwelling animal on Earth, and its distinctive but docile appearance has long piqued the attention of children worldwide. Now the attention of the scientific community has also landed on the tan-speckled beast.

Given that an adult giraffe weighs in at approximately 2,200 pounds, a group of researchers based out of the Structure and Motion Lab at the Royal Veterinary College in London wondered how the animal’s long, thin legs are able to support its rather robust torso. By studying the ligament structure within the legs, the team appears to have found its answer.

“Giraffes are heavy animals (around 1000 kg), but have unusually skinny limb bones for an animal of this size. This means their leg bones are under high levels of mechanical stress,” lead investigator Chris Basu, of the Royal Veterinary College, explains in a statement. Speaking with the BBC, Basu went on to state, “I’m interested in how giraffe have evolved from their modestly-proportioned ancestors to these bizarrely long-necked, long-legged animals that we see today.”

The research may also help link modern giraffes to their more modestly proportioned relatives, such as the relatively diminutive, antelope-like okapi, long believed to be close to the giraffe’s early evolutionary lineage. “I’d like to link modern giraffes with fossil specimens to illustrate the process of evolution,” Basu explained. “We hypothesize that the suspensory ligament has allowed giraffes to reach large sizes that they otherwise would not have been able to achieve.”

The relevant giraffe leg bones, equivalent to the human metatarsals (foot bones) and metacarpals (hand bones), are elongated and make up about half the total length of the leg. Along the length of these bones are grooves that contain the suspensory ligament alluded to above by Basu. Other large animals such as horses have a similar leg structure, but this is the first time it has been studied in the giraffe.

This structure could explain why the giraffe’s legs can hold up their immense weight without collapsing. “It turns out,” according to professor John Hutchinson from the Royal Veterinary College, “that the suspensory ligament plays an important role.”

To test the theory, the researchers accepted limbs donated by European zoos, taken from giraffes that had died naturally or been euthanized for unrelated reasons. They secured each limb in a rigid frame and used a hydraulic press to apply about 560 pounds of force – roughly one leg’s share of the animal’s bodyweight, since each of the four legs bears part of the load – to simulate the weight each leg carries. The legs remained upright and stable even under added force. The suspensory ligament is a tissue, not a muscle, so it provides passive support and cannot generate any force on its own.

Relying on the suspensory ligament rather than muscle means the giraffe expends far less energy, reducing fatigue and allowing for more efficient transport of its massively disproportionate frame. The team also learned that the suspensory ligament is important in preventing damage to the feet, as it works to prevent overextension of the animal’s foot joints.

The majestic and graceful nature of the wild giraffe appears contradictory when you consider that its weight rivals that of some elephants and other large creatures. However, thanks to this novel research into the leg structure of these animals, we have learned it will take a lot more than skinny legs and knobby knees to keep a good giraffe down.

Your Coping Mechanisms For Stress Determine Risk Of Developing Insomnia

Alan McStravick for redOrbit.com – Your Universe Online

It’s not what you do, it’s the way that you do it. This is the major finding of a new study investigating the link between stress and the development of insomnia.

“Our study is among the first to show that it’s not the number of stressors, but your reaction to them that determines the likelihood of experiencing insomnia,” said lead author Vivek Pillai, PhD, a research fellow at the Sleep Disorders and Research Center at Henry Ford Hospital in Detroit, Michigan. “While a stressful event can lead to a bad night of sleep, it’s what you do in response to stress that can be the difference between a few bad nights’ sleep and chronic insomnia.”

The study results, published in this month’s issue of the journal Sleep, show that coping with a stressful event by burying one’s head in the sand or turning to alcohol or drugs could be a major factor in developing stress-related insomnia. Just as detrimental to a good night’s sleep is cognitive intrusion, in which recurrent thoughts about the stressor continually pop into your mind; cognitive intrusion accounted for 69 percent of the total effect of stress exposure on insomnia.

The researchers drew on a community-based sample of 2,892 good sleepers with no lifetime history of insomnia. To provide a baseline, each participant reported stressors experienced in the past year, such as divorce, death, financial issues or major illness, and rated the severity and duration of each event.

An initial questionnaire measured the cognitive intrusion and coping strategies reported by the participants. After one year, the study identified participants who had developed an insomnia disorder, defined as symptoms of insomnia occurring at least three nights per week for one month or longer, accompanied by daytime impairment or distress.

“This study is an important reminder that stressful events and other major life changes often cause insomnia,” said American Academy of Sleep Medicine President Dr. Timothy Morgenthaler. “If you are feeling overwhelmed by events in your life, talk to your doctor about strategies to reduce your stress level and improve your sleep.”

Most importantly, the authors claim their research has identified potential targets for therapeutic interventions that educate people about, and improve, their coping responses to stress, which could reduce the overall risk of insomnia. Mindfulness-based therapies, in particular, are effective in combating cognitive intrusion and improving sleep.

“Though we may not be able to control external events, we can reduce their burden by staying away from certain maladaptive behaviors,” said Pillai.

Approximately 15 to 20 percent of the adult population will experience short-term insomnia disorder, which lasts less than three months, according to the American Academy of Sleep Medicine. The disorder is more prevalent among women than among men.

Supported by funding from the National Institute of Mental Health and the National Institutes of Health, the research was performed under the supervision of Thomas Roth, PhD, and Christopher Drake, PhD, both of the Sleep Disorders and Research Center at Henry Ford Hospital in Detroit.

Future Of The Internet Worrisome For Some Web Experts

Peter Suciu for redOrbit.com – Your Universe Online

The open Internet could be challenged by nation-states seeking to maintain security and political control, and in the next decade this could lead to more blocking, filtering, segmentation and even balkanization of the Internet. Trust could evaporate in the wake of revelations about government and corporate surveillance, with fears of even greater surveillance in the future.

These are two of the more worrisome concerns for the future of the Internet, according to a new study conducted by Pew Internet, which canvassed Internet experts about the future of the web. More than 1,400 Internet experts responded to questions on issues about accessing and sharing content online. The majority of respondents to the 2014 Future of the Internet study said they hoped that by 2025 there would not be significant changes for the worse or hindrances to the way people get and share content online.

Additionally, the respondents offered the opinion that technology innovation will continue and could afford new opportunities for people to connect. However, as noted, there remains the threat that nation-states could present challenges via crackdowns, while surveillance will also be a growing concern.

“Governments worldwide are looking for more power over the Net, especially within their own countries,” said Dave Burstein, editor of Fast Net News. “Britain, for example, has just determined that ISPs block sites the government considers ‘terrorist’ or otherwise dangerous. This will grow. There will usually be ways to circumvent the obstruction but most people won’t bother.”

The respondents also noted that the Arab Spring remains an example of how the Internet can be used to organize political dissent, even as it prompted government crackdowns.

In addition to government control, the other worrisome trends included increasing commercial pressures, affecting everything from Internet architecture to the flow of information, which could endanger the open structure of online life; and the possibility that efforts to fix the TMI (too much information) problem might overcompensate and actually thwart content sharing.

“Sharing is hindered by ridiculous 19th century laws about copyright and patent,” added Marcel Bullinga, technology futures speaker, trend watcher, and futurist. “Both will die away. That will spur innovation into the extreme. That is the real singularity.”

Mobile devices could also change the way information is controlled, said Jonathan Grudin, principal researcher at Microsoft Research.

“Today, people in some countries are hindered from accessing online information, but smaller mobile devices have made it more difficult to censor,” said Grudin. “I am guardedly optimistic that information providers and consumers will continue to elude government censorship. Information does seem to want to be free, and technology has made that easier on balance. I do not see a potent threat looming, and the commercial interest in disseminating information should not be underestimated.”

This study was conducted by the Pew Research Center Internet Project and Elon University’s Imagining the Internet Center via an online canvassing conducted between November 25, 2013 and January 13, 2014. It is the sixth Internet study the two organizations have conducted together since 2004; for this project, more than 12,000 experts and members of the interested public were invited to share their opinions on the likely future of the Internet.