Monkeys Control Virtual Reality Avatar With Only Their Brains

October 6, 2011

Two monkeys trained in a Duke University laboratory were able to control a virtual monkey avatar on a computer screen and distinguish among the textures of virtual objects using only their brains. The results of this research were published in the October 5 edition of the journal Nature.

Miguel Nicolelis, MD, PhD, professor of neurobiology at Duke University Medical Center and the senior author of the study, said in the press release, “Someday in the near future, quadriplegic patients will take advantage of this technology not only to move their arms and hands and to walk again, but also to sense the texture of objects placed in their hands, or experience the nuances of the terrain on which they stroll with the help of a wearable robotic exoskeleton.”

The monkeys were able to differentiate among the virtual textures even though the virtual objects looked identical. The monkeys did not use any physical part of their bodies; the virtual objects were manipulated only by their brain activity. The texture of each virtual object was conveyed as a pattern of minute electrical signals transmitted to the monkeys’ brains, with each texture corresponding to a distinct electrical pattern.

Nicolelis said, “This is also the first time we’ve observed a brain controlling a virtual arm that explores objects while the brain simultaneously receives electrical feedback signals that describe the fine texture of objects ‘touched’ by the monkey’s newly acquired virtual hand.”

The researchers believe that the brain-machine-brain interface (BMBI) could be used to help human patients who are paralyzed due to spinal cord injuries.

“The remarkable success with non-human primates is what makes us believe that humans could accomplish the same task much more easily in the near future,” Nicolelis said in the press release.

This technology has the potential to allow paralyzed patients to wear a robotic exoskeleton, giving them more freedom in their daily lives along with feedback from their surroundings. The researchers predict that sensors distributed across the exoskeleton could generate electrical feedback to the brain identifying the texture, shape, and temperature of objects, as well as the texture of the surface the wearer is walking on.

This could be the groundbreaking technology that the Walk Again Project needs to reach its goal of restoring full-body mobility to quadriplegic patients. The project plans to give the first public demonstration of an autonomous exoskeleton during the opening game of the 2014 FIFA World Cup in Brazil.

Source: RedOrbit Staff & Wire Reports