Researchers Create Mind-Controlled Flying Robot
April Flowers for redOrbit.com — Your Universe Online
A new, non-invasive system that allows people to control a flying robot using only their mind has been developed by researchers at the University of Minnesota, according to a study published in the Journal of Neural Engineering. Although it sounds like fun and games, the study's findings have implications for helping people who are paralyzed or have neurodegenerative diseases.
Three female and two male study participants were able to successfully control the four-blade flying robot. The five subjects quickly and accurately controlled the quadcopter for a sustained amount of time.
“Our study shows that for the first time, humans are able to control the flight of flying robots using just their thoughts sensed from a noninvasive skull cap,” said Bin He, a professor of biomedical engineering in the University of Minnesota's College of Science and Engineering. “It works as good as invasive techniques used in the past.”
The research is intended to help people with paralysis or neurodegenerative diseases regain mobility and independence, according to He. “We envision that they'll use this technology to control wheelchairs, artificial limbs or other devices,” He said.
The team's noninvasive technique is electroencephalography (EEG), which records the electrical activity of the subjects' brains through a specialized, high-tech cap fitted with 64 electrodes. Those recordings form the front end of the brain-computer interface.
“It's completely noninvasive. Nobody has to have a chip implanted in their brain to pick up the neuronal activity,” said Karl LaFleur, a senior biomedical engineering student, during the study.
According to the team, the brain-computer interface system works because of the geography of the motor cortex, the area of the cerebrum that governs movement. As we move, or think about movement, neurons in the motor cortex produce tiny electrical currents. Thinking about a different movement activates a different set of neurons, and the researchers laid the groundwork for the brain-computer interface by mapping which sets correspond to which movements. The new study builds on He's prior research, in which subjects controlled a virtual helicopter on a computer screen.
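The study used far more sophisticated signal processing, but the core idea of decoding imagined hand movements from motor-cortex EEG can be illustrated with a minimal sketch. Everything here is an assumption for illustration, not the paper's method: the sampling rate, the two channels (C3 over the left motor cortex, C4 over the right), and the thresholds are invented. The sketch exploits one well-known effect, event-related desynchronization: imagining right-hand movement suppresses mu-band (8–13 Hz) power over the opposite (left) hemisphere.

```python
import numpy as np

FS = 250            # sampling rate in Hz (assumed)
MU_BAND = (8, 13)   # mu-rhythm band over the motor cortex

def band_power(signal, fs, band):
    """Average spectral power of a 1-D signal within a frequency band."""
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def decode_imagery(c3, c4):
    """Classify imagined movement from two motor-cortex EEG channels.

    Imagining RIGHT-hand movement suppresses mu power over the LEFT
    hemisphere (channel C3), and vice versa. The 0.2 threshold on the
    log power ratio is an arbitrary illustrative choice.
    """
    p3 = band_power(c3, FS, MU_BAND)
    p4 = band_power(c4, FS, MU_BAND)
    lateralization = np.log(p4 / p3)
    if lateralization > 0.2:       # C3 suppressed -> right hand imagined
        return "right"
    if lateralization < -0.2:      # C4 suppressed -> left hand imagined
        return "left"
    return "rest"

# Synthetic one-second demo: a strong 10 Hz mu rhythm on C4 and a
# suppressed one on C3 mimics imagined right-hand movement.
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
mu = np.sin(2 * np.pi * 10 * t)
c3 = 0.2 * mu + rng.normal(0, 0.1, FS)   # suppressed left hemisphere
c4 = 1.0 * mu + rng.normal(0, 0.1, FS)
print(decode_imagery(c3, c4))            # classifies as "right"
```

Real systems train per-subject classifiers over many electrodes and calibration trials; this two-channel threshold rule only shows where the signal comes from.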
“We were the first to use both functional MRI and EEG imaging to map where in the brain neurons are activated when you imagine movements,” He said. “So now we know where the signals will come from.”
The subjects involved in the current study faced away from the quadcopter. They were asked to imagine using their right hand, left hand, and both hands together to control the robot's movements — turning right, turning left, rising, and falling. A pre-set forward motion drove the quadcopter, while only the subject's thoughts controlled the other movements.
While the subjects watched the robot's flight on a screen fed by an onboard camera, their brain signals were recorded by the EEG cap and relayed to the quadcopter over Wi-Fi.
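The last step of the pipeline described above — turning a decoded imagery label into a flight command sent over Wi-Fi — might look something like the following sketch. The command strings, the UDP transport, and the drone's address are all hypothetical; real quadcopters expose their own proprietary command protocols, and the study does not describe its wire format.

```python
import socket

# Hypothetical control endpoint; a real drone defines its own protocol.
DRONE_ADDR = ("192.168.1.1", 5556)

# Mapping from decoded motor imagery to a flight command. Forward motion
# is pre-set on the drone itself, so only turning and altitude are sent.
COMMANDS = {
    "left": b"TURN_LEFT",    # imagined left-hand movement
    "right": b"TURN_RIGHT",  # imagined right-hand movement
    "both": b"ASCEND",       # imagined movement of both hands
    "rest": b"HOLD",         # no clear imagery: maintain course
}

def send_command(imagery, sock, addr=DRONE_ADDR):
    """Translate a decoded imagery label into one UDP command packet."""
    sock.sendto(COMMANDS.get(imagery, b"HOLD"), addr)

if __name__ == "__main__":
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_command("right", tx)  # one decoded thought -> one turn command
    tx.close()
```

In a live loop, the EEG decoder would emit a label every few hundred milliseconds and this function would fire once per label, so the drone responds continuously to the stream of imagined movements.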
After several training sessions, the participants were required to fly the robot through two large rings suspended from a gymnasium ceiling, and each one's performance was assessed with a number of statistical tests. A control group also directed the quadcopter with a keyboard, allowing a comparison between standard methods and brain control.
“Our next step is to use the mapping and engineering technology we've developed to help disabled patients interact with the world,” He said. “It may even help patients with conditions like autism or Alzheimer's disease or help stroke victims recover. We're now studying some stroke patients to see if it'll help rewire brain circuits to bypass damaged areas.”