
Controlling Robots With Our Minds And Other Methods

May 3, 2013
Image Credit: Photos.com

Lee Rannals for redOrbit.com — Your Universe Online

The future of robotics may not be as scary as one would think, especially if we are able to control robots with our minds like one scientist is trying to do.

Researcher Angel Perez Garcia says he can make a robot move exactly how he wants using his thoughts. In the experiment, he wears an EEG headset that monitors his brain's electrical activity while he focuses on a symbol, such as a flashing light. The electrodes pick up the activity in the brain, and a program interprets the signals and sends a message to the robot to make it move in a pre-defined way.

“I can focus on a selection of lights on the screen. The robot's movements depend on which light I select and the type of activity generated in my brain,” says Angel. “The idea of controlling a robot simply by using our thoughts (EEG brainwave activity), is fascinating and futuristic.”

He said he uses the movements of his eyes, eyebrows and other parts of his face to select which of the robot’s joints he wants to move.
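The article does not describe the signal processing involved, but selection via flashing lights usually works by detecting which flicker frequency dominates the user's EEG spectrum (a steady-state visually evoked potential, or SSVEP). The sketch below, with invented frequencies, command names, and thresholds, shows the idea on simulated data: the light the user stares at produces a matching frequency peak in the EEG, which is mapped to a pre-defined robot command.

```python
import numpy as np

# Hypothetical flicker frequencies (Hz) of the on-screen lights, each
# mapped to an illustrative pre-defined robot movement.
TARGETS = {8.0: "rotate_base", 10.0: "raise_arm", 12.0: "close_gripper"}

def dominant_flicker(signal, fs):
    """Return the candidate flicker frequency with the most EEG power.

    Staring at a light flashing at f Hz evokes a measurable peak near
    f Hz in EEG recorded over the visual cortex.
    """
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)

    def band_power(f):
        # Sum spectral magnitude in a narrow window around f.
        mask = (freqs > f - 0.5) & (freqs < f + 0.5)
        return spectrum[mask].sum()

    return max(TARGETS, key=band_power)

# Simulate 2 seconds of EEG: a 10 Hz oscillation buried in noise,
# as if the user were focusing on the 10 Hz light.
fs = 256
t = np.arange(0, 2, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10.0 * t) + 0.5 * rng.standard_normal(len(t))

selected = dominant_flicker(eeg, fs)
print(TARGETS[selected])  # the pre-defined movement sent to the robot
```

A real system would add electrode-specific filtering and artifact rejection, but the selection step reduces to exactly this kind of frequency comparison.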

Other researchers at the Norwegian University of Science and Technology (NTNU) are building robots of the future. Ingrid Schjølberg, a supervisor and researcher, is using a new training program to help her three-fingered robot grasp objects in new ways.

“Well, everyone knows about industrial robots used on production lines to pick up and assemble parts,” says Schjølberg. “They are pre-programmed and relatively inflexible, and carry out repeated and identical movements of specialized graspers adapted to the parts in question.”

She said industries can encounter major problems every time a new part is brought in and has to be handled on the production line.

“The replacement of graspers and the robot’s guidance program is a complex process, and we want to make this simpler. We want to be able to program robots more intuitively and not just in the traditional way using a panel with buttons pressed by an operator,” Schjølberg added.

Student Signe Moe is working on guiding a robot just by moving her arms. She is determined to find out how a robot can be trained to imitate human movements.

“Now it’s possible for anyone to guide the robot,” says Moe. “Not long ago some 6th grade pupils visited us here at the robotics hall. They were all used to playing video games, so they had no problems in guiding the robot.”

Moe stands a few feet in front of a camera so it can register her and trace the movements of her hand.

“Now, when I move my hand up and to the right, you can see that the robot imitates my movements”, she says. “The Kinect camera has built-in algorithms which can trace the movements of my hand. All we have to do is to transpose these data to define the position we want the robot to assume, and set up a communications system between the sensors in the camera and the robot.”
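Moe's description amounts to two steps: read the hand position that the Kinect's tracking algorithms report, then transpose those coordinates into a target position for the robot. A minimal sketch of that second step is below; the scale factor and workspace limits are invented for illustration, since a real setup would calibrate them to the specific robot.

```python
# Hypothetical mapping from a tracked hand position (camera frame,
# metres) to a robot end-effector target. The scale and workspace
# bounds are illustrative, not from the NTNU setup.

def hand_to_robot(hand_xyz,
                  scale=1.5,
                  workspace=((-0.5, 0.5), (0.0, 0.8), (0.1, 0.6))):
    """Scale the hand displacement and clamp it to the robot's workspace."""
    target = []
    for value, (lo, hi) in zip(hand_xyz, workspace):
        v = value * scale
        target.append(min(max(v, lo), hi))  # keep the robot in safe limits
    return tuple(target)

# Hand moves up and to the right, as in Moe's demonstration:
print(hand_to_robot((0.2, 0.4, 0.3)))
```

Running this in a loop over each new camera frame, and streaming the resulting targets to the robot controller, is the "communications system between the sensors in the camera and the robot" that Moe describes.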

The scientists at NTNU may be shaping the future of robotics, but one question researchers set out to answer is how accepting humans will be of their new robot companions. A team recently reported at the 63rd Annual International Communication Association (ICA) conference in London that humans actually experience empathy for robots.

“One goal of current robotics research is to develop robotic companions that establish a long-term relationship with a human user, because robot companions can be useful and beneficial tools. They could assist elderly people in daily tasks and enable them to live longer autonomously in their homes, help disabled people in their environments, or keep patients engaged during the rehabilitation process,” said Astrid Rosenthal-von der Pütten of the University of Duisburg-Essen.




