Robot Being Taught Language Through Artificial Neuronal Network
February 20, 2013

Simplified Artificial Brain Allows iCub Robot To Learn Language

[ Watch the Video: Simplified Brain Allows iCub Robot To Learn Language ]

Peter Suciu — Your Universe Online

Robots aren't actually “taught” so much as programmed. There have been attempts to build systems that allow robots to “learn,” but these still amount to programming in new skills. However, a new artificial brain system could enable robots to learn language, which may prove to be the key to teaching robots other tasks.

Researchers from INSERM, CNRS and the Université Lyon 1 have reportedly succeeded in developing an “artificial neuronal network.” It was constructed around a fundamental principle of the workings of the human brain, with a focus on its ability to learn a language.

The model was developed following years of research in the INSERM 846 Unit of the Institut de recherche sur les cellules souches et cerveau, and grew out of the study of the actual structure of the human brain. The research has been published in the journal PLoS ONE.

This artificial brain system allows a robot to understand new sentences containing new grammatical structures, and it can further link sentences together, or predict the end of a sentence before it is finished.

The INSERM researchers tested this in a real-life situation, installing the new brain in an iCub humanoid robot, which was developed at IIT as part of a European Union project called RobotCub. The iCub has been adopted by more than 20 laboratories worldwide.

The iCub features a complex body with 53 motors that move the head, arms and hands, waist and legs. The iCub is capable of sight and hearing, has a sense of proprioception, or body configuration, and senses movement via accelerometers and gyroscopes. The developers of the iCub are reportedly working on methods to give it a sense of touch.

It has previously “learned” to balance on two legs, and the next step could be learning to understand language. In tests with INSERM, researchers asked the iCub to point to a guitar, shown in the form of a blue object, and then asked the robot to point to a violin, shown as a red object. Prior to each task the robot was required to repeat the sentence, to show that it had fully understood the task it was asked to accomplish.

This is still a long way from being able to have a conversation with a robot, however. The reason is that the human brain processes language in real time, at the speed at which it is spoken.

In other words, the human brain can process the first words of a sentence in real time and even anticipate what will follow, continually revising its predictions as new information interacts with what came before. This is accomplished in the frontal cortex and the striatum, which play a crucial role in the process.

Getting a machine to react the same way is the difficult part. Based on this understanding, Dr. Peter Ford Dominey, CNRS research director at the Robot Cognition Laboratory at INSERM, and his team have developed an “artificial brain” that uses a “neuronal construction” reportedly similar to that of the human brain.

This artificial brain system uses recurrent loops to help it understand new sentences with a given grammatical structure, and it is even capable of linking two sentences together or predicting the end of a sentence before it is provided.
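Recurrent loops of this kind are central to what is often called reservoir computing: a fixed, randomly connected recurrent network whose internal state summarizes the words seen so far, with only a simple linear readout trained to predict what comes next. The sketch below is a hypothetical toy illustration of that idea, not the team's actual model; the five-word vocabulary, the training sentence, and all sizes and parameters are invented for the example.

```python
import math, random

# Toy vocabulary and a single training sentence (invented for illustration).
words = ["the", "robot", "points", "to", "guitar"]
sent = ["the", "robot", "points", "to", "the", "guitar"]
idx = {w: i for i, w in enumerate(words)}
V = len(words)

random.seed(0)
H = 30  # reservoir size
# Fixed random input and recurrent weights: these form the "recurrent loops".
Win = [[random.uniform(-0.5, 0.5) for _ in range(V)] for _ in range(H)]
W = [[random.uniform(-0.3, 0.3) for _ in range(H)] for _ in range(H)]

def step(state, word):
    """Update the recurrent state with one input word (one-hot encoded)."""
    x = [0.0] * V
    x[idx[word]] = 1.0
    new = []
    for i in range(H):
        s = sum(Win[i][j] * x[j] for j in range(V))
        s += sum(W[i][j] * state[j] for j in range(H))
        new.append(math.tanh(s))
    return new

# Run the sentence through the reservoir; at each step the target
# for the linear readout is the next word, as a one-hot vector.
states, targets = [], []
state = [0.0] * H
for t in range(len(sent) - 1):
    state = step(state, sent[t])
    states.append(state + [1.0])  # append a constant bias unit
    y = [0.0] * V
    y[idx[sent[t + 1]]] = 1.0
    targets.append(y)

# Train the readout by ridge regression in the dual form:
# solve (K + lam*I) A = Y where K is the Gram matrix of reservoir states.
T, lam = len(states), 1e-6
K = [[sum(sa[k] * sb[k] for k in range(len(sa))) for sb in states] for sa in states]
for a in range(T):
    K[a][a] += lam
# Gauss-Jordan elimination with partial pivoting on the T x T system.
M = [K[a][:] + targets[a][:] for a in range(T)]
for c in range(T):
    p = max(range(c, T), key=lambda r: abs(M[r][c]))
    M[c], M[p] = M[p], M[c]
    piv = M[c][c]
    M[c] = [v / piv for v in M[c]]
    for r in range(T):
        if r != c:
            f = M[r][c]
            M[r] = [vr - f * vc for vr, vc in zip(M[r], M[c])]
alpha = [M[a][T:] for a in range(T)]

def predict_next(prefix):
    """Feed a prefix through the reservoir and read out the likeliest next word."""
    state = [0.0] * H
    for w in prefix:
        state = step(state, w)
    s = state + [1.0]
    scores = [0.0] * V
    for a in range(T):
        dot = sum(states[a][k] * s[k] for k in range(len(s)))
        for j in range(V):
            scores[j] += alpha[a][j] * dot
    return words[max(range(V), key=lambda j: scores[j])]

print(predict_next(["the"]))
print(predict_next(["the", "robot", "points", "to", "the"]))
```

Because the recurrent state carries context, the readout distinguishes the two occurrences of “the,” predicting “robot” after the first and “guitar” after the second, which a memoryless word-to-word lookup could not do. That same property is what lets such a system anticipate how a sentence will end before it is finished.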

This could be important, as the system could also be used to help understand the way the human brain processes language.

“We know that when an unexpected word occurs in a sentence, the brain reacts in a particular way,” said Dr. Dominey. “These reactions could hitherto be recorded by sensors placed on the scalp.”

The research could make it possible not only to carry on a conversation with a robot, but also to identify where these responses originate within the brain. It could help determine how the linguistic malfunctions seen in Parkinson's disease occur.

It could also make programming robots easier, in turn allowing them to do more.

“At present, engineers are simply unable to program all of the knowledge required in a robot,” said Dominey. “We now know that the way in which robots acquire their knowledge of the world could be partially achieved through a learning process — in the same way as children.”