New Humanoid Robot Mimics Precise Facial Expressions

November 12, 2008

Scientists at the Bristol Robotics Laboratory (BRL) have devised the first humanoid robot that can perform the precise lip movements and facial expressions of human beings.

Named ‘Jules’, the robot can automatically copy human movements by capturing them with its video camera ‘eyes’ and mapping them onto small electronic motors in its skin.  The process allows Jules’ disembodied androgynous robotic head to grin and grimace, furrow its brow and ‘speak’.

Jules mimics these expressions by converting the video images into digital commands, which allow the robot’s servos and motors to reproduce the movements in real time.  The robot can interpret commands at 25 frames per second.
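
The article gives no implementation details, but a capture-and-mimic loop of this kind might look roughly like the Python sketch below; the function names, feature values and the use of OpenCV are assumptions for illustration only, not the BRL team's actual software.

    # A minimal sketch of the capture-and-mimic loop described above.
    # All names (detect_expression, expression_to_servo_targets, the feature
    # values) are illustrative assumptions, not the project's real code.
    import time
    import cv2  # OpenCV, assumed here for grabbing frames from the camera 'eyes'

    FRAME_RATE = 25                  # the article's reported 25 frames per second
    FRAME_PERIOD = 1.0 / FRAME_RATE

    def detect_expression(frame):
        """Placeholder: estimate facial-feature positions from one video frame."""
        return {"brow_raise": 0.2, "jaw_open": 0.5, "smile": 0.7}

    def expression_to_servo_targets(features):
        """Placeholder: map feature estimates onto target angles for the head's motors."""
        return {name: value * 90.0 for name, value in features.items()}

    camera = cv2.VideoCapture(0)
    while True:
        start = time.time()
        ok, frame = camera.read()
        if not ok:
            break
        features = detect_expression(frame)
        commands = expression_to_servo_targets(features)
        # send_to_servos(commands)   # would drive the motors under the rubber skin
        # Pace the loop at roughly 25 frames per second so mimicry stays real-time.
        time.sleep(max(0.0, FRAME_PERIOD - (time.time() - start)))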

The University of the West of England and the University of Bristol ran the BRL project, dubbed ‘Human-Robot Interaction’.

Robotics engineers Chris Melhuish, Neill Campbell and Peter Jaeckel spent three-and-a-half years developing the groundbreaking software that allows the interaction between humans and artificial intelligence.  Jules has 34 internal motors covered with flexible rubber (‘Frubber’) skin, which was commissioned from U.S. roboticist David Hanson for the project.

Jules was initially programmed to act out a series of movements based on ten stock human emotions, such as happiness, sadness and worry, which the team ‘taught’ Jules through its programming.

The software then maps the images onto Jules’ face, combining expressions so that the robot instantly mimics those shown by a human subject.
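
One plausible reading of that mapping is a weighted blend of preset per-motor poses, one per stock emotion; the sketch below illustrates the idea with invented motor names and values, since only the notion of ten stock emotions comes from the article.

    # Illustrative sketch of blending preset 'stock emotion' poses into a single
    # expression. The poses, motor names and numbers are invented for illustration.
    STOCK_EMOTIONS = {
        "happiness": {"mouth_corner": 0.9, "brow": 0.6, "eyelid": 0.7},
        "sadness":   {"mouth_corner": 0.1, "brow": 0.3, "eyelid": 0.4},
        "worry":     {"mouth_corner": 0.3, "brow": 0.8, "eyelid": 0.5},
    }

    def blend_expressions(weights):
        """Combine stock poses, weighted by how strongly each is read on the subject's face."""
        blended = {}
        total = sum(weights.values()) or 1.0
        for emotion, weight in weights.items():
            for motor, position in STOCK_EMOTIONS[emotion].items():
                blended[motor] = blended.get(motor, 0.0) + position * weight / total
        return blended

    # Example: a face read as mostly happy with a hint of worry.
    print(blend_expressions({"happiness": 0.8, "worry": 0.2}))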

“We have a repertoire of behaviors that somehow is dynamic,” Chris Melhuish told Britain’s Daily Mail.

“If you want people to be able to interact with machines, then you’ve got to be able to do it naturally.”

“When it moves, it has to look natural in the same way that human expressions are, to make interaction useful.”

“Realistic, life-like robot appearance is crucial for sophisticated face-to-face robot-human interaction,” Peter Jaeckel, a BRL scientist who works in artificial emotion, artificial empathy and humanoids, told the Daily Mail.

“Researchers predict that one day, robotic companions will work or assist humans in space, care and education.”

“Robot appearance and behavior need to be well matched to meet expectations formed by our social experience,” he said.

“Violation of these expectations due to subtle imperfections or imbalance between appearance and behavior results in discomfort in humans that perceive or observe the robot.”

“If people were put off, it would counteract all efforts to achieve trustworthiness, reliability and emotional intelligence. All these are requirements for robotic companions, assisting astronauts in space or care robots employed as social companions for the elderly.”

“Unlike most research projects, the focus lies on dynamic, subtle facial expressions, rather than static exaggerated facial displays.”

“Copycat robot heads have been created before, but never with realistic human-looking faces.”

However, some scientists are unimpressed by Jules’ capabilities.

Kerstin Dautenhahn, a robotics researcher at the University of Hertfordshire, believes that people may be troubled by humanoid automatons that seem ‘too human’.

“Research has shown that if you have a robot that has many human-like features, then people might actually react negatively towards it,” she told the Daily Mail.

“If you expose vulnerable people, like children or elderly people, to something that they might mistake for human, then you would automatically encourage a social relationship.”

“They might easily be fooled to think that this robot not only looks like a human and behaves like a human, but that it can also feel like a human. And that’s not true.”

Many are optimistic that the technology developed for Jules will find further applications, such as robots that could accompany solo missions into space or assist in healthcare environments.
