
Robot Accepts Gift From Human Hand With Help Of Motion Capture

May 20, 2013
Image Credit: Thinkstock.com

Lee Rannals for redOrbit.com — Your Universe Online

Scientists at Disney Research and the Interactive Systems Lab of Carnegie Mellon University (CMU) and Karlsruhe Institute of Technology (KIT) say they’ve taken human-to-robot hand-off motion to the next level.

Researchers have had a difficult time developing a robot that can recognize when a person is handing it something and predict where the hand-off will occur. However, the team from Disney and KIT will be presenting their latest achievements in this area of robotics at the IEEE International Conference on Robotics and Automation in Karlsruhe, Germany.

The scientists say they solved the problem by using motion capture data from two people to create a database of human motion. By searching the database, a robot can recognize what the person is doing and make a reasonable estimate of where he or she is likely to extend a hand.

Katsu Yamane, a senior research scientist with Disney Research, Pittsburgh, said people handing a coat, a package or a tool to a robot will become commonplace if robots are introduced into the workplace and the home. The technique developed by Yamane and his team could be applied to a number of situations where a robot needs to synchronize its motion with that of a human.

“If a robot just sticks out its hand blindly, or uses motions that look more robotic than human, a person might feel uneasy working with that robot or might question whether it is up to the task,” Yamane explained in a statement. “We assume human-like motions are more user-friendly because they are familiar.”

The team had to develop a hierarchical data structure in order to help the robot access the library of human-to-human passing motions with the necessary speed. They first developed a rough estimate of the distribution of the motion samples. They then grouped samples with similar positions and organized the groups into a binary-tree structure. This allows the robot to rapidly search the database, recognizing when the person initiates a handing motion and then refining its response as the person follows through.

The researchers tested their method in computer simulations and confirmed that the robot began moving its hand before the human's hand reached the intended passing location. The robot's hand position also roughly matched the positions of the human receivers in the database it was attempting to mimic.

Yamane said the team’s work is not finished, and now they need to expand the database for a wider variety of passing motions and passing distances. They also hope to eventually add finger motions and secondary behaviors that would make the robot’s motion more engaging.

A DARPA team announced earlier this month that it had created an extremely advanced three-fingered robotic hand. The hand is capable of grasping objects weighing up to 50 pounds and turning a key, making it one of the most agile robotic hands developed to date.




