October 24, 2013
NSF, NIH, USDA And NASA Fund Development Of Robots That Collaborate With Humans For Enhanced Productivity
National Science Foundation
The National Science Foundation (NSF), in partnership with the National Institutes of Health (NIH), U.S. Department of Agriculture (USDA) and NASA, today announced new investments totaling approximately $38 million for the development and use of robots that cooperatively work with people to enhance individual human capabilities, performance and safety.
These mark the second round of funding awards made through the National Robotics Initiative (NRI) launched with NSF as the lead federal agency just over two years ago as part of President Obama's Advanced Manufacturing Partnership Initiative.
"NSF is proud to work with other government agencies to fund research that furthers technological advances in robotics," said NSF Acting Director Cora Marrett. "Co-robots work alongside humans and make Americans more effective and efficient in many vital areas related to safety, productivity and health. This research continually expands what robots can do to enhance human capabilities."
Funded projects target the creation of next-generation collaborative robots, or co-robots, for advanced manufacturing; civil and environmental infrastructure; health care and rehabilitation; military and homeland security; space and undersea exploration; food production, processing and distribution; independence and quality-of-life improvement; and driver safety.
NSF funded 30 new projects, an investment of approximately $31 million over the next three years, to advance the science of robotics across multiple sectors. This year's projects include research to improve robotic motion--advancing the bipedal movement, dexterity and manipulation of robots and prostheses--and robotic sensing--advancing theories, models and algorithms for sharing and analyzing data so that robots can perform collective behaviors with humans and with other robots.
The projects also aim to enhance 3-D printing, develop co-robot mediators, improve the training of robots, advance the capabilities of surgical robotics and provide assistive robots for people with disabilities. In addition, the projects will improve the capability of robots for lifting and transporting heavy objects and for dangerous and complex tasks like search and rescue during disaster response.
A few of the projects are highlighted below:
Matthias Scheutz, Linda Tickle-Degnen, Tufts University; Ronald Arkin, Georgia Institute of Technology
This research will assist people with Parkinson's disease (PD). Those afflicted with PD often experience facial masking: a reduced ability to signal emotion, pain, personality and intentions to caregivers and health care providers. These observers often misinterpret the lack of emotional expression as disinterest or an inability to adhere to a treatment regimen, resulting in stigmatization. This project will develop a robotic architecture endowed with moral emotional control mechanisms, abstract moral reasoning and a theory of mind to allow co-robots to be sensitive to human affective and ethical demands. The long-term goal of this work is to develop co-robot mediators for people with facial masking due to PD.
Sonia Chernova, Worcester Polytechnic Institute; Andrea Thomaz, Georgia Institute of Technology
This work seeks to leverage cloud computing to enable robots to efficiently learn from remote human domain experts. The project builds on RobotsFor.Me, a remote robotics research lab, and will unite learning from demonstration and cloud robotics to enable anyone with Internet access to teach a robot household tasks.
Gaurav Sukhatme, University of Southern California
Combining scientists' specialized knowledge and experience with the efficiency of autonomous systems capable of processing and evaluating large quantities of data is a powerful method for scientific discovery. This research leverages these complementary strengths to develop a collaborative system capable of guiding scientific exploration and data collection by integrating input from scientists into an autonomous learning and planning framework. The project team is validating the approach in the challenging domain of autonomous underwater ocean monitoring, which is particularly well suited to testing human-robot collaboration because of the limited communication available underwater and the resulting need for supervisory control.