Eric Hopton for redOrbit.com – Your Universe Online
We may not have to wait as long as we thought to see those much-vaunted delivery drones Googling their way down our streets – all because we can now see them thinking. Thanks to breakthrough work by researchers at MIT, the deployment of autonomous vehicles, flying cars, and even fire-fighting drones in real-life situations may happen much sooner than previously expected. The scientists have developed a way to view and understand just what happens as a robot tries to make decisions.
Understandably, official bodies like the Federal Aviation Administration (FAA) have restricted real-world testing of robots such as autonomous vehicles, quadrotors, and drones. A bad decision, even by a robot, can end in a serious accident. After struggling to explain to visitors and observers at MIT how multiple robots interact and make decisions, the researchers realized they had to take an entirely new approach if they were ever to convince the FAA and others that the robots were safe.
If they couldn’t take the robots outside to test, why not bring the world inside, thought the MIT team. In the dimly lit, hangar-like Building 41 at MIT, Roomba-like robots are being put through their paces to test the new system, known as MVR, or “measurable virtual reality.” MVR is a spin on conventional virtual reality designed to visualize a robot’s “perceptions and understanding of the world,” and is the brainchild of Ali-akbar Agha-mohammadi, a postdoc in MIT’s Aerospace Controls Lab, and Shayegan Omidshafiei, a graduate student. The MIT pair and their colleagues, including Jonathan How, professor of aeronautics and astronautics, will present details of the visualization system at the American Institute of Aeronautics and Astronautics’ SciTech conference in January. MIT’s work is supported by Boeing.
In one Building 41 simulation, a robot demonstrates how MVR works. Its task is to get to the other side of the room, but to do that it has to avoid an obstacle in the shape of a human pedestrian moving around in its path. Thanks to MVR, the robot’s decision-making process can be visualized as its “thoughts” are projected on the ground. As the pedestrian moves, it is tracked by a large pink dot on the ground. The dot represents the robot’s perception of the pedestrian’s spatial position. Meanwhile, several different colored lines radiate across the room, each signifying one possible route for the robot. A green line represents what the robot sees as the optimal route, avoiding collision with the pedestrian.
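The article does not spell out the planner's internals, but the choice it visualizes (several candidate routes, with the best safe one drawn in green) can be sketched roughly. In this hypothetical Python sketch, each route is a list of waypoints, the pink dot is the robot's estimate of the pedestrian's position, and the planner picks the shortest candidate that keeps a clearance margin:

```python
import math

def min_distance_to_point(path, point):
    # Simplified: checks only the waypoints, not the full segments between them.
    return min(math.dist(wp, point) for wp in path)

def path_length(path):
    return sum(math.dist(a, b) for a, b in zip(path, path[1:]))

def pick_route(candidate_paths, pedestrian_pos, clearance=1.0):
    """Return the shortest candidate path that stays at least `clearance`
    metres from the estimated pedestrian position (the pink dot)."""
    safe = [p for p in candidate_paths
            if min_distance_to_point(p, pedestrian_pos) >= clearance]
    if not safe:
        return None  # no safe route: stop and wait to re-plan
    return min(safe, key=path_length)

# Hypothetical candidate routes across a 10 m room:
routes = [
    [(0, 0), (5, 0), (10, 0)],    # straight line
    [(0, 0), (5, 2), (10, 0)],    # detour above
    [(0, 0), (5, -2), (10, 0)],   # detour below
]
best = pick_route(routes, pedestrian_pos=(5, 0.5))  # the "green line"
```

With the pedestrian near the middle of the room, the straight route is rejected and one of the detours becomes the green line. The clearance value and route shapes here are illustrative only.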
This new visualization system combines ceiling-mounted projectors with motion-capture technology and animation software to project a robot’s intentions in real time. We really can see the robot thinking – in color too!
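To draw those thoughts on the floor, a system like this needs a calibrated mapping from motion-capture coordinates (metres on the floor) to projector pixels. The article does not describe MIT's actual pipeline; a minimal sketch of one common approach, a planar homography, with purely illustrative matrix values that a real rig would obtain by calibration:

```python
# Illustrative 3x3 floor-to-projector homography (row-major):
# roughly 100 pixels per metre, centred on a 1280x720 image.
H = [[100.0,   0.0, 640.0],
     [  0.0, 100.0, 360.0],
     [  0.0,   0.0,   1.0]]

def world_to_pixels(x_m, y_m):
    """Apply the homography to a floor point and dehomogenise."""
    u = H[0][0] * x_m + H[0][1] * y_m + H[0][2]
    v = H[1][0] * x_m + H[1][1] * y_m + H[1][2]
    w = H[2][0] * x_m + H[2][1] * y_m + H[2][2]
    return u / w, v / w

# e.g. draw the pink dot for a pedestrian estimated at (1.5 m, -0.5 m):
px, py = world_to_pixels(1.5, -0.5)
```

A full homography (rather than a plain scale-and-offset) also absorbs the keystone distortion of a ceiling-mounted projector aimed at the floor.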
According to Agha-mohammadi, seeing a robot’s decision-making process will help developers fix faulty algorithms much faster. “For example,” he says, “if we fly a quadrotor, and see something go wrong in its mind, we can terminate the code before it hits the wall, or breaks.”
His colleague Shayegan Omidshafiei adds, “Traditionally, physical and simulation systems were disjointed. You would have to go to the lowest level of your code, break it down, and try to figure out where the issues were coming from. Now we have the capability to show low-level information in a physical manner, so you don’t have to go deep into your code, or restructure your vision of how your algorithm works. You could see applications where you might cut down a whole month of work into a few days.”
There are many potential applications for this technology, and in one study the scientists have been testing drones they hope can be used in fighting forest fires. In order to work in real life, the drones will need to observe and understand a fire’s effect on a range of different vegetation. They will then need to pick out which areas of fire are most likely to spread, and which to put out first.
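That triage step, ranking observed fire patches by how likely they are to spread, could look something like the following hypothetical sketch. The vegetation classes, weights, and scoring rule are invented for illustration and are not from the MIT study:

```python
# Illustrative flammability weights per vegetation type (not real data).
FLAMMABILITY = {"dry_grass": 0.9, "shrub": 0.6, "hardwood": 0.3}

def spread_risk(patch):
    """Higher score means the fire is more likely to spread."""
    return patch["intensity"] * FLAMMABILITY[patch["vegetation"]]

def triage(patches):
    """Order observed fire patches by descending spread risk,
    i.e. the ones a fire-fighting drone should put out first."""
    return sorted(patches, key=spread_risk, reverse=True)

observed = [
    {"id": "A", "vegetation": "hardwood",  "intensity": 0.8},
    {"id": "B", "vegetation": "dry_grass", "intensity": 0.5},
    {"id": "C", "vegetation": "shrub",     "intensity": 0.9},
]
order = [p["id"] for p in triage(observed)]
```

In a trained system the weights would come from the imagery the quadrotors collect, rather than being hand-set as here.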
For this test, the researchers projected landscapes onto the floor of the hangar to simulate an outdoor environment. They then flew “physical quadrotors over projections of forests, shown from an aerial perspective to simulate a drone’s view, as if it were flying over treetops.” Images of fire were projected on various parts of the landscape, and the quadrotors were instructed to take images of the terrain that may eventually be used to “teach” the robots how to recognize signs of particularly dangerous fires.
The scientists now plan to use many more simulated environments, including using the MVR system to test drone performance in package-delivery scenarios by simulating urban environments and creating street-view projections of cities.
This kind of faster prototyping and more realistic testing in simulated environments should speed up development and regulatory approval.