Robotic Firefighter Creates 3D Map While Searching For Survivors
redOrbit Staff & Wire Reports – Your Universe Online
Researchers at the University of California, San Diego (UCSD), have developed new image processing techniques that allow small Segway-like robotic vehicles to create 3D thermal images of fires to assist rescuers.
The mobile robots are equipped with state-of-the-art on-board software that takes the thermal data recorded by each robot's small infrared camera and maps it onto a 3D scene reconstructed from images taken by a pair of stereo RGB cameras.
This allows the robotic vehicles to create a virtual picture that includes a 3D map and temperature data that can be immediately used by firefighters and other first responders as the robot makes its way through a burning structure.
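The core idea, fusing a thermal reading with a depth estimate for each pixel, can be sketched in a few lines. The following is a minimal illustration assuming an idealized pinhole stereo model with known focal length and baseline, and a thermal image already registered to the RGB view; the function names and parameters are illustrative, not the UCSD team's actual software.

```python
def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Stereo depth from disparity: Z = f * B / d."""
    if disparity_px <= 0:
        return None  # no valid stereo match at this pixel
    return focal_px * baseline_m / disparity_px

def backproject(u, v, depth_m, focal_px, cx, cy):
    """Pixel (u, v) at depth Z -> 3D point (X, Y, Z) in the camera frame."""
    x = (u - cx) * depth_m / focal_px
    y = (v - cy) * depth_m / focal_px
    return (x, y, depth_m)

def fuse_thermal(disparity_map, thermal_map, focal_px, baseline_m, cx, cy):
    """Attach the thermal reading at each pixel to its reconstructed
    3D point, producing a list of (X, Y, Z, temperature) tuples."""
    cloud = []
    for v, row in enumerate(disparity_map):
        for u, d in enumerate(row):
            z = disparity_to_depth(d, focal_px, baseline_m)
            if z is None:
                continue  # skip pixels with no depth estimate
            x, y, z = backproject(u, v, z, focal_px, cx, cy)
            cloud.append((x, y, z, thermal_map[v][u]))
    return cloud
```

A real system would compute the disparity map with a stereo matching algorithm and transform the per-frame point clouds into a common map frame as the robot moves, but the fusion step itself reduces to this kind of per-pixel back-projection.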
The research is part of an initiative to develop new robotic scouts that can assist firefighters in residential and commercial blazes.
The robotic vehicles will map and photograph the interior of burning buildings using stereo vision, and will use data gathered from various sensors to characterize the state of a fire, including its temperature, the presence of volatile gases, and the building's structural integrity, all while searching for survivors.
A number of these robotic vehicles working together would quickly develop an accurate, real-time virtual picture of the building's interior for rescuers, who would then use the data to better plan their firefighting and rescue activities.
Thomas Bewley, a professor of mechanical engineering at the Jacobs School of Engineering at UC San Diego, and his dynamics and control team have already built the first prototype of the robotic vehicle, which resembles a self-righting Segway that can climb stairs.
“These robot scouts will be small, inexpensive, agile, and autonomous,” Bewley said.
“Firefighters arriving at the scene of a fire have 1000 things to do. To be useful, the robotic scouts need to work like well-trained hunting dogs, dispatching quickly and working together to achieve complex goals while making all necessary low-level decisions themselves along the way to get the job done.”
The researchers will present their work at the International Conference on Robotics and Automation next year in Hong Kong.