Robot Aircraft Teach Themselves Which Way Is Up
December 9, 2011

Australian vision scientists today unveiled a novel way to help pilotless aircraft accurately determine their heading and orientation relative to the ground: by imitating how insects do it.

The technology can improve the navigation, flight characteristics and safety of civil and military aircraft, as well as pilotless drones, says Mr Richard Moore, a researcher at The Vision Centre and The Queensland Brain Institute at The University of Queensland.

“UAVs (unmanned aerial vehicles or pilotless aircraft) are used in crop dusting, bushfire monitoring, tracking algal blooms or crop growth and infrastructure inspection as well as defense roles,” he says. “Some of these tasks require the aircraft to fly close to the ground and amongst obstacles, so it is crucial that the aircraft knows its heading direction and roll and pitch angles accurately.”

While there are other sensors such as magnetometers, gyroscopes, and accelerometers that can help the aircraft determine its heading and orientation, they suffer from problems such as noise-induced drift, and can be adversely affected by the motions of the aircraft or materials in the environment surrounding the sensors, Mr Moore explains.

“This means that UAVs can't perform significant maneuvers without losing their sense of direction for a while.”

To provide real-time guidance for UAVs, the researchers have designed a vision-based system that provides aircraft with the same advantage that insects have — a stable image of the sky and the horizon.

“If you watch a flying insect, you will see that their heads are upright when they turn their bodies,” Mr Moore says. “Keeping their heads still allows them to have a stabilized image of the horizon and the sky, which is crucial in determining their heading.”

In the new system, the aircraft uses two back-to-back fisheye lenses to capture a wide-angle view of the environment. It then divides the image into the sky and ground regions using information such as the brightness or color combinations. The orientation of the sky and ground regions allows the aircraft to determine its roll and pitch angles with respect to the horizon.
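The attitude-from-segmentation step can be illustrated with a minimal sketch. The function below is not the researchers' implementation; it assumes each sampled pixel comes with a unit viewing direction in the aircraft body frame (x forward, y right, z down) and a sky/ground label, and recovers roll and pitch from the mean "up" direction implied by the two regions.

```python
import math

def estimate_roll_pitch(directions, is_sky):
    """Estimate roll and pitch from a labelled wide-angle view.

    directions: list of (x, y, z) unit vectors in the body frame, one per
                sampled pixel (x forward, y right, z down).
    is_sky:     list of booleans, True where the pixel was classed as sky.

    The mean sky direction minus the mean ground direction points roughly
    "up" in the body frame; roll and pitch follow from that vector.
    """
    sx = sy = sz = 0.0
    for (x, y, z), sky in zip(directions, is_sky):
        s = 1.0 if sky else -1.0          # sky pixels pull "up", ground pushes
        sx += s * x; sy += s * y; sz += s * z
    n = math.sqrt(sx * sx + sy * sy + sz * sz)
    ux, uy, uz = sx / n, sy / n, sz / n   # unit "up" vector in the body frame
    # With z pointing down, level flight gives up = (0, 0, -1).
    pitch = math.asin(ux)                 # nose-up tilts "up" toward +x
    roll = math.atan2(-uy, -uz)           # right-wing-down tilts "up" toward -y
    return roll, pitch
```

In level flight the recovered "up" vector is (0, 0, -1), so both angles come out zero; any imbalance between the sky and ground regions tilts the vector and yields the corresponding roll and pitch.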

“Using its estimated orientation, the aircraft can then generate a panoramic image of the horizon, and use it as a reference,” Mr Moore explains. “The aircraft can then determine its heading direction continuously throughout the flight by producing an instantaneous horizon panorama and comparing it with the reference image.”
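The comparison step can be sketched as a simple circular matching problem. This is an illustrative stand-in for the visual compass described above, not the paper's algorithm; it assumes the horizon panorama has been reduced to a 1-D strip of brightness values sampled at evenly spaced bearings.

```python
def heading_offset(reference, current):
    """Estimate heading change by circularly shifting the current horizon
    strip against the reference and picking the best match.

    reference, current: equal-length lists of brightness values sampled at
    evenly spaced bearings around the horizon.
    Returns the shift (in samples) that best aligns current with reference;
    multiply by 360 / len(reference) to convert to degrees.
    """
    n = len(reference)
    best_shift, best_err = 0, float("inf")
    for shift in range(n):
        # Sum-of-squared-differences at this circular shift.
        err = sum((reference[i] - current[(i + shift) % n]) ** 2 for i in range(n))
        if err < best_err:
            best_shift, best_err = shift, err
    return best_shift
```

Because the match is against a stored reference rather than the previous frame, small per-frame errors do not accumulate, which is the key advantage over integrating a gyroscope.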

Although a similar vision-based approach has been proposed previously, this new system improves visual performance by enabling the aircraft to learn directions by itself.

“This system doesn't need any programming before take-off, unlike earlier ones that required lots of offline training: researchers had to manually compute the differences between the sky and the ground, then feed them into the system.

“With the new system, we only have to tell the aircraft that it's in the upright position when it starts flying. It will then use that as a starting point to work out which is sky and which is ground, and train itself to recognize the differences.

“This is important because if the aircraft relies solely on the prior training, it will be in trouble once it's in an unfamiliar environment. The self-learning ability allows the system to keep a record of what it 'sees', update its reference base continuously and be adaptive.”
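The self-training idea described in the quotes above can be sketched as a nearest-mean classifier that seeds itself from the initial upright assumption and then keeps adapting. The class name, learning rate and brightness-only feature are assumptions for illustration, not details from the paper.

```python
class SkyGroundLearner:
    """Self-training sky/ground classifier sketch.

    At start-up the aircraft is assumed upright, so pixels above the
    geometric horizon seed the 'sky' class and pixels below seed 'ground'.
    Each new pixel is then classified by the nearest class mean and folded
    back into that mean, so the model adapts as the scenery changes.
    """

    def __init__(self, alpha=0.05):
        self.alpha = alpha        # learning rate for the running class means
        self.sky_mean = None
        self.ground_mean = None

    def seed(self, upper_pixels, lower_pixels):
        # Initial upright assumption: top half = sky, bottom half = ground.
        self.sky_mean = sum(upper_pixels) / len(upper_pixels)
        self.ground_mean = sum(lower_pixels) / len(lower_pixels)

    def classify_and_update(self, value):
        # Nearest-mean classification, then an exponential moving-average
        # update of the winning class, so the reference base stays current.
        is_sky = abs(value - self.sky_mean) < abs(value - self.ground_mean)
        if is_sky:
            self.sky_mean += self.alpha * (value - self.sky_mean)
        else:
            self.ground_mean += self.alpha * (value - self.ground_mean)
        return is_sky
```

The continual update is what lets the system cope with unfamiliar environments: if the ground gradually darkens or the sky changes color, the class means drift with it rather than staying pinned to the take-off conditions.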

The group performed a closed-loop flight test of the new system, in which the aircraft was commanded to perform a series of 90-degree turns over four minutes.

“The tests indicate that the aircraft can estimate its heading much more accurately with a visual compass, compared to other navigation systems like magnetic compasses and gyroscopes,” says Mr Moore.

“The ability to estimate the precise roll and pitch angles and the heading direction instantaneously is crucial for UAVs, as small errors can lead to misalignments and crashes.”

Mr Moore presented the paper “A method for the visual estimation and control of 3-DOF attitude for UAVs” today at the Australasian Conference on Robotics and Automation 2011.

The Vision Centre is funded by the Australian Research Council as the ARC Centre of Excellence in Vision Science.
