Google Announces Project Tango To Give Mobile Devices A Human-Scale Understanding Of Space And Motion
February 22, 2014

Google Dances The Tango With New Project

Peter Suciu for Your Universe Online

On Thursday Google announced Project Tango, an Android-based prototype mobile phone and developer kit that includes advanced 3D sensors. This latest project is being developed by its Advanced Technology and Projects (ATAP) hardware group, which was recently moved over from Motorola.

The project has already drawn comparisons to Microsoft’s Xbox Kinect, which can track motion, but in this case the search giant is looking to integrate this technology into a mobile device.

“The goal of Project Tango is to give mobile devices a human-scale understanding of space and motion,” Johnny Lee, the technical program lead at ATAP, posted on the official Project Tango website. “As we walk through our daily lives, we use visual cues to navigate and understand the world around us. We observe the size and shape of objects and rooms, and we learn their position and layout almost effortlessly over time. This awareness of space and motion is fundamental to the way we interact with our environment and each other. We are physical beings that live in a 3D world. Yet, our mobile devices assume that the physical world ends at the boundaries of the screen.”

Lee added that Google has spent the past year working with universities, research labs, and industrial partners – spanning some nine countries around the world – in an effort to “harvest research from the last decade of work in robotics and computer vision, concentrating that technology into a unique mobile phone.”

The goal of the project is to give mobile devices a more “human-like understanding of space and motion,” and this is being accomplished through the integration of advanced sensor fusion and computer vision. From this, the user experience could be further enhanced via 3D scanning, indoor navigation and even more immersive gaming.

The computer vision part of the handset is being enabled by a new co-processor developed by Google partner Movidius. This “Myriad 1” chip was reportedly designed from scratch to bring Kinect-like computer vision to the smartphone. Lee is a former Microsoft employee who worked on developing the Kinect technology, so he has experience in this arena, Ars Technica reports.

The Project Tango prototype runs on Google’s Android – no surprise there – but also provides development APIs that let Android apps built in Java, C/C++ and even the Unity game engine “learn data about the phone’s position, orientation and depth,” ReadWriteWeb reports.
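To give a rough sense of the kind of data such an API would report, the sketch below models a device pose as a position vector plus an orientation quaternion, and uses it to map a point seen in the device’s frame into world coordinates. The class and method names here are hypothetical illustrations, not the actual Project Tango API.

```java
// Hypothetical sketch -- NOT the real Project Tango API.
// Models the pose data (position + orientation) a motion-tracking
// API might report, and shows how an app could use it.
public class PoseSketch {

    // Rotate vector v by unit quaternion q = (w, x, y, z),
    // using the expanded form of v' = q * v * q^-1.
    static double[] rotate(double[] q, double[] v) {
        double w = q[0], x = q[1], y = q[2], z = q[3];
        return new double[] {
            (1 - 2*(y*y + z*z))*v[0] + 2*(x*y - w*z)*v[1] + 2*(x*z + w*y)*v[2],
            2*(x*y + w*z)*v[0] + (1 - 2*(x*x + z*z))*v[1] + 2*(y*z - w*x)*v[2],
            2*(x*z - w*y)*v[0] + 2*(y*z + w*x)*v[1] + (1 - 2*(x*x + y*y))*v[2]
        };
    }

    // Transform a point observed in the device frame into the world
    // frame: rotate by the device orientation, then add its position.
    static double[] deviceToWorld(double[] position, double[] orientation,
                                  double[] point) {
        double[] r = rotate(orientation, point);
        return new double[] { r[0] + position[0],
                              r[1] + position[1],
                              r[2] + position[2] };
    }

    public static void main(String[] args) {
        double[] position = {1.0, 0.0, 2.0};            // metres, world frame
        double[] yaw90 = {Math.sqrt(0.5), 0, 0, Math.sqrt(0.5)}; // 90° about z
        // A point one metre ahead of the device lands at (1, 1, 2) in world space.
        double[] p = deviceToWorld(position, yaw90, new double[] {1, 0, 0});
        System.out.printf("%.3f %.3f %.3f%n", p[0], p[1], p[2]);
    }
}
```

The idea is that by continuously updating such a pose from the phone’s sensors, an app can place virtual content in, or build a map of, the physical space around the user.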

It now appears that Google is ready to put those early prototypes of the device into the hands of developers, who in turn “can imagine the possibilities and help bring those ideas into reality.” Lee noted that Google hopes to distribute all of the available units by March 14.

According to online reports, Google is only offering 200 of the 5-inch prototypes, which also include a depth sensor and two cameras – the second of which is for motion tracking.

At present Google has 16 partners, which are recognized on the Project Tango website and include Bosch, Bsquare, CompalComm, ETH Zurich, Flyby, hiDof, MM Solutions, Movidius, NASA, Ologic, the Open Source Robotics Foundation, Paracosm and Sunny Optical Technology, along with George Washington University and the University of Minnesota.