Google Wants You To Hear Through Your Bones Instead Of Your Ears
Michael Harper for redOrbit.com – Your Universe Online
Though the project remains “in flux,” more possible details about Google’s augmented reality/smart glasses have emerged by way of the United States Patent and Trademark Office (USPTO).
According to a recent filing, Google may plan to use “indirect bone conduction” to deliver sound from the glasses to the user’s ear. Rather than relying on a set of earbuds or other speakers, the patent application describes a system that sends vibrations through the skull, which the wearer perceives as sound.
As numerous news outlets have correctly pointed out, this technology is hardly new, though it could be put to good use in the Google Glass project, should it ever make it into the final product.
With bone conduction sound, users would be able to hear the glasses speak to them without plugging their ears, thereby allowing them to hear more of their surrounding environment. This is particularly helpful, given that these glasses will no doubt be primarily worn in urban environments, especially in their early years. As an added bonus, bone conduction sound also minimizes sound leakage, meaning those nearby won’t have to hear all the dubstep the wearer will inevitably be listening to, assuming of course that dubstep will still be a thing by then.
The newly released Google application (filed in October 2011) details a system whereby sound is transmitted through the bone at a number of contact points: areas already touched by ordinary spectacles, such as behind the ear, the bridge of the nose and the temple.
“For example,” reads the filing, “an exemplary head-mounted display may include at least one vibration transducer that is configured to vibrate at least a portion of the head-mounted display based on the audio signal.”
The patent abstract goes on: “The vibration transducer is configured such that when the head-mounted display is worn, the vibration transducer vibrates the head-mounted display without directly vibrating a wearer. However, the head-mounted display structure vibrationally couples to a bone structure of the wearer, such that vibrations from the vibration transducer may be indirectly transferred to the wearer’s bone structure.”
Google held a pair of hackathons in New York and San Francisco last week and over the weekend, inviting developers to come up with great ideas and applications for these futuristic specs.
Some of these attendees were previously able to buy an early set of the glasses to work with at last summer’s Google I/O conference, where Google also threw the glasses (strapped to a skydiver, of course) from a plane not once, but twice.
Though some of these developers are only now heading home from their Glass Foundry experience, they’ve also signed that most frustrating of documents, the Non-Disclosure Agreement (NDA), meaning they won’t be talking about any developments in the project for the foreseeable future. According to Android Community, these developers have even been given a dedicated Gmail address for information, news and communication related to the Glass project.
Just as it’s entirely possible that Google showed off bone conduction technology to its Glass developers last week, it’s also possible the tech may never make it into the project. Not long ago, Google Glass project head Babak Parviz said in an interview that many parts of the project remain “in flux.”
Even the glasses’ touted augmented reality may not make it into version one, according to Parviz. At this point, it seems a toss-up whether any given feature will be added to the glasses.