
Disney Research Develops Tactile Touch Displays

October 7, 2013
Image Credit: Disney Research

Michael Harper for redOrbit.com – Your Universe Online

Disney researchers in Pittsburgh have developed a way to convey a tactile, 3D sense of touch on flat surfaces. In the same way a person can pass a finger over a 3D topographic map and feel the peaks and valleys against their skin, this new advancement could make the same interaction possible on touchscreen devices. Using haptic feedback and a clever algorithm, the researchers can adjust the amount of friction the skin feels as it passes over certain elements on the screen. The algorithm also works in real time, meaning specially built devices could give the visually impaired a sense of what objects are around them at any given point. The researchers plan to present their work at the ACM Symposium on User Interface Software and Technology next week in St. Andrews, Scotland.

Essentially, the new technology assigns corresponding levels of tactile feedback to images and videos. The algorithm determines how much feedback should be associated with the bumps, curves and ridges of a given object; the haptic display then generates that feedback and delivers it to the finger.
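
The announcement does not spell out the exact computation, but the basic idea can be sketched as follows: treat the visual content as a virtual height map and render the strongest feedback where that surface changes most steeply, which is exactly where the bumps, curves and ridges are. The short Python sketch below illustrates that idea; the function name, the gain parameter and the normalization are assumptions made for the example, not Disney's actual implementation.

import numpy as np

def feedback_levels(height_map, gain=1.0):
    """Map a 2D height map (for example, one derived from an image or a
    depth reading) to per-pixel tactile feedback intensities in [0, 1].
    Feedback is strongest where the virtual surface changes most steeply,
    so edges and ridges stand out while flat regions stay quiet."""
    gy, gx = np.gradient(height_map.astype(float))  # surface slope in y and x
    steepness = np.hypot(gx, gy)
    levels = gain * steepness
    peak = levels.max()
    return levels / peak if peak > 0 else levels

# Example: a single round "bump" produces a ring of strong feedback at its edge.
yy, xx = np.mgrid[0:64, 0:64]
bump = np.exp(-((xx - 32.0) ** 2 + (yy - 32.0) ** 2) / 100.0)
levels = feedback_levels(bump)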

“Our brain perceives the 3D bump on a surface mostly from information that it receives via skin stretching,” said Ivan Poupyrev, director of the Interaction Group at Disney Research, Pittsburgh.

“Therefore, if we can artificially stretch skin on a finger as it slides on the touch screen, the brain will be fooled into thinking an actual physical bump is on a touch screen even though the touch surface is completely smooth.”
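
That intuition can be expressed as a simple friction rule: when the finger slides "uphill" on the virtual surface the display adds resistance, and when it slides "downhill" the resistance eases, so the skin is stretched the way a real bump would stretch it. The sketch below is an illustration of that rule under stated assumptions; the pixel-indexed position, the velocity convention and the proportionality to pressing force are choices made for the example rather than the researchers' published model.

import numpy as np

def skin_stretch_force(height_map, pos, velocity, normal_force):
    """Estimate the lateral force to render as a finger slides across a
    virtual surface. pos is a (row, col) pixel position, velocity a
    (vy, vx) pair; only the direction of motion matters. A positive
    value resists the finger (sliding uphill), a negative one assists
    it (sliding downhill), which the skin reads as a bump."""
    gy, gx = np.gradient(height_map.astype(float))
    r, c = int(pos[0]), int(pos[1])
    v = np.asarray(velocity, dtype=float)
    speed = np.linalg.norm(v)
    if speed == 0.0:
        return 0.0  # no sliding, hence no skin stretch to simulate
    v /= speed
    slope_along_motion = gy[r, c] * v[0] + gx[r, c] * v[1]
    return normal_force * slope_along_motion

# Sliding rightward up the left flank of a centered bump meets resistance.
yy, xx = np.mgrid[0:64, 0:64]
bump = np.exp(-((xx - 32.0) ** 2 + (yy - 32.0) ** 2) / 100.0)
print(skin_stretch_force(bump, pos=(32, 20), velocity=(0.0, 1.0), normal_force=0.5))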

In a demonstration video, researchers are shown passing their fingers over a topographic map, presumably feeling the ridges of the mountains as their fingers glide over the image. The technology also works in real time: it can determine how an object should feel and pass that information along to the haptic screen. In the video, the researchers point a tablet at an array of apples, a pineapple, and even a fire extinguisher, feeling the natural curves and ridges of these objects.

“The traditional approach to tactile feedback is to have a library of canned effects that are played back whenever a particular interaction occurs,” explained lead researcher Ali Israr in the press release.

“This makes it difficult to create a tactile feedback for dynamic visual content, where the sizes and orientation of features constantly change. With our algorithm we do not have one or two effects, but a set of controls that make it possible to tune tactile effects to a specific visual artifact on the fly.”
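
To make the contrast with canned effects concrete, one way such a parameterized control set could look in code is sketched below. The parameter names and the tuning rule are hypothetical; they only illustrate the idea of re-scaling a single tactile profile to whatever feature is currently on screen rather than replaying a fixed recording.

from dataclasses import dataclass

@dataclass
class TactileControls:
    """A hypothetical bundle of tuning knobs for one tactile effect."""
    gain: float = 1.0           # overall strength of the rendered friction
    sharpness: float = 1.0      # how abruptly friction rises at a feature's edge
    feature_scale: float = 1.0  # spatial size the effect is tuned to, in pixels

def tune_to_feature(base: TactileControls, width_px: float, contrast: float) -> TactileControls:
    """Re-tune a base effect to a visual feature measured on the fly,
    instead of looking up a pre-recorded effect from a library."""
    return TactileControls(
        gain=base.gain * contrast,
        sharpness=base.sharpness / max(width_px, 1.0),
        feature_scale=width_px,
    )

# The same base effect adapts to a thin, high-contrast edge or a wide, soft slope.
edge = tune_to_feature(TactileControls(), width_px=4.0, contrast=0.9)
slope = tune_to_feature(TactileControls(), width_px=40.0, contrast=0.3)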

As mobile devices become smarter and are used in more situations, people with disabilities and impairments are beginning to rely on them in their daily lives. This kind of technology could not only increase the usefulness of these devices, but also improve the quality of life of those who need a little help every day. Tablets with this technology built in, for instance, could help the visually impaired navigate a room and better understand their immediate environment.

“Touch interaction has become the standard for smartphones, tablets and even desktop computers, so designing algorithms that can convert the visual content into believable tactile sensations has immense potential for enriching the user experience,” said Poupyrev.

“We believe our algorithm will make it possible to render rich tactile information over visual content and that this will lead to new applications for tactile displays.”
