CrowdOptic and Carnegie Mellon University partner to create Visual Breadcrumb Trails in Maps
SAN FRANCISCO, Aug. 19, 2013 /PRNewswire/ — CrowdOptic, a maker of crowd-powered mobile applications, today announced it has teamed with researchers at Carnegie Mellon University’s Silicon Valley campus who are creating technology for “smart” buildings and “smart” environments based on the Internet of Things (IoT). Carnegie Mellon Silicon Valley researchers have been developing technology that links mobile users to the “smart” environment to provide novel, targeted information and services. CrowdOptic’s focus-of-attention technology will be used to build enhanced functionality based on knowledge of what people are paying attention to. Initially, Carnegie Mellon Silicon Valley researchers will work with CrowdOptic to develop a new mapping technology that would allow users of wearable electronic devices to create and share visual breadcrumb trails in order to find and follow each other in indoor locations and in apps such as Google Maps.
Whereas most mapping programs already offer guided turn-by-turn GPS navigation, this capability would be entirely different: it would allow users to see and follow a visual trail within their line of sight through the viewer of a wearable smart device such as Google Glass.
Breadcrumbing is the process by which a GPS-enabled device collects historical location data, including route, speed, direction and stops. The data are then presented on a map as a breadcrumb trail of position markers. CrowdOptic and Carnegie Mellon Silicon Valley are working to integrate breadcrumbing into wearable devices by making the feature both visual and dynamic.
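The breadcrumbing described above can be pictured as a simple data model. The sketch below is purely illustrative and is not CrowdOptic’s or Carnegie Mellon’s actual implementation; all names, fields, and the stop-detection threshold are assumptions made for the example.

```python
# Illustrative sketch of a breadcrumb trail (hypothetical, not CrowdOptic's API).
from dataclasses import dataclass, field
from typing import List

@dataclass
class Breadcrumb:
    """One position marker in the trail."""
    lat: float        # latitude in degrees
    lon: float        # longitude in degrees
    timestamp: float  # seconds since epoch
    speed: float      # meters per second; near zero indicates a stop
    heading: float    # degrees clockwise from north (direction of travel)

@dataclass
class BreadcrumbTrail:
    """Historical location data collected by a GPS-enabled device."""
    crumbs: List[Breadcrumb] = field(default_factory=list)

    def record(self, crumb: Breadcrumb) -> None:
        self.crumbs.append(crumb)

    def stops(self, max_speed: float = 0.5) -> List[Breadcrumb]:
        """Markers where the wearer paused (speed at or below the threshold)."""
        return [c for c in self.crumbs if c.speed <= max_speed]

# Toy trail: one moving fix, then a pause thirty seconds later.
trail = BreadcrumbTrail()
trail.record(Breadcrumb(37.7749, -122.4194, 0.0, 1.4, 90.0))
trail.record(Breadcrumb(37.7750, -122.4190, 30.0, 0.0, 90.0))
```

Each `Breadcrumb` corresponds to one position marker on the map; rendering the list in timestamp order reproduces the route, and the recorded speeds and headings recover direction of travel and stops.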
“Wearable devices are going to change the way people interact with maps,” said CrowdOptic CEO Jon Fisher. “Routes and directions will need to be delivered visually to quite literally point the way, and they will need to continually adjust in real time to always reflect the safest, most efficient path.”
Dynamic factors include the wearer’s sightline, interaction with other wearers, and changing conditions of the environment, as reflected by the changing paths and behaviors of other wearers. CrowdOptic has patented the technology (U.S. Patent No. 8,527,340) that captures these dynamic shifts in where people are looking through electronic devices such as Google Glass.
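One way to picture what a sightline signal makes possible: given two devices’ positions and compass headings, simple geometry can tell whether their forward lines of sight converge on a common point. The sketch below is only an illustration of that idea on a flat local map; it is not the patented CrowdOptic algorithm, and every function name and tolerance here is an assumption for the example.

```python
# Illustrative 2D sightline-convergence check (not CrowdOptic's patented method).
import math

def heading_to_vector(deg: float) -> tuple:
    """Convert a compass heading (0 = north, 90 = east) to a unit direction."""
    rad = math.radians(deg)
    return (math.sin(rad), math.cos(rad))  # x = east, y = north

def sightlines_converge(p1, h1, p2, h2):
    """Return where two forward sightlines intersect, or None.

    p1, p2: (x, y) positions on a flat local map (e.g., meters);
    h1, h2: compass headings in degrees for each viewer.
    """
    d1 = heading_to_vector(h1)
    d2 = heading_to_vector(h2)
    # Solve p1 + t1*d1 == p2 + t2*d2 for the ray parameters t1, t2.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None  # parallel sightlines never meet
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / denom
    t2 = (dx * d1[1] - dy * d1[0]) / denom
    if t1 <= 0 or t2 <= 0:
        return None  # the intersection lies behind at least one viewer
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])
```

For example, a viewer at the origin looking east and a viewer ten units east and ten units south looking north have sightlines that meet ahead of both of them; two viewers facing the same heading do not.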
Carnegie Mellon Silicon Valley expects knowledge of users’ focus of attention to support interactions based around individual situational awareness, which will support enhanced targeting of information and services from the “smart” environment to the user.
CrowdOptic envisions eventually being able to power applications that would lead people to locations where groups are congregating, news is breaking, or friends are meeting. The technology would allow friends to share their routes with each other, especially in complex multi-building environments such as universities or shopping malls, and it would allow people to reroute themselves in response to changing environmental conditions and the paths of other Glass wearers. CrowdOptic’s permission-based “find friend” augmented reality apps, which are currently on the market and use smartphones to locate people on a map or in a stadium, will also benefit from this next generation of mapping technology.
“We are extremely privileged to be working with Carnegie Mellon Silicon Valley in this exciting area of innovation,” said Fisher. “We are immensely pleased to have partnered with one of the top technology universities in the world and a team which has distinguished itself by its unique leadership in exploring new applications of smart sensor technology.”
“We are excited by the opportunity to work with CrowdOptic to explore new applications in emergency response and in making ‘smart’ buildings smarter,” said Martin Griss, Director of CMU Silicon Valley. “The ability to precisely detect the location, orientation and history of movement of individuals afforded by this technology will enable many new context-aware services.”
“We see opportunities to use augmented reality to overlay important information from sensors, people, and building services, precisely targeted to where the individual is looking. This will significantly enhance users’ experiences, while optimizing service delivery,” said Dr. Steven Rosenberg, Associate Director of CMU Silicon Valley.
Said Brent Iadarola, Global Research Director Mobile & Wireless Communications for Frost & Sullivan:
“The combination of the wearable glasses interface and new mapping functionalities from CrowdOptic and CMU is a compelling advancement that represents a significant step in the next generation of mobile mapping applications.”
CrowdOptic is a new crowd-powered “heat” signal – the next evolution of location-based services that recognizes the hottest crowd activity in real time as it occurs, and also after the fact, through powerful analytics. By tracking where electronic devices are located and where they are pointed, CrowdOptic can instantly filter mobile media and create new opportunities for eyewitness engagement. CrowdOptic identifies, tags and rebroadcasts the live event experience to the world, on mobile, social TV and second screens. CrowdOptic is used around the world to power a wide range of apps that let users aim their phones to connect, report news, find friends, access alternate broadcasts, cast a vote and discover others who share their focus and interests. CrowdOptic is a privately held, venture-backed company based in San Francisco. www.crowdoptic.com.