Solution To Reducing Driver Distraction Likely Resides In Combining A Variety Of Driver Interfaces To Fit Specific Tasks
Integrating Vehicle’s Talk Button with Touch Screen Should Improve Overall Performance
DALLAS, June 3, 2013 /PRNewswire/ — New federal guidelines developed to minimize visual and manual distractions of drivers on the dashboard are beginning to validate the need for interactive voice commands for performing tasks secondary to driving.
More immediate reductions in distractions are likely to result, however, by incorporating voice (or speech) technology with a mix of audio, visual (including heads-up displays), manual, haptic (vibrations), and augmented reality interfaces, according to connected vehicle services provider Agero. Agero provides private-labeled and cloud-based infotainment, navigation, and safety services to multiple automotive brands through a variety of in-vehicle and off-board interfaces with vehicle owners.
Thomas Schalk, Agero’s vice president of Voice Technology, says finding the right combination of interdependent interfaces is where the cutting edge of in-vehicle, human-machine-interface (HMI) research is leading.
“Reducing distraction will require matching the right blend of natural interfaces that can successfully and quickly perform specific, independent actions–such as task selection, list management, entering text strings, understanding warnings, interrupting or pausing a task, resuming a task, and completing a task–which are required to perform a growing assortment of in-vehicle, non-driving tasks,” said Schalk.
While a growing body of research points to the importance of interactive speech systems for keeping drivers’ eyes on the road and hands on the wheel, the research also reveals the need to avoid voice menus and minimize the amount of speech interaction required of drivers. Both tend to extend the duration of non-driving tasks, thereby increasing the risk of driver distraction.
A common problem with in-vehicle speech-only interfaces is that a driver often doesn’t know what to say in response to the talk button’s “please say a command” voice prompt, confusing the speech system as it listens for a response. Unexpected sounds within the vehicle during this listening mode can also confound the system. Both issues can trigger seemingly inaccurate results, generating driver frustration, which in turn can lead to driver distraction or early abandonment of the system. One promising approach to overcoming these shortcomings is integrating the vehicle’s talk button (commonly found on the steering wheel) with the vehicle’s touch screen, providing the driver with a simple, more instructive Tap-or-Say prompt.
“Verbally coaching the driver what to say–a speech system’s common response when an error occurs–extends the duration time to complete the task, thereby increasing the potential for driver frustration, leading to distraction,” said Schalk. “With the Tap-or-Say command, the user instinctively glances and taps from a list of results displayed on a touch screen without the need to contemplate a spoken response. No extra prompting and no extra dialog steps are required, dramatically reducing the task completion time and the risk of distraction.”
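The single-step resolution Schalk describes can be illustrated with a minimal sketch. The class name, methods, and naive label matching below are purely illustrative assumptions, not Agero’s implementation; the point is that either a tap or a spoken label resolves the task in one dialog step, with no coaching prompts.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TapOrSayPrompt:
    """Illustrative sketch of a Tap-or-Say prompt: the driver may either
    tap a result on the touch screen or speak its label; either input
    path resolves the task in a single dialog step."""
    results: List[str]   # candidate results displayed on the touch screen
    steps: int = 0       # dialog steps taken so far

    def tap(self, index: int) -> Optional[str]:
        """Driver taps a displayed result; one step, no extra prompting."""
        self.steps += 1
        if 0 <= index < len(self.results):
            return self.results[index]
        return None

    def say(self, utterance: str) -> Optional[str]:
        """Driver speaks a displayed label; a trivial string match stands
        in for a real speech recognizer."""
        self.steps += 1
        for label in self.results:
            if utterance.strip().lower() == label.lower():
                return label
        return None

prompt = TapOrSayPrompt(["Home", "Work", "Airport"])
choice_by_tap = prompt.tap(1)        # single tap completes the selection
choice_by_voice = prompt.say("airport")
```

Because the on-screen list doubles as the speech grammar, the driver never has to guess what to say, which is the mechanism behind the reduced task completion time claimed above.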
Schalk says interactive speech will remain a critical interface in the moving vehicle, primarily as a substitute for typing text.
The guidelines issued by the National Highway Traffic Safety Administration (NHTSA) offer specifics such as ensuring in-vehicle infotainment and communications systems do not divert drivers’ attention away from the roadway for more than two seconds at a time, or 12 seconds in total. Schalk expands upon this and identifies an emerging list of best practices for the next generation of automotive user interfaces:
- Maximize simplicity
- Maximize interruptibility
- Minimize the number of task steps
- Minimize the number of menu layers
- Restrict manual text entry
- Minimize incoming messages
- Minimize verbosity
- Remove the need for learning mode
- Minimize glance duration
- Minimize glance frequency
- Minimize task completion time
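The NHTSA glance limits cited above (no more than two seconds per glance, 12 seconds in total) lend themselves to a simple check. The function below is a hedged sketch of such a check; the function name and parameters are assumptions for illustration, not part of the NHTSA guidelines or any Agero tool.

```python
def within_nhtsa_glance_limits(glances, per_glance_limit=2.0, total_limit=12.0):
    """Check a task's eyes-off-road glance durations (in seconds) against
    the NHTSA guideline limits: no single glance longer than 2 seconds,
    and no more than 12 seconds of total glance time per task."""
    return (all(g <= per_glance_limit for g in glances)
            and sum(glances) <= total_limit)

# A navigation task with three short glances passes both limits:
ok = within_nhtsa_glance_limits([1.2, 0.8, 1.5])
# One 2.5-second glance fails the per-glance limit:
too_long = within_nhtsa_glance_limits([2.5])
```

A check like this makes the best practices above measurable: minimizing glance duration targets the per-glance limit, while minimizing glance frequency and task completion time targets the total budget.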
A key aspect of rethinking the interaction between cars and people (the HMI) is realizing that not all tasks are created equal–quick retrieval of navigation information, for example, differs sharply from exploring music choices. Likewise, driver focus varies widely by age, driving experience, and behavior.
Agero Connected Services (ACS) is a leading provider of private-labeled, connected vehicle services for the automotive, insurance, and aftermarket industries. Based in the Dallas, Texas, area, ACS launched the connected car market over 15 years ago. ACS is a division of Medford, Mass.-based Agero, Inc., the leader in roadside assistance, claims management, and emergency services in the automotive and insurance industries. For more information, visit www.agero.com.