Auditory display research for driving has mainly focused on collision warning signals, and recent studies on auditory in-vehicle information presentation have examined only a limited range of tasks (e.g., cell phone operation tasks or verbal tasks such as reading digit strings). The present study used a dual-task paradigm to evaluate a plausible scenario in which users navigated a song list. We applied enhanced auditory menu navigation cues, including spearcons (i.e., compressed speech) and a spindex (i.e., a speech index that used brief audio cues to communicate the user's position in a long menu list). Twenty-four undergraduates navigated through an alphabetized list of 150 song titles—rendered as an auditory menu—while they concurrently played a simple, perceptual-motor, ball-catching game. The menu was presented with text-to-speech (TTS) alone, TTS plus one of three types of enhanced auditory cues, or no sound at all. Both performance of the primary task (success rate of ...
Myounghoon Jeon, Benjamin K. Davison, Michael A. N