This paper describes an alternative form of interaction for mobile devices using crossmodal output. Crossmodal displays allow alternative senses, such as hearing and touch, to be used to perceive information normally presented to the visual modality. Initial experiments show that roughness and spatial location can be perceived as equivalent in both the auditory and tactile domains. This paper discusses how crossmodal displays can be constructed using the results from these experiments and the benefits they bring to mobile human-computer interfaces.
Eve E. Hoggan, Stephen A. Brewster