We have implemented a computer interface that renders synchronized auditory and haptic stimuli with very low (0.5 ms) latency. The audio and haptic interface (AHI) includes a Pantograph haptic device that reads position input from the user and renders force output based on this input. We synthesize audio by convolving the force profile generated by user interaction with the impulse response of the virtual surface. The auditory and haptic modes are tightly coupled because we produce both stimuli from the same force profile. We conducted a user study with the AHI to verify that the 0.5 ms system latency lies below the perceptual threshold for detecting separation between auditory and haptic contact events. We discuss future applications of the AHI for further perceptual studies and for synthesizing continuous contact interactions in virtual environments.
Derek DiFilippo, Dinesh K. Pai
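To illustrate the synthesis step the abstract describes (convolving the contact force profile with the impulse response of the virtual surface), the following is a minimal Python sketch, not the authors' implementation. The damped-sinusoid impulse response, the half-sine contact pulse, the 44.1 kHz sample rate, and the function name synthesize_contact_audio are all illustrative assumptions, not details from the paper.

    # Minimal sketch of convolution-based contact-audio synthesis
    # (illustrative assumptions only; not the AHI implementation).
    import numpy as np

    def synthesize_contact_audio(force_profile, impulse_response):
        """Linearly convolve the contact force profile with the surface
        impulse response to obtain the synthesized audio signal."""
        return np.convolve(force_profile, impulse_response)

    fs = 44100                            # assumed audio sample rate (Hz)
    t = np.arange(0, 0.05, 1.0 / fs)      # 50 ms of samples

    # Stand-in surface impulse response: one exponentially decaying mode.
    impulse_response = np.exp(-80.0 * t) * np.sin(2 * np.pi * 1200.0 * t)

    # Stand-in force profile: a brief (2 ms) half-sine contact pulse.
    n = int(0.002 * fs)
    force_profile = np.zeros_like(t)
    force_profile[:n] = np.sin(np.pi * np.arange(n) / n)

    audio = synthesize_contact_audio(force_profile, impulse_response)

Because both the haptic force output and the audio are derived from the same force profile, a sketch like this makes the tight coupling between the two modes concrete: any change in how the user strikes or scrapes the surface changes force_profile, and hence the sound, in lockstep.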