Cagatay Basdogan, Mitchell J. H. Lum, Jose Salcedo

We have developed a multi-modal virtual environment by fusing visual and haptic images through a new autostereoscopic display and a force-feedback haptic device. Most earlier visualization systems that integrate stereo vision and touch have relied on polarized or shutter glasses for stereo viewing. In this paper, we describe the development stages and components of our set-up, which allows a user to touch, feel, and manipulate virtual objects through a haptic device while viewing them in stereo without any special eyewear. We also discuss the transformations involved in mapping the absolute coordinates of virtual objects into the visual and haptic workspaces, and the synchronization of cursor movements across these workspaces. Future applications of this work include a) multi-modal visualization of planetary data and b) planning of space-mission operations in virtual environments.

Set-Up

Our set-up is designed to create a multi-modal virtual environment that integr...
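The abstract mentions mapping the absolute coordinates of virtual objects into the visual and haptic workspaces. The paper's actual calibration values and transform chain are not given here; the following is only a minimal sketch of the general idea, assuming each workspace mapping is a uniform scale followed by a translation expressed as a 4x4 homogeneous matrix, with hypothetical calibration numbers.

```python
import numpy as np

def make_workspace_transform(scale, translation):
    """Build a 4x4 homogeneous transform: uniform scale, then translate.

    'scale' and 'translation' stand in for per-device calibration values;
    the real set-up would determine these from the display and haptic
    device geometry.
    """
    T = np.eye(4)
    T[:3, :3] *= scale
    T[:3, 3] = translation
    return T

def map_point(T, p):
    """Map a 3D point through a homogeneous transform."""
    ph = np.append(p, 1.0)          # lift to homogeneous coordinates
    return (T @ ph)[:3]             # drop the homogeneous component

# Hypothetical calibration: shrink into the visual workspace behind the
# screen plane, and enlarge into the haptic device's workspace.
world_to_visual = make_workspace_transform(0.5, np.array([0.0, 0.0, -0.2]))
world_to_haptic = make_workspace_transform(2.0, np.array([0.05, 0.0, 0.0]))

# A single world-space point maps to both workspaces; keeping both
# mappings driven by the same point is what keeps the graphical and
# haptic cursors synchronized.
p_world = np.array([0.1, 0.2, 0.3])
p_visual = map_point(world_to_visual, p_world)   # [0.05, 0.1, -0.05]
p_haptic = map_point(world_to_haptic, p_world)   # [0.25, 0.4, 0.6]
```

In a set-up like this, updating both cursors from one shared world-space position each frame is a common way to keep the visual and haptic representations aligned.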