Chirocentric 3D user interfaces are sometimes hailed as the “holy grail” of human-computer interaction. However, implementations of these UIs can require cumbersome devices (such as tethered wearable datagloves), offer limited functionality, or obscure the algorithms used for hand pose and gesture recognition. These limitations inhibit the design, deployment, and formal evaluation of such interfaces. To ameliorate this situation, we describe the implementation of a practical chirocentric UI platform targeted at immersive virtual environments with infrared tracking systems. Our main contributions are two machine learning techniques for the recognition of hand gestures (trajectories of the user’s hands over time) and hand poses (configurations of the user’s fingers) based on marker clouds and rigid body data. We report on the preliminary use of our system in the implementation of a bimanual 3DUI for a large immersive tiled display. We conclude with plans for using our sys...
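To make the pose-recognition contribution concrete, the following is a minimal sketch of classifying hand poses from infrared marker-cloud data. The classifier choice (a scikit-learn SVM) and the rotation-invariant pairwise-distance features are our illustrative assumptions; the abstract does not specify the actual models or features used.

```python
# Hypothetical sketch: hand-pose classification from IR marker clouds.
# The SVM classifier and pairwise-distance features are assumptions for
# illustration, not the paper's actual pipeline.
import numpy as np
from itertools import combinations
from sklearn.svm import SVC

N_MARKERS = 5  # assumed fixed number of tracked fingertip markers


def pose_features(markers: np.ndarray) -> np.ndarray:
    """Rotation/translation-invariant features for one frame:
    sorted pairwise distances between markers (shape (N_MARKERS, 3))."""
    dists = [np.linalg.norm(markers[i] - markers[j])
             for i, j in combinations(range(N_MARKERS), 2)]
    return np.sort(np.array(dists))


def train_pose_classifier(frames, labels):
    """Fit a classifier on labeled frames (e.g., 'fist', 'point', 'open')."""
    X = np.array([pose_features(f) for f in frames])
    clf = SVC(kernel="rbf")
    clf.fit(X, labels)
    return clf


def classify_pose(clf, frame):
    """Predict the pose label for a single marker-cloud frame."""
    return clf.predict(pose_features(frame).reshape(1, -1))[0]
```

Gesture recognition over hand trajectories (the second contribution) would additionally require a temporal model, such as an HMM or dynamic time warping over sequences of such frames, which this per-frame sketch omits.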