We propose algorithms and applications for using the hand as an interface device in virtual and physical spaces. In virtual drawing, the hand is tracked in 3-D and a virtual plane is estimated in space so that the user's intended drawing is recognized. In a virtual marble game, the instantaneous orientation of the hand drives a simulation that renders a graphical scene of the game board; real-time visual feedback allows the user to navigate a virtual ball through a maze. In 3-D model construction, the system tracks the motion of the hand in space while the user traverses the edges of a physical object, which the computer then renders virtually. These applications require estimating the absolute 3-D position and/or orientation of the hand in space. We propose parametric modelling of the central region of the hand to extract this information: a stereo camera is used to first build a preliminary disparity map of the hand, and the best-fitting plane to the disparity points is then computed using robust estimation...
Afshin Sepehri, Yaser Yacoob, Larry S. Davis
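
The robust plane-fitting step mentioned in the abstract can be sketched as follows. This is an illustrative RANSAC-style fit to 3-D points, not necessarily the authors' estimator; the function name, iteration count, and inlier threshold are our own assumptions:

```python
import numpy as np

def fit_plane_ransac(points, n_iters=200, inlier_thresh=0.01, rng=None):
    """Robustly fit a plane n.x + d = 0 to 3-D points via RANSAC.

    points: (N, 3) array of 3-D coordinates (e.g. from a disparity map).
    Returns (normal, d) with a unit normal vector.
    """
    rng = np.random.default_rng(rng)
    n = len(points)
    best_inliers, best_count = None, 0
    for _ in range(n_iters):
        # Sample 3 distinct points and form a candidate plane.
        p0, p1, p2 = points[rng.choice(n, size=3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:                # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal @ p0
        # Count points within the distance threshold of the plane.
        inliers = np.abs(points @ normal + d) < inlier_thresh
        count = int(inliers.sum())
        if count > best_count:
            best_count, best_inliers = count, inliers
    # Refine with a least-squares fit (SVD) on the consensus set.
    inlier_pts = points[best_inliers]
    centroid = inlier_pts.mean(axis=0)
    _, _, vt = np.linalg.svd(inlier_pts - centroid)
    normal = vt[-1]                     # direction of least variance
    return normal, -normal @ centroid
```

The random sampling makes the fit insensitive to outlier disparity points (e.g. pixels off the palm), while the final SVD step averages out measurement noise over the inliers.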