
DAGM 2003, Springer

Real-Time Recognition of 3D-Pointing Gestures for Human-Machine-Interaction

We present a system capable of visually detecting pointing gestures and estimating the 3D pointing direction in real-time. We use Hidden Markov Models (HMMs) trained on different phases of sample pointing gestures to detect the occurrence of a gesture. For estimating the pointing direction, we compare two approaches: 1) the line of sight between head and hand, and 2) the forearm orientation. Input features for the HMMs are the 3D trajectories of the person’s head and hands, extracted from image sequences provided by a stereo camera. In a person-independent test scenario, our system achieved a gesture detection rate of 88%. For 90% of the detected gestures, the correct pointing target (one out of eight objects) was identified.
Kai Nickel, Rainer Stiefelhagen
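
As a rough illustration of the first pointing-direction estimate described in the abstract (the line of sight from head to hand), the sketch below computes the head-to-hand ray from tracked 3D positions and selects the candidate object best aligned with it. This is a minimal sketch under assumptions of our own: the function name, the use of NumPy, and the randomly placed targets are illustrative and not taken from the paper's implementation.

import numpy as np

def pointing_target(head, hand, targets):
    # head, hand : (3,) arrays of 3D positions (e.g. from stereo head/hand tracking)
    # targets    : (N, 3) array of candidate object positions
    # Returns the index of the target best aligned with the head->hand ray
    # and the corresponding angular error in degrees.
    ray = hand - head
    ray = ray / np.linalg.norm(ray)                 # unit pointing direction
    to_targets = targets - head                     # vectors from head to each object
    to_targets = to_targets / np.linalg.norm(to_targets, axis=1, keepdims=True)
    angles = np.arccos(np.clip(to_targets @ ray, -1.0, 1.0))
    return int(np.argmin(angles)), float(np.degrees(angles.min()))

# Example with eight candidate objects, mirroring the paper's evaluation setup
head = np.array([0.0, 1.7, 0.0])
hand = np.array([0.3, 1.4, 0.5])
targets = np.random.default_rng(0).uniform(-2.0, 2.0, size=(8, 3))
idx, err_deg = pointing_target(head, hand, targets)
print(f"pointed-at object: {idx} (angular error {err_deg:.1f} deg)")

The forearm-orientation variant compared in the paper would replace the head-to-hand ray with a ray along the tracked forearm; the target-selection step stays the same.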