
Did I Get It Right: Head Gestures Analysis for Human-Machine Interactions

This paper presents a system adding another input modality to a multimodal human-machine interaction scenario. In addition to common input modalities such as speech, we extract head gestures via image interpretation techniques based on machine learning algorithms, providing a nonverbal and familiar way of interacting with the system. Our experimental evaluation proves that the presented approach works reliably and in real time.

1 Motivation
Multimodal communication channels are becoming more important for robust and flexible human-machine interaction in everyday surroundings. Our objective is therefore to introduce a communication channel that provides a natural and intuitive way of simple nonverbal communication. It emulates a common way of showing agreement or disagreement via head gestures, as known from human-human dialogs. Due to its simplicity and omnipresence in everyday life, this contributes to making dialog systems more efficient.

2 Use-Cases
As already mentio...
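The abstract does not detail the recognition method, but a minimal sketch of distinguishing the agreement/disagreement gestures (a nod versus a head shake) from head-pose trajectories could compare motion energy along the pitch and yaw axes. The function name, input format, and threshold below are illustrative assumptions, not the authors' actual approach.

```python
# Hypothetical sketch: classify a head gesture from a sequence of
# (pitch, yaw) head-pose angles in degrees, one pair per video frame.
# A nod moves mostly in pitch, a shake mostly in yaw. The energy
# measure and threshold are assumptions, not taken from the paper.

def classify_head_gesture(angles, min_energy=10.0):
    """Return 'nod', 'shake', or 'none' for a (pitch, yaw) trajectory."""
    # Sum of absolute frame-to-frame differences per rotation axis.
    pitch_energy = sum(abs(a[0] - b[0]) for a, b in zip(angles, angles[1:]))
    yaw_energy = sum(abs(a[1] - b[1]) for a, b in zip(angles, angles[1:]))
    if max(pitch_energy, yaw_energy) < min_energy:
        return "none"  # head is essentially static
    return "nod" if pitch_energy > yaw_energy else "shake"
```

In a real system the head pose would come from a face tracker, and the decision would typically use a trained classifier over a sliding window rather than a fixed threshold.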
Added 18 Feb 2011
Updated 18 Feb 2011
Type Journal
Year 2009
Where HCI
Authors Jürgen Gast, Alexander Bannat, Tobias Rehrl, Gerhard Rigoll, Frank Wallhoff, Christoph Mayer, Bernd Radig