Gesture Recognition for Human-Computer Interaction (HCI)
GI 2004 (Springer)
Communicating Agents Architecture with Applications in Multimodal Human Computer Interaction
We present our idea of solving parts of the vision task with an organic computing approach. We have designed a multiagent system (MAS) of many different modules working on differ...
Maximilian Krüger, Achim Schäfer, Andrea...
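The excerpt above only names the architecture, so the following is merely a loose illustration of the general communicating-agents pattern it points to, not the authors' actual system. Every class, topic, and value in this Python sketch is a hypothetical placeholder: independent modules exchange partial results over a shared message bus.

```python
# Minimal illustration of a communicating-agents pattern: independent modules
# exchange results over a shared message bus. All class, topic, and value
# names are hypothetical and are not taken from the system described above.
from collections import defaultdict

class MessageBus:
    """Routes published messages to every agent subscribed to that topic."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, agent):
        self.subscribers[topic].append(agent)

    def publish(self, topic, payload):
        for agent in self.subscribers[topic]:
            agent.receive(topic, payload)

class HandTrackerAgent:
    """Consumes raw frames and reports coarse hand positions."""
    def __init__(self, bus):
        self.bus = bus
        bus.subscribe("frame", self)

    def receive(self, topic, frame):
        # A real module would run a vision model here; this stub fakes a detection.
        self.bus.publish("hand_position", {"x": 0.4, "y": 0.7, "frame": frame})

class GestureAgent:
    """Accumulates hand positions and emits a gesture hypothesis."""
    def __init__(self, bus):
        self.bus = bus
        self.track = []
        bus.subscribe("hand_position", self)

    def receive(self, topic, pos):
        self.track.append((pos["x"], pos["y"]))
        if len(self.track) >= 3:  # arbitrary threshold for the demo
            self.bus.publish("gesture", {"name": "swipe", "path": list(self.track)})
            self.track.clear()

class LoggerAgent:
    """Prints any gesture hypotheses published by other agents."""
    def __init__(self, bus):
        bus.subscribe("gesture", self)

    def receive(self, topic, payload):
        print("recognized", payload["name"], "from", len(payload["path"]), "samples")

bus = MessageBus()
HandTrackerAgent(bus)
GestureAgent(bus)
LoggerAgent(bus)
for frame_id in range(3):
    bus.publish("frame", frame_id)
```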
CHI 1993 (ACM)
Extending an existing user interface toolkit to support gesture recognition
Gestures are a powerful way to specify both objects and operations with a single mark of a stylus or mouse. We have extended an existing user interface toolkit to support gestures...
James A. Landay, Brad A. Myers
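The excerpt does not show the toolkit's interface, so nothing below reflects its real API. As a generic illustration of how a single stylus or mouse stroke can be matched against stored gesture templates (a simplified, $1-recognizer-style nearest-template scheme, not the paper's method), a sketch in Python might look like this; the function names and the 32-point resampling length are assumptions.

```python
# Toy single-stroke recognizer: resample the stroke, normalize it, and pick
# the stored template with the smallest average point-to-point distance.
import math

def path_length(points):
    return sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))

def resample(points, n=32):
    """Resample a stroke (list of (x, y)) to n roughly evenly spaced points."""
    interval = path_length(points) / (n - 1)
    pts = list(points)
    resampled = [pts[0]]
    dist_accum = 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if dist_accum + d >= interval:
            t = (interval - dist_accum) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            resampled.append(q)
            pts.insert(i, q)          # continue measuring from the new point
            dist_accum = 0.0
        else:
            dist_accum += d
        i += 1
    while len(resampled) < n:          # guard against floating-point shortfall
        resampled.append(pts[-1])
    return resampled

def normalize(points):
    """Translate the stroke to its centroid and scale it to a unit box."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    w = (max(xs) - min(xs)) or 1.0
    h = (max(ys) - min(ys)) or 1.0
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    return [((x - cx) / w, (y - cy) / h) for x, y in points]

def distance(a, b):
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def recognize(stroke, templates):
    """Return the name of the stored template closest to the input stroke."""
    candidate = normalize(resample(stroke))
    best_name, best_d = None, float("inf")
    for name, pts in templates.items():
        d = distance(candidate, normalize(resample(pts)))
        if d < best_d:
            best_name, best_d = name, d
    return best_name
```

In use, templates would simply map gesture names to example strokes recorded earlier, e.g. recognize(stroke, {"check": check_points, "delete": delete_points}), where the template strokes are placeholders for user-recorded samples.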
CHI 2010 (ACM)
Scale detection for a priori gesture recognition
Gesture-based interfaces provide expert users with an efficient form of interaction but they require a learning effort for novice users. To address this problem, some on-line gui...
Caroline Appert, Olivier Bau
ICCV 2005 (IEEE)
Multimodal Human Computer Interaction: A Survey
In this paper we review the major approaches to multimodal human computer interaction from a computer vision perspective. In particular, we focus on body, gesture, gaze, ...
Alejandro Jaimes, Nicu Sebe
TSD 2005 (Springer)
The Role of Speech in Multimodal Human-Computer Interaction
A natural audio-visual interface between a human user and a machine requires understanding of the user’s audio-visual commands. This does not necessarily require full speech and ...
Hynek Hermansky, Petr Fousek, Mikko Lehtonen