Sciweavers
» Designing and Prototyping Multimodal Commands
IUI 2006 (ACM)
Head gesture recognition in intelligent interfaces: the role of context in improving recognition
Acknowledging an interruption with a nod of the head is a natural and intuitive communicative gesture that can be performed without significantly disturbing a primary interface ...
Louis-Philippe Morency, Trevor Darrell
LREC 2008
On the Role of the NIMITEK Corpus in Developing an Emotion Adaptive Spoken Dialogue System
This paper reports on the creation of the multimodal NIMITEK corpus of affected behavior in human-machine interaction and its role in the development of the NIMITEK prototype syst...
Milan Gnjatovic, Dietmar Rösner
VLDB 2007 (ACM)
Challenges and Experience in Prototyping a Multi-Modal Stream Analytic and Monitoring Application on System S
In this paper, we describe the challenges of prototyping a reference application on System S, a distributed stream processing middleware under development at IBM Research. With a ...
Kun-Lung Wu, Philip S. Yu, Bugra Gedik, Kirsten Hi...
CA 1997 (IEEE)
Layered Modular Action Control for Communicative Humanoids
Face-to-face interaction between people is generally effortless and effective. We exchange glances, take turns speaking and make facial and manual gestures to achieve the goals of ...
Kristinn R. Thórisson
CHI 2004 (ACM)
ICARE: a component-based approach for the design and development of multimodal interfaces
Multimodal interactive systems support multiple interaction techniques such as the synergistic use of speech, gesture and eye gaze tracking. The flexibility they offer results in ...
Jullien Bouchet, Laurence Nigay