Sciweavers

971 search results - page 17 / 195
» Observing users in multimodal interaction
CHI
2009
ACM
A biologically inspired approach to learning multimodal commands and feedback for human-robot interaction
In this paper we describe a method that enables a robot to learn how a user gives it commands and feedback through speech, prosody and touch. We propose a biologically inspired approac...
Anja Austermann, Seiji Yamada
AAAI
2000
Cognitive Status and Form of Reference in Multimodal Human-Computer Interaction
We analyze a corpus of referring expressions collected from user interactions with a multimodal travel guide application. The analysis suggests that, in dramatic contrast to norma...
Andrew Kehler
IVA
2010
Springer
Multimodal Backchannels for Embodied Conversational Agents
One of the most desirable characteristics of an Embodied Conversational Agent (ECA) is the capability of interacting with users in a human-like manner. While listening to a user, a...
Elisabetta Bevacqua, Sathish Pammi, Sylwia Julia H...
MM
2005
ACM
Affective multimodal human-computer interaction
Social and emotional intelligence are aspects of human intelligence that have been argued to be better predictors than IQ for measuring aspects of success in life, especially in s...
Maja Pantic, Nicu Sebe, Jeffrey F. Cohn, Thomas S....
IUI
2000
ACM
Expression constraints in multimodal human-computer interaction
Thanks to recent scientific advances, it is now possible to design multimodal interfaces allowing the use of speech and pointing gestures on a touchscreen. However, present sp...
Sandrine Robbe-Reiter, Noelle Carbonell, Pierre Da...