Sciweavers

971 search results - page 14 / 195
» Observing users in multimodal interaction
AIHC
2007
Springer
14 years 2 months ago
Gaze-X: Adaptive, Affective, Multimodal Interface for Single-User Office Scenarios
This paper describes an intelligent system that we developed to support affective multimodal human-computer interaction (AMM-HCI) where the user’s actions and emotions are modele...
Ludo Maat, Maja Pantic
WWW
2004
ACM
14 years 9 months ago
A multimodal interaction manager for device independent mobile applications
This poster presents an overview of the work on an interaction manager of a platform for multimodal applications in 2.5G and 3G mobile phone networks and WLAN environments. The po...
Florian Wegscheider, Thomas Dangl, Michael Jank, R...
AAMAS
2010
Springer
13 years 8 months ago
Teaching a pet-robot to understand user feedback through interactive virtual training tasks
In this paper, we present a human-robot teaching framework that uses "virtual" games as a means for adapting a robot to its user through natural interaction in a...
Anja Austermann, Seiji Yamada
ICMI
2010
Springer
13 years 6 months ago
Gesture and voice prototyping for early evaluations of social acceptability in multimodal interfaces
Interaction techniques that require users to adopt new behaviors mean that designers must take social acceptability and user experience into account; otherwise the techniques may b...
Julie Rico, Stephen A. Brewster
WEBNET
2000
13 years 9 months ago
MPML: A Multimodal Presentation Markup Language with Character Agent Control Functions
As a new style of effective information presentation and a new form of multimodal content production on the World Wide Web (WWW), multimodal presentation using interactive l...
Takayuki Tsutsui, Santi Saeyor, Mitsuru Ishizuka