Sciweavers

25 search results - page 4 / 5
» Multimodal information fusion for human-robot interaction
ISMAR 2006 (IEEE)
"Move the couch where?" : developing an augmented reality multimodal interface
This paper describes an augmented reality (AR) multimodal interface that uses speech and paddle gestures for interaction. The application allows users to intuitively arrange virtu...
Sylvia Irawati, Scott Green, Mark Billinghurst, An...
ICMI 2004 (Springer)
Analysis of emotion recognition using facial expressions, speech and multimodal information
The interaction between human beings and computers will be more natural if computers are able to perceive and respond to human non-verbal communication such as emotions. Although ...
Carlos Busso, Zhigang Deng, Serdar Yildirim, Murta...
AIHC 2007 (Springer)
SmartWeb Handheld - Multimodal Interaction with Ontological Knowledge Bases and Semantic Web Services
SMARTWEB aims to provide intuitive multimodal access to a rich selection of Web-based information services. We report on the current prototype with a smartphone client interface t...
Daniel Sonntag, Ralf Engel, Gerd Herzog, Alexander...
ICMI 2003 (Springer)
Modeling multimodal integration patterns and performance in seniors: toward adaptive processing of individual differences
Multimodal interfaces are designed with a focus on flexibility, although very few currently are capable of adapting to major sources of user, task, or environmental variation. The...
Benfang Xiao, Rebecca Lunsford, Rachel Coulston, R...
ICMI 2009 (Springer)
Multimodal floor control shift detection
Floor control is a scheme people use to organize speaking turns in multi-party conversations. Identifying floor control shifts is important for understanding a conversati...
Lei Chen 0004, Mary P. Harper