Sciweavers

39 search results for "Integrating Semantics into Multimodal Interaction Patterns"
MLMI 2007, Springer
Integrating Semantics into Multimodal Interaction Patterns
A user experiment on multimodal interaction (speech, hand position, and hand shapes) studying two major relationships: between the level of cognitive load experienced by users and t...
Ronnie Taib, Natalie Ruiz
IUI 2010, ACM
Usage patterns and latent semantic analyses for task goal inference of multimodal user interactions
This paper describes our work in usage pattern analysis and the development of a latent semantic analysis framework for interpreting multimodal user input consisting of speech and pen ge...
Pui-Yu Hui, Wai Kit Lo, Helen M. Meng
MLMI 2005, Springer
Multimodal Integration for Meeting Group Action Segmentation and Recognition
We address the problem of segmenting and recognizing sequences of multimodal human interactions in meetings. These interactions can be seen as a rough structure of a meeting, ...
Marc Al-Hames, Alfred Dielmann, Daniel Gatica-Pere...
ICMI 2003, Springer
Modeling multimodal integration patterns and performance in seniors: toward adaptive processing of individual differences
Multimodal interfaces are designed with a focus on flexibility, although very few currently are capable of adapting to major sources of user, task, or environmental variation. The...
Benfang Xiao, Rebecca Lunsford, Rachel Coulston, R...
ISMAR 2006, IEEE
"Move the couch where?" : developing an augmented reality multimodal interface
This paper describes an augmented reality (AR) multimodal interface that uses speech and paddle gestures for interaction. The application allows users to intuitively arrange virtu...
Sylvia Irawati, Scott Green, Mark Billinghurst, An...