Sciweavers

262 search results - page 7 / 53
The Integrality of Speech in Multimodal Interfaces
CHI 2004, ACM
ICARE: a component-based approach for the design and development of multimodal interfaces
Multimodal interactive systems support multiple interaction techniques such as the synergistic use of speech, gesture and eye gaze tracking. The flexibility they offer results in ...
Jullien Bouchet, Laurence Nigay
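
A minimal sketch of the component-based fusion idea this entry describes: elementary components emit per-modality events and a composition component fuses complementary speech and gesture events that arrive close in time. The names (ModalityEvent, ComplementarityFusion) and the one-second window are hypothetical illustrations, not ICARE's actual components or API.

    from dataclasses import dataclass

    @dataclass
    class ModalityEvent:
        modality: str    # e.g. "speech", "gesture", "gaze"
        value: str
        timestamp: float

    class ComplementarityFusion:
        """Fuse events from different modalities arriving within a time window."""
        def __init__(self, window: float = 1.0):
            self.window = window
            self.pending: list[ModalityEvent] = []

        def push(self, event: ModalityEvent) -> dict | None:
            # Discard pending events that fell outside the fusion window.
            self.pending = [e for e in self.pending
                            if event.timestamp - e.timestamp <= self.window]
            self.pending.append(event)
            # A command is complete once speech and gesture complement each other.
            if {"speech", "gesture"} <= {e.modality for e in self.pending}:
                fused = {e.modality: e.value for e in self.pending}
                self.pending.clear()
                return fused
            return None

    fusion = ComplementarityFusion()
    fusion.push(ModalityEvent("speech", "put that there", 0.2))
    print(fusion.push(ModalityEvent("gesture", "point:(120,45)", 0.6)))
    # {'speech': 'put that there', 'gesture': 'point:(120,45)'}

Keeping each modality behind its own component lets the fusion policy change (temporal window, required modality set) without touching the recognizers, which is the flexibility the abstract alludes to.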
ACL 2007
A Multimodal Interface for Access to Content in the Home
In order to effectively access the rapidly increasing range of media content available in the home, new kinds of more natural interfaces are needed. In this paper, we explore the ...
Michael Johnston, Luis Fernando D'Haro, Michelle L...
HCI 2007
Unobtrusive Multimodal Emotion Detection in Adaptive Interfaces: Speech and Facial Expressions
Two unobtrusive modalities for automatic emotion recognition are discussed: speech and facial expressions. First, an overview is given of emotion recognition studies based on a com...
Khiet P. Truong, David A. van Leeuwen, Mark A. Nee...
IUI 2010, ACM
Usage patterns and latent semantic analyses for task goal inference of multimodal user interactions
This paper describes our work in usage pattern analysis and the development of a latent semantic analysis framework for interpreting multimodal user input consisting of speech and pen ge...
Pui-Yu Hui, Wai Kit Lo, Helen M. Meng
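
A minimal sketch of LSA-based task goal inference in the spirit of this entry, not the authors' framework: speech words and pen-gesture tokens form one bag-of-tokens document, a truncated SVD projects it into a latent semantic space, and the goal label of the nearest past interaction wins. The training interactions and goal labels below are invented for illustration.

    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD

    # Hypothetical past interactions: speech transcripts plus gesture tokens.
    interactions = [
        "show restaurants near here gesture_point",
        "zoom in to this area gesture_circle",
        "route from here to there gesture_point gesture_point",
    ]
    goals = ["search_poi", "zoom", "route"]

    vectorizer = TfidfVectorizer()
    X = vectorizer.fit_transform(interactions)

    # Low-rank latent semantic space over the term-document matrix.
    svd = TruncatedSVD(n_components=2, random_state=0)
    Z = svd.fit_transform(X)

    def infer_goal(utterance: str) -> str:
        """Return the goal label of the most similar past interaction."""
        z = svd.transform(vectorizer.transform([utterance]))
        sims = (Z @ z.T).ravel() / (
            np.linalg.norm(Z, axis=1) * np.linalg.norm(z) + 1e-12)
        return goals[int(np.argmax(sims))]

    print(infer_goal("zoom into this region gesture_circle"))  # likely "zoom"

The latent projection is what lets a paraphrased request match a past usage pattern even when the surface words differ.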
BCSHCI 2008
Efficiency of multimodal metaphors in the presentation of learning information
The comparative study described in this paper was conducted to investigate the effect of including multimodal metaphors on the usability of e-learning interfaces. Two indepen...
Marwan Alseid, Dimitrios Rigas