Sciweavers

971 search results - page 4 / 195
Search: Observing users in multimodal interaction
JDCTA 2008
Human Factors and Design Issues in Multimodal (Speech/Gesture) Interface
Multimodal interfaces are an emerging technology that offers expressive, transparent, efficient, robust, and mobile human-computer interaction. In this paper, we describe the sp...
C. J. Lim, Younghwan Pan, Jane Lee
IUI 2010 (ACM)
Usage patterns and latent semantic analyses for task goal inference of multimodal user interactions
This paper describes our work on usage pattern analysis and the development of a latent semantic analysis framework for interpreting multimodal user input consisting of speech and pen ge...
Pui-Yu Hui, Wai Kit Lo, Helen M. Meng
CHI 1993 (ACM)
A design space for multimodal systems: concurrent processing and data fusion
Multimodal interaction enables the user to employ different modalities, such as voice, gesture, and typing, for communicating with a computer. This paper presents an analysis of the ...
Laurence Nigay, Joëlle Coutaz
EHCI 2004
A Novel Dialog Model for the Design of Multimodal User Interfaces
The wide range of devices with varying capabilities and interaction modalities, as well as the changing user context in nomadic applications, poses critical challenges to the ...
Robbie Schaefer, Steffen Bleul, Wolfgang Müll...
AVI 2006
Enabling interaction with single user applications through speech and gestures on a multi-user tabletop
Co-located collaborators often work over physical tabletops with rich geospatial information. Previous research shows that people use gestures and speech as they interact with art...
Edward Tse, Chia Shen, Saul Greenberg, Clifton For...