Sciweavers

67 search results - page 4 / 14
Search query: Authoring pervasive multimodal user interfaces
AVI 2006
Enabling interaction with single user applications through speech and gestures on a multi-user tabletop
Co-located collaborators often work over physical tabletops with rich geospatial information. Previous research shows that people use gestures and speech as they interact with art...
Edward Tse, Chia Shen, Saul Greenberg, Clifton For...
CHI 2010, ACM
A simple index for multimodal flexibility
Most interactive tasks engage more than one of the user’s exteroceptive senses and are therefore multimodal. In real-world situations with multitasking and distractions, the key...
Antti Oulasvirta, Joanna Bergstrom-Lehtovirta
CHI 2005, ACM
Evaluation of multimodal input for entering mathematical equations on the computer
Current standard interfaces for entering mathematical equations on computers are arguably limited and cumbersome. Mathematical notation has evolved to aid visual thinking, and yet...
Lisa Anthony, Jie Yang, Kenneth R. Koedinger
CHI 2005, ACM
Conversing with the user based on eye-gaze patterns
Motivated by and grounded in observations of eye-gaze patterns in human-human dialogue, this study explores using eye-gaze patterns in managing human-computer dialogue. We develop...
Pernilla Qvarfordt, Shumin Zhai
ICMI 2004, Springer
Evaluation of spoken multimodal conversation
Spoken multimodal dialogue systems in which users address face-only or embodied interface agents have been gaining ground in research for some time. Although most systems are still...
Niels Ole Bernsen, Laila Dybkjær