Sciweavers

50 search results - page 6 / 10
Search: Gaze and Speech in Attentive User Interfaces
ATAL
2005
Springer
Evaluating the interaction with synthetic agents using attention and affect tracking
We motivate an approach to evaluating the utility of synthetic agents that is based on human physiology rather than questionnaires. The primary tool is an eye tracker that provide...
Helmut Prendinger, Chunling Ma, Jin Yingzi, Kushid...
HCI
2009
Using 3D Touch Interaction for a Multimodal Zoomable User Interface
Touchscreens are becoming the preferred input device in a growing number of applications, and they are increasingly being introduced into the automotive domain...
Florian Laquai, Markus Ablaßmeier, Tony Poit...
CHI
2010
ACM
Gazemarks: gaze-based visual placeholders to ease attention switching
Many tasks require attention switching, for example, searching for information on one sheet of paper and then entering it onto another. With paper we see that pe...
Dagmar Kern, Paul Marshall, Albrecht Schmidt
CHI
2004
ACM
ICARE: a component-based approach for the design and development of multimodal interfaces
Multimodal interactive systems support multiple interaction techniques such as the synergistic use of speech, gesture and eye gaze tracking. The flexibility they offer results in ...
Jullien Bouchet, Laurence Nigay
ICMI
2005
Springer
Understanding the effect of life-like interface agents through users' eye movements
We motivate an approach to evaluating the utility of lifelike interface agents that is based on human eye movements rather than questionnaires. An eye tracker is employed to obtai...
Helmut Prendinger, Chunling Ma, Jin Yingzi, Arturo...