Sciweavers

521 search results - page 17 / 105
» Affective multimodal human-computer interaction
CHI 2010, ACM
A longitudinal study of how highlighting web content change affects people's web interactions
The Web is constantly changing, but most tools used to access Web content deal only with what can be captured at a single instance in time. As a result, Web users may not have a g...
Jaime Teevan, Susan T. Dumais, Daniel J. Liebling
IUI 2003, ACM
Affective multi-modal interfaces: the case of McGurk effect
This study is motivated by the increased need to understand human response to video-links, 3G telephony and avatars. We focus on the response of participants to audiovisual presentati...
Azra N. Ali, Philip H. Marsden
HRI 2009, ACM
Planning as an architectural control mechanism
We describe recent work on PECAS, an architecture for intelligent robotics that supports multi-modal interaction. Categories and Subject Descriptors I.2.8 [Computing Methodologies...
Nick Hawes, Michael Brenner, Kristoffer Sjö...
CHI 2000, ACM
Face to interface: facial affect in (hu)man and machine
Facial expression of emotion (or "facial affect") is rapidly becoming an area of intense interest in the computer science and interaction design communities. Ironically,...
Diane J. Schiano, Sheryl M. Ehrlich, Krisnawan Rah...
CHI 2005, ACM
eMoto: affectively involving both body and mind
It is known that emotions are experienced by both body and mind. Oftentimes, emotions are evoked by sub-symbolic stimuli, such as colors, shapes, gestures, or music. We have built...
Anna Ståhl, Kristina Höök, Petra S...