Sciweavers

521 search results for "Affective multimodal human-computer interaction" (page 38 of 105)
CVPR 2005 (IEEE)
Audio-Visual Affect Recognition through Multi-Stream Fused HMM for HCI
Advances in computer processing power and emerging algorithms are allowing new ways of envisioning Human-Computer Interaction. This paper focuses on the development of a computing...
Zhihong Zeng, Jilin Tu, Brian Pianfetti, Ming Liu,...
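
The title names a multi-stream fused HMM. As a hedged illustration of the general late-fusion idea only, not the paper's actual model, the sketch below combines per-stream HMM log-likelihoods for audio and video with a tunable reliability weight; the class labels, weight, and likelihood values are all invented for the example.

```
import numpy as np

def fuse_streams(loglik_audio, loglik_video, w_audio=0.5):
    """Weighted late fusion of per-stream HMM scores.
    loglik_*: dict mapping affect label -> log P(stream | label).
    Returns the label with the highest fused score."""
    fused = {}
    for label in loglik_audio:
        fused[label] = (w_audio * loglik_audio[label]
                        + (1.0 - w_audio) * loglik_video[label])
    return max(fused, key=fused.get)

# Made-up scores: audio slightly favors "joy", video favors "neutral"
audio = {"joy": -110.2, "neutral": -118.7, "anger": -125.3}
video = {"joy": -98.4, "neutral": -96.1, "anger": -120.0}
print(fuse_streams(audio, video, w_audio=0.6))  # -> "joy"
```

In practice the per-stream scores would come from HMMs trained on audio and facial features respectively; the weight lets one stream dominate when the other is noisy.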
CHI 2009 (ACM)
Correlating low-level image statistics with users' rapid aesthetic and affective judgments of web pages
In this paper, we report a study that examines the relationship between image-based computational analyses of web pages and users’ aesthetic judgments about the same image mater...
Xianjun Sam Zheng, Ishani Chakraborty, James Jeng-...
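
The study correlates image-based computational analyses with users' judgments. A minimal sketch of that general kind of analysis, assuming synthetic stand-ins for page screenshots and ratings (nothing below comes from the actual study), computes a few low-level statistics per image and reports a Pearson correlation against the ratings:

```
import numpy as np

def image_stats(img):
    """img: 2D grayscale array in [0, 1]. Returns simple low-level statistics."""
    gy, gx = np.gradient(img)
    edge_density = np.mean(np.hypot(gx, gy) > 0.1)  # fraction of strong-gradient pixels
    return {"mean_lum": img.mean(), "contrast": img.std(), "edges": edge_density}

rng = np.random.default_rng(0)
pages = [rng.random((64, 64)) for _ in range(20)]  # stand-ins for page screenshots
ratings = rng.random(20) * 9 + 1                   # stand-ins for 1-10 judgments
contrast = [image_stats(p)["contrast"] for p in pages]
r = np.corrcoef(contrast, ratings)[0, 1]           # Pearson r
print(f"contrast vs. rating: r = {r:.2f}")
```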
AUTOMOTIVEUI 2009 (ACM)
Glancing at personal navigation devices can affect driving: experimental results and design implications
Nowadays, personal navigation devices (PNDs) that provide GPS-based directions are widespread in vehicles. These devices typically display the real-time location of the vehicle on ...
Andrew L. Kun, Tim Paek, Zeljko Medenica, Nemanja ...
CHI 2010 (ACM)
Knowing where and when to look in a time-critical multimodal dual task
Human-computer systems intended for time-critical multitasking need to be designed with an understanding of how humans can coordinate and interleave perceptual, memory, and motor ...
Anthony J. Hornof, Yunfeng Zhang, Tim Halverson
NORDICHI 2006 (ACM)
Tac-tiles: multimodal pie charts for visually impaired users
Tac-tiles is an accessible interface that allows visually impaired users to browse graphical information using tactile and audio feedback. The system uses a graphics tablet which ...
Steven A. Wall, Stephen A. Brewster
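
The abstract is cut off before it describes the tablet interaction, but one plausible way to implement such pie-chart browsing (a hypothetical sketch, not necessarily how Tac-tiles works) is to map the stylus position on the tablet to the pie segment under it, then trigger the matching tactile and audio cues for that segment:

```
import math

def segment_at(x, y, cx, cy, values):
    """Map a tablet coordinate (x, y) to the pie segment under the stylus.
    (cx, cy): chart center; values: list of (label, value) pairs,
    with segments laid out clockwise starting at 12 o'clock."""
    # Clockwise angle from 12 o'clock, with screen-style y pointing down
    angle = math.degrees(math.atan2(x - cx, cy - y)) % 360.0
    total = sum(v for _, v in values)
    start = 0.0
    for label, v in values:
        sweep = 360.0 * v / total
        if start <= angle < start + sweep:
            return label
        start += sweep
    return values[-1][0]

data = [("Q1", 30), ("Q2", 20), ("Q3", 25), ("Q4", 25)]
print(segment_at(120, 40, 100, 100, data))  # stylus up-right of center -> "Q1"
```

On each stylus move, the returned label would select which tactile pattern and spoken value to present.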