Sciweavers

437 search results (page 33 / 88) for "Multimodal Human Computer Interaction: A Survey"
ICMI 2004, Springer
Analysis of emotion recognition using facial expressions, speech and multimodal information
The interaction between human beings and computers will be more natural if computers are able to perceive and respond to human non-verbal communication such as emotions. Although ...
Carlos Busso, Zhigang Deng, Serdar Yildirim, Murta...
CHI 2005, ACM
Conversing with the user based on eye-gaze patterns
Motivated by and grounded in observations of eye-gaze patterns in human-human dialogue, this study explores using eye-gaze patterns in managing human-computer dialogue. We develop...
Pernilla Qvarfordt, Shumin Zhai
NORDICHI 2006, ACM
Tac-tiles: multimodal pie charts for visually impaired users
Tac-tiles is an accessible interface that allows visually impaired users to browse graphical information using tactile and audio feedback. The system uses a graphics tablet which ...
Steven A. Wall, Stephen A. Brewster
CMC 1998, Springer
Multimodal Reference to Objects: An Empirical Approach
In this chapter we report on an investigation into the principles underlying the choice of a particular referential expression to refer to an object located in a domain t...
Robbert-Jan Beun, Anita H. M. Cremers
ICMCS 2005, IEEE
Multimodal Emotion Recognition and Expressivity Analysis
The paper presents the framework of a special session that aims at investigating the best possible techniques for multimodal emotion recognition and expressivity analysis in human...
Stefanos D. Kollias, Kostas Karpouzis