Unobtrusive Multimodal Emotion Detection in Adaptive Interfaces: Speech and Facial Expressions

Two unobtrusive modalities for automatic emotion recognition are discussed: speech and facial expressions. First, an overview is given of emotion recognition studies that combine speech and facial expressions. We then identify difficulties concerning data collection, data fusion, system evaluation, and emotion annotation that one is most likely to encounter in emotion recognition research. Further, we identify possible applications for emotion recognition, such as health monitoring and e-learning systems. Finally, we discuss the growing need for agreed standards in automatic emotion recognition research.
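To make the data-fusion difficulty mentioned in the abstract concrete, here is a minimal sketch of decision-level (late) fusion, one common way to combine modalities: each modality's classifier is assumed to output a probability distribution over emotion classes, and the fused score is a weighted average. The class labels, function names, and weights below are purely illustrative, not from the paper.

```python
# Hypothetical decision-level (late) fusion of two emotion classifiers.
# Assumes each modality already yields per-class probabilities.

EMOTIONS = ["neutral", "happy", "sad", "angry"]  # illustrative label set

def fuse_late(speech_probs, face_probs, w_speech=0.5):
    """Weighted average of per-modality class probabilities."""
    assert len(speech_probs) == len(face_probs) == len(EMOTIONS)
    w_face = 1.0 - w_speech
    return [w_speech * s + w_face * f
            for s, f in zip(speech_probs, face_probs)]

def predict(speech_probs, face_probs, w_speech=0.5):
    """Return the emotion label with the highest fused score."""
    fused = fuse_late(speech_probs, face_probs, w_speech)
    return EMOTIONS[max(range(len(fused)), key=fused.__getitem__)]

# Example: speech leans "angry", face leans "happy"; the modality
# weight decides which one dominates the fused decision.
speech = [0.1, 0.2, 0.1, 0.6]
face   = [0.1, 0.5, 0.2, 0.2]
print(predict(speech, face))                 # equal weights -> "angry"
print(predict(speech, face, w_speech=0.2))   # trust face more -> "happy"
```

This decision-level scheme sidesteps one fusion difficulty the abstract alludes to (speech and video features arrive at different rates and scales), at the cost of discarding cross-modal correlations that feature-level fusion could exploit.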
Added 29 Oct 2010
Updated 29 Oct 2010
Type Conference
Year 2007
Where HCI
Authors Khiet P. Truong, David A. van Leeuwen, Mark A. Neerincx