Sciweavers

13 search results - page 1 / 3
HCI 2007
Unobtrusive Multimodal Emotion Detection in Adaptive Interfaces: Speech and Facial Expressions
Two unobtrusive modalities for automatic emotion recognition are discussed: speech and facial expressions. First, an overview is given of emotion recognition studies based on a com...
Khiet P. Truong, David A. van Leeuwen, Mark A. Nee...
AIHC 2007, Springer
Modeling Naturalistic Affective States Via Facial, Vocal, and Bodily Expressions Recognition
Affective and human-centered computing have attracted considerable attention in recent years, mainly due to the abundance of devices and environments able to exploit multimodal i...
Kostas Karpouzis, George Caridakis, Loïc Kess...
AMDO 2006, Springer
Emotional Facial Expression Classification for Multimodal User Interfaces
We present a simple and computationally feasible method to perform automatic emotional classification of facial expressions. We propose the use of 10 characteristic points (that ar...
Eva Cerezo, Isabelle Hupont
ICMI 2004, Springer
Analysis of emotion recognition using facial expressions, speech and multimodal information
The interaction between humans and computers will be more natural if computers are able to perceive and respond to human non-verbal communication, such as emotions. Although ...
Carlos Busso, Zhigang Deng, Serdar Yildirim, Murta...
AIHC 2007, Springer
Gaze-X: Adaptive, Affective, Multimodal Interface for Single-User Office Scenarios
This paper describes an intelligent system that we developed to support affective multimodal human-computer interaction (AMM-HCI) where the user’s actions and emotions are modele...
Ludo Maat, Maja Pantic