AMDO 2006, Springer

Emotional Facial Expression Classification for Multimodal User Interfaces

We present a simple and computationally feasible method for automatic emotional classification of facial expressions. We propose the use of 10 characteristic points (a subset of the MPEG-4 feature points) to extract relevant emotional information: five distances, the presence of wrinkles, and mouth shape. The method defines and detects the six basic emotions (plus the neutral one) in terms of this information and has been fine-tuned on a database of 399 images. At present the method is applied to static images; an extension to image sequences is under development. Extracting such information about the user is of great interest for the development of new multimodal user interfaces. Keywords: Facial Expression, Multimodal Interface.
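The abstract describes a rule-based scheme: measure a handful of distances between facial feature points, compare them to a neutral baseline, and map the result to one of the basic emotions. The sketch below illustrates that general approach in Python; the specific distances, thresholds, and rules are hypothetical assumptions for illustration, not the authors' actual values, and only a few of the six emotions are covered.

```python
# Illustrative sketch of rule-based emotion classification from facial
# feature-point distances. Distances, thresholds, and rules are assumed
# for illustration; they are NOT the values used in the paper.

from dataclasses import dataclass, fields


@dataclass
class FaceDistances:
    eyebrow_eye: float    # eyebrow-to-eye distance (rises in surprise)
    eye_opening: float    # eyelid opening (widens in anger/fear)
    mouth_opening: float  # vertical mouth opening
    mouth_width: float    # horizontal mouth width (stretches in joy)
    lip_corner: float     # lip-corner height relative to mouth center


# Baseline measured on a neutral face; here normalized to 1.0 for clarity.
NEUTRAL = FaceDistances(1.0, 1.0, 1.0, 1.0, 1.0)


def classify(d: FaceDistances, neutral: FaceDistances = NEUTRAL) -> str:
    """Map distance ratios (observed / neutral) to an emotion label
    via hand-tuned thresholds (illustrative only)."""
    r = FaceDistances(*(
        getattr(d, f.name) / getattr(neutral, f.name)
        for f in fields(FaceDistances)
    ))
    if r.eyebrow_eye > 1.3 and r.mouth_opening > 1.5:
        return "surprise"   # raised brows + wide-open mouth
    if r.mouth_width > 1.2 and r.lip_corner > 1.1:
        return "joy"        # stretched mouth + raised lip corners
    if r.eyebrow_eye < 0.8 and r.eye_opening > 1.1:
        return "anger"      # lowered brows + widened eyes
    if r.lip_corner < 0.9 and r.mouth_width < 0.95:
        return "sadness"    # drooping lip corners + narrowed mouth
    return "neutral"        # no rule fired
```

In the paper's setting the distances would come from the 10 detected feature points, and the thresholds would be the ones fine-tuned on the 399-image database.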
Eva Cerezo, Isabelle Hupont
Type: Conference
Year: 2006
Where: AMDO
Authors: Eva Cerezo, Isabelle Hupont