
125 search results (page 6 / 25) for: Analysis of emotion recognition using facial expressions, sp...
ICMCS 2006, IEEE
Combined Gesture-Speech Analysis and Speech Driven Gesture Synthesis
Multimodal speech and speaker modeling and recognition are widely accepted as vital aspects of state-of-the-art human-machine interaction systems. While correlations between speec...
Mehmet Emre Sargin, Oya Aran, Alexey Karpov, Ferda...
AVI 2008
Exploring emotions and multimodality in digitally augmented puppeteering
Recently, multimodal and affective technologies have been adopted to support expressive and engaging interaction, bringing up a plethora of new research questions. Among the chall...
Lassi A. Liikkanen, Giulio Jacucci, Eero Huvio, To...
ICMI 2005, Springer
A first evaluation study of a database of kinetic facial expressions (DaFEx)
In this paper we present DaFEx (Database of Facial Expressions), a database created with the purpose of providing a benchmark for the evaluation of the facial expressivity of Embo...
Alberto Battocchi, Fabio Pianesi, Dina Goren-Bar
ICDE 2006, IEEE
The eNTERFACE'05 Audio-Visual Emotion Database
This paper presents an audio-visual emotion database that can be used as a reference database for testing and evaluating video, audio or joint audio-visual emotion recognition alg...
O. Martin, Irene Kotsia, Benoit M. Macq, Ioannis P...
COST 2008, Springer
Towards Facial Gestures Generation by Speech Signal Analysis Using HUGE Architecture
In our current work we concentrate on finding the correlation between the speech signal and the occurrence of facial gestures. The motivation behind this work is computer-generated human correspon...
Goranka Zoric, Karlo Smid, Igor S. Pandzic