Sciweavers

19 search results - page 2 / 4
» Multimodal Complex Emotions: Gesture Expressivity and Blende...
ICCV
2005
IEEE
14 years 29 days ago
Multimodal Human Computer Interaction: A Survey
Abstract. In this paper we review the major approaches to multimodal human-computer interaction from a computer vision perspective. In particular, we focus on body, gesture, gaze, ...
Alejandro Jaimes, Nicu Sebe
CVIU
2007
13 years 7 months ago
Multimodal human-computer interaction: A survey
Abstract. In this paper we review the major approaches to multimodal human-computer interaction from a computer vision perspective. In particular, we focus on body, gesture, gaze, ...
Alejandro Jaimes, Nicu Sebe
AIHC
2007
Springer
14 years 1 month ago
Modeling Naturalistic Affective States Via Facial, Vocal, and Bodily Expressions Recognition
Affective and human-centered computing have attracted a lot of attention in recent years, mainly due to the abundance of devices and environments able to exploit multimodal i...
Kostas Karpouzis, George Caridakis, Loïc Kess...
ACII
2011
Springer
12 years 7 months ago
A Psychologically-Inspired Match-Score Fusion Model for Video-Based Facial Expression Recognition
Communication between humans is rich in complexity and is not limited to verbal signals; emotions are conveyed with gesture, pose and facial expression. Facial Emotion Recognition ...
Albert Cruz, Bir Bhanu, Songfan Yang
TSD
2007
Springer
14 years 1 month ago
ECAF: Authoring Language for Embodied Conversational Agents
Abstract. Embodied Conversational Agent (ECA) is the user interface metaphor that allows information to be communicated naturally during human-computer interaction in synergic modality...
Ladislav Kunc, Jan Kleindienst