Sciweavers

247 search results - page 38 / 50
» Multimodal expression in virtual humans
ACE
2007
ACM
14 years 1 month ago
An experimental setting to measure contextual perception of embodied conversational agents
We introduce an experimental setting to observe and measure the perception of facial expressions performed by embodied conversational agents (ECAs). The experimental set-up enables...
Michael Lankes, Regina Bernhaupt, Manfred Tschelig...
VRST
2000
ACM
14 years 2 months ago
Animated deformations with radial basis functions
We present a novel approach to deforming polygonal models with Radial Basis Functions (RBFs) to produce localized real-time deformations. Radial Basis Functions as...
Jun-yong Noh, Douglas Fidaleo, Ulrich Neumann
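The snippet only names the technique, but the general RBF-deformation idea can be sketched. A minimal sketch, assuming NumPy and a Gaussian kernel; the authors' actual kernel choice, solver, and real-time machinery are not shown in the abstract and may differ. Weights are fitted so that the deformation field interpolates the control-point displacements, then every mesh vertex is offset by a weighted sum of basis functions.

    # Minimal sketch of RBF-driven mesh deformation (Gaussian kernel assumed).
    import numpy as np

    def rbf_kernel(r, sigma=1.0):
        # Gaussian basis; sigma localizes the deformation around each control.
        return np.exp(-(r / sigma) ** 2)

    def fit_rbf_weights(controls, displacements, sigma=1.0):
        # Solve Phi @ W = D so the field interpolates the control displacements.
        r = np.linalg.norm(controls[:, None, :] - controls[None, :, :], axis=-1)
        return np.linalg.solve(rbf_kernel(r, sigma), displacements)

    def deform(vertices, controls, weights, sigma=1.0):
        # Offset each vertex by a weighted sum of basis functions
        # centered at the control points.
        r = np.linalg.norm(vertices[:, None, :] - controls[None, :, :], axis=-1)
        return vertices + rbf_kernel(r, sigma) @ weights

    # Usage: drag one control point of a small patch upward by 0.5.
    controls = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.]])
    displacements = np.array([[0., 0., 0.5], [0., 0., 0.], [0., 0., 0.]])
    weights = fit_rbf_weights(controls, displacements)
    vertices = np.random.rand(100, 3)
    deformed = deform(vertices, controls, weights)

Because the Gaussian kernel falls off with distance, vertices far from the dragged control point barely move, which is what makes the deformation localized.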
HAPTICS
2005
IEEE
14 years 3 months ago
Effect of Cognitive Load on Tactor Location Identification in Zero-g
A wearable haptic interface has been developed to impart vibrotactile information to its user with the goal of improving situation awareness. The effectiveness of the haptic inter...
Anu Bhargava, Michael Scott, Ryan Traylor, Roy Chu...
ACII
2005
Springer
14 years 3 months ago
An Adaptive Personality Model for ECAs
Curtin University’s Talking Heads (TH) combine an MPEG-4 compliant Facial Animation Engine (FAE), a Text To Emotional Speech Synthesiser (TTES), a multi-modal Dialogue Manager (...
He Xiao, Donald Reid, Andrew Marriott, E. K. Gulla...
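A hypothetical sketch of how the components named in the abstract (FAE, TTES, Dialogue Manager) might be wired together, with personality traits biasing each stage; the class interfaces, trait dimensions, and stub behaviour here are invented for illustration and are not the paper's actual design.

    # Hypothetical wiring of the TH pipeline; interfaces are assumptions.
    from dataclasses import dataclass

    @dataclass
    class Personality:
        # Assumed adaptive traits; the paper's model may use other dimensions.
        extraversion: float = 0.5

    class DialogueManager:
        def respond(self, user_input: str, p: Personality) -> str:
            # Stub: personality modulates verbosity of the chosen reply.
            return "I see." if p.extraversion < 0.5 else "That's interesting, tell me more!"

    class TTES:
        def synthesise(self, text: str, p: Personality) -> bytes:
            # Stub for emotional speech synthesis; returns audio bytes.
            return text.encode("utf-8")

    class FAE:
        def animate(self, audio: bytes, p: Personality) -> None:
            # Stub: would drive MPEG-4 facial animation parameters from audio.
            print(f"animating {len(audio)} bytes of speech")

    def turn(user_input: str, p: Personality) -> None:
        # One conversational turn: DM text -> emotional speech -> facial animation.
        dm, tts, face = DialogueManager(), TTES(), FAE()
        text = dm.respond(user_input, p)
        face.animate(tts.synthesise(text, p), p)

    turn("Hello!", Personality(extraversion=0.8))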
CW
2003
IEEE
14 years 3 months ago
Disappearing Computers, Social Actors and Embodied Agents
User interfaces now support multimodal interaction. Many research and prototype systems have introduced embodied agents, assuming that they allow a more n...
Anton Nijholt