Sciweavers

304 search results, page 29 of 61
Search query: A Multi-Context System Computing Modalities
CHI 2004 (ACM)
ICARE: a component-based approach for the design and development of multimodal interfaces
Multimodal interactive systems support multiple interaction techniques, such as the synergistic use of speech, gesture, and eye-gaze tracking. The flexibility they offer results in ...
Jullien Bouchet, Laurence Nigay
ICPR 2010 (IEEE)
Crossmodal Matching of Speakers Using Lip and Voice Features in Temporally Non-Overlapping Audio and Video Streams
Person identification using audio (speech) and visual (facial appearance, static or dynamic) modalities, either independently or jointly, is a thoroughly investigated problem in pa...
Anindya Roy, Sebastien Marcel
MM 2005 (ACM)
Multimodal affect recognition in learning environments
We propose a multi-sensor affect recognition system and evaluate it on the challenging task of classifying interest (or disinterest) in children trying to solve an educational pu...
Ashish Kapoor, Rosalind W. Picard
PUC 2010
A database-based framework for gesture recognition
Abstract: Gestures are an important modality for human-machine communication. Computer vision modules performing gesture recognition can be important components of intelligent homes...
Vassilis Athitsos, Haijing Wang, Alexandra Stefan
CGF 2006
Physically Based Deformable Models in Computer Graphics
Physically based deformable models have been widely embraced by the computer graphics community. Many problems outlined in a previous survey by Gibson and Mirtich [GM97] have been...
Andrew Nealen, Matthias Müller, Richard Keise...