Sciweavers

194 search results - page 38 of 39
Search: Multimodality and Gestures in the Teacher
CHI 2001, ACM
Listen reader: an electronically augmented paper-based book
While predictions abound that electronic books will supplant traditional paper-based books, many people bemoan the coming loss of the book as cultural artifact. In this project we...
Maribeth Back, Jonathan Cohen, Rich Gold, Steve R....
AIHC 2007, Springer
Modeling Naturalistic Affective States Via Facial, Vocal, and Bodily Expressions Recognition
Affective and human-centered computing have attracted a lot of attention during the past years, mainly due to the abundance of devices and environments able to exploit multimodal i...
Kostas Karpouzis, George Caridakis, Loïc Kess...
EUROSSC 2007, Springer
The Design of a Pressure Sensing Floor for Movement-Based Human Computer Interaction
This paper addresses the design of a large area, high resolution, networked pressure sensing floor with primary application in movement-based human-computer interaction (M-HCI). T...
Sankar Rangarajan, Assegid Kidané, Gang Qia...
MLMI 2005, Springer
The AMI Meeting Corpus: A Pre-announcement
The AMI Meeting Corpus is a multi-modal data set consisting of 100 hours of meeting recordings. It is being created in the context of a project that is developing meeting...
Jean Carletta, Simone Ashby, Sebastien Bourban, Mi...
FGR 1998, IEEE
A Virtual Mirror Interface Using Real-Time Robust Face Tracking
We describe a virtual mirror interface which can react to people using robust, real-time face tracking. Our display can directly combine a user's face with various graphical ...
Trevor Darrell, Gaile G. Gordon, John Woodfill, Mi...