Sciweavers

194 search results - page 14 / 39
» Multimodality and Gestures in the Teacher
CHI
2004
ACM
ICARE: a component-based approach for the design and development of multimodal interfaces
Multimodal interactive systems support multiple interaction techniques such as the synergistic use of speech, gesture and eye gaze tracking. The flexibility they offer results in ...
Jullien Bouchet, Laurence Nigay
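To give a concrete flavor of the component-based fusion idea in the ICARE entry above, here is a minimal Python sketch of a complementarity-style fusion component that pairs speech and gesture events arriving within a short time window. The class and event names (Event, Complementarity) are illustrative assumptions, not the actual ICARE component model or API.

```python
# Minimal sketch of a component-based multimodal fusion step.
# Names (Event, Complementarity) are illustrative, not the ICARE API.
from dataclasses import dataclass

@dataclass
class Event:
    modality: str      # e.g. "speech" or "gesture"
    content: str       # recognized token, e.g. "delete that" or "point(120,45)"
    timestamp: float   # seconds

class Complementarity:
    """Fuses events from different modalities arriving within a time window."""
    def __init__(self, window: float = 1.0):
        self.window = window
        self.pending: list[Event] = []

    def push(self, event: Event) -> list[tuple[Event, Event]]:
        # Pair the new event with any pending event of another modality
        # that is close enough in time.
        fused = []
        for other in self.pending:
            if (other.modality != event.modality
                    and abs(other.timestamp - event.timestamp) <= self.window):
                fused.append((other, event))
        self.pending.append(event)
        return fused

# Usage: a spoken command and a deictic gesture fuse into one multimodal command.
fusion = Complementarity(window=1.0)
fusion.push(Event("speech", "delete that", 10.2))
print(fusion.push(Event("gesture", "point(120,45)", 10.6)))
```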
ACMDIS
2008
ACM
Exploring true multi-user multimodal interaction over a digital table
True multi-user, multimodal interaction over a digital table lets co-located people simultaneously gesture and speak commands to control an application. We explore this design spa...
Edward Tse, Saul Greenberg, Chia Shen, Clifton For...
COLING
2000
Finite-state Multimodal Parsing and Understanding
Multimodal interfaces require effective parsing and understanding of utterances whose content is distributed across multiple input modes. Johnston 1998 presents an approach in wh...
Michael Johnston, Srinivas Bangalore
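As a rough illustration of the finite-state idea in the entry above, the following Python sketch accepts aligned speech/gesture symbol streams against a tiny hand-written transition table. The grammar, symbols, and parse function are hypothetical toy assumptions; the published approach works with finite-state devices over multiple input streams, which this toy does not reproduce.

```python
# Toy finite-state acceptor over paired speech/gesture symbols, loosely in the
# spirit of finite-state multimodal parsing; not the authors' grammar or toolkit.
# EPS marks "no symbol consumed" on that stream for a given transition.
EPS = None

# Transitions: state -> {(speech_symbol, gesture_symbol): next_state}
TRANSITIONS = {
    0: {("email", EPS): 1},
    1: {("this", "point_person"): 2, ("these", "area_select"): 2},
    2: {},
}
ACCEPTING = {2}

def parse(speech, gesture, state=0):
    """Greedy left-to-right parse of aligned speech/gesture symbol streams."""
    s = g = 0
    while state in TRANSITIONS and (s < len(speech) or g < len(gesture)):
        for (sp, ge), nxt in TRANSITIONS[state].items():
            ok_sp = sp is EPS or (s < len(speech) and speech[s] == sp)
            ok_ge = ge is EPS or (g < len(gesture) and gesture[g] == ge)
            if ok_sp and ok_ge:
                s += 0 if sp is EPS else 1
                g += 0 if ge is EPS else 1
                state = nxt
                break
        else:
            return False  # no transition matches the remaining input
    return state in ACCEPTING and s == len(speech) and g == len(gesture)

print(parse(["email", "this"], ["point_person"]))   # True
print(parse(["email", "these"], ["point_person"]))  # False: gesture mismatch
```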
ICMI
2004
Springer
Exploiting prosodic structuring of coverbal gesticulation
Although gesture recognition has been studied extensively, communicative, affective, and biometrical “utility” of natural gesticulation remains relatively unexplored. One of t...
Sanshzar Kettebekov
MM
2003
ACM
DOVE: drawing over video environment
We demonstrate a multimedia system that integrates pen-based gesture and live video to support collaboration on physical tasks. The system combines network IP cameras, desktop PCs...
Jiazhi Ou, Xilin Chen, Susan R. Fussell, Jie Yang
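For the general flavor of drawing over live video described in the entry above (not the DOVE system itself), here is a minimal OpenCV sketch that overlays mouse-drawn strokes on frames from a local camera. The window name, stroke color, and the use of a mouse instead of a pen are all illustrative assumptions.

```python
# Minimal sketch of drawing strokes over a live video stream with OpenCV;
# illustrative only, not the DOVE system. Requires opencv-python and a camera
# (or replace 0 with a video file path / IP camera URL).
import cv2
import numpy as np

strokes: list[list[tuple[int, int]]] = []  # each stroke is a list of (x, y) points
drawing = False

def on_mouse(event, x, y, flags, param):
    """Collect freehand strokes from mouse drags (stand-in for pen input)."""
    global drawing
    if event == cv2.EVENT_LBUTTONDOWN:
        drawing = True
        strokes.append([(x, y)])
    elif event == cv2.EVENT_MOUSEMOVE and drawing:
        strokes[-1].append((x, y))
    elif event == cv2.EVENT_LBUTTONUP:
        drawing = False

cap = cv2.VideoCapture(0)
cv2.namedWindow("video-annotation")
cv2.setMouseCallback("video-annotation", on_mouse)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Render every stroke on top of the current video frame.
    for stroke in strokes:
        if len(stroke) > 1:
            pts = np.array(stroke, dtype=np.int32)
            cv2.polylines(frame, [pts], isClosed=False, color=(0, 0, 255), thickness=2)
    cv2.imshow("video-annotation", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```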