Sciweavers

194 search results, page 7 of 39
Search: Multimodality and Gestures in the Teacher
AVI 2006
Enabling interaction with single user applications through speech and gestures on a multi-user tabletop
Co-located collaborators often work over physical tabletops with rich geospatial information. Previous research shows that people use gestures and speech as they interact with art...
Edward Tse, Chia Shen, Saul Greenberg, Clifton For...
HICSS 2002, IEEE
Designing for Community: The Effects of Gender Representation in Videos on a Web Site
This paper analyzes a professional development Web site for teachers that features ‘virtual classroom visits’—video clips of teachers teaching, together with asynchronous fo...
Susan Herring, Anna Martinson, Rebecca Scheckler
ICCV 2005, IEEE
Multimodal Human Computer Interaction: A Survey
In this paper we review the major approaches to multimodal human computer interaction from a computer vision perspective. In particular, we focus on body, gesture, gaze, ...
Alejandro Jaimes, Nicu Sebe
CVIU 2007
Multimodal human-computer interaction: A survey
In this paper we review the major approaches to multimodal human computer interaction from a computer vision perspective. In particular, we focus on body, gesture, gaze, ...
Alejandro Jaimes, Nicu Sebe
ICMCS 2006, IEEE
Combined Gesture-Speech Analysis and Speech Driven Gesture Synthesis
Multimodal speech and speaker modeling and recognition are widely accepted as vital aspects of state of the art human-machine interaction systems. While correlations between speec...
Mehmet Emre Sargin, Oya Aran, Alexey Karpov, Ferda...