Sciweavers

194 search results for "Multimodality and Gestures in the Teacher" (page 26 of 39)
PDC 2006 (ACM)
A participatory design agenda for ubiquitous computing and multimodal interaction: a case study of dental practice
This paper reflects upon our attempts to bring a participatory design approach to design research into interfaces that better support dental practice. The project brought together...
Tim Cederman-Haysom, Margot Brereton
RIAO 2000
Multimodal Meeting Tracker
Face-to-face meetings usually encompass several modalities including speech, gesture, handwriting, and person identification. Recognition and integration of each of these modaliti...
Michael Bett, Ralph Gross, Hua Yu, Xiaojin Zhu, Yu...
BIOSTEC 2011
On the Benefits of Speech and Touch Interaction with Communication Services for Mobility Impaired Users
Although technology for communication has evolved tremendously over the past decades, mobility-impaired individuals still face many difficulties interacting with communication serv...
Carlos Galinho Pires, Fernando Miguel Pinto, Eduar...
ICMI 2005 (Springer)
Contextual recognition of head gestures
Head pose and gesture offer several key conversational grounding cues and are used extensively in face-to-face interaction among people. We investigate how dialog context from an ...
Louis-Philippe Morency, Candace L. Sidner, Christo...
ATAL 2009 (Springer)
Increasing the expressiveness of virtual agents: autonomous generation of speech and gesture for spatial description tasks
Embodied conversational agents are required to be able to express themselves convincingly and autonomously. Based on an empirical study on spatial descriptions of landmarks in dire...
Kirsten Bergmann, Stefan Kopp