Sciweavers

971 search results for "Observing users in multimodal interaction" (page 32 of 195)
CHI 2002, ACM
Evaluating look-to-talk: a gaze-aware interface in a collaborative environment
We present "look-to-talk", a gaze-aware interface for directing a spoken utterance to a software agent in a multiuser collaborative environment. Through a prototype and ...
Alice Oh, Harold Fox, Max Van Kleek, Aaron Adler, ...
HCI 2009
Multimodal Corpus Analysis as a Method for Ensuring Cultural Usability of Embodied Conversational Agents
In this paper we propose multimodal corpus analysis as a method for collecting sufficient empirical data to model the behavior of embodied conversational agents. This is a prerequisite ...
Yukiko I. Nakano, Matthias Rehm
ICMI 2005, Springer
A look under the hood: design and development of the first SmartWeb system demonstrator
Experience shows that decisions in the early phases of the development of a multimodal system prevail throughout the life-cycle of a project. The distributed architecture and the ...
Norbert Reithinger, Simon Bergweiler, Ralf Engel, ...
CHI 2010, ACM
Crosstrainer: testing the use of multimodal interfaces in situ
We report the results of an exploratory 8-day field study of CrossTrainer: a mobile game with crossmodal audio and tactile feedback. Our research focuses on the longitudinal effec...
Eve E. Hoggan, Stephen A. Brewster
IADIS 2003
A Multimodal Interface for Digital Talking Books
This paper presents a framework for the production of digital talking books. These books primarily target the visually impaired community, but users with other characteristics can...
Carlos Duarte, Teresa Chambel, Luís Carriço, ...