Sciweavers

971 search results - page 21 / 195
Search: Observing users in multimodal interaction
COLING 2000
Taking Account of the User's View in 3D Multimodal Instruction Dialogue
While recent advancements in virtual reality technology have created a rich communication interface linking humans and computers, there has been little work on building dialogue s...
Yukiko I. Nakano, Kenji Imamura, Hisashi Ohara
ICMI 2009, Springer
Multi-modal features for real-time detection of human-robot interaction categories
Social interactions unfold over time, at multiple time scales, and can be observed through multiple sensory modalities. In this paper, we propose a machine learning framework for ...
Ian R. Fasel, Masahiro Shiomi, Philippe-Emmanuel Ch...
ICMI 2004, Springer
Multimodal interaction for distributed collaboration
We demonstrate a same-time different-place collaboration system for managing crisis situations using geospatial information. Our system enables distributed spatial decision-making...
Levent Bolelli, Guoray Cai, Hongmei Wang, Bita Mor...
DPPI 2003, ACM
Observing and probing
In this paper, we discuss and compare two user-centred methods applied in concept design: observation and probes. The comparison is based on findings from two case studies. In the...
Vesa Jääskö, Tuuli Mattelmäki
CHI 2003, ACM
Hands on cooking: towards an attentive kitchen
To make human-computer interaction more transparent, different modes of communication need to be explored. We present eyeCOOK, a multimodal attentive cookbook to help a non-expert...
Jeremy S. Bradbury, Jeffrey S. Shell, Craig B. Kno...