Sciweavers

395 search results for "When do we interact multimodally" - page 12 / 79
AIHC 2007 (Springer)
Gaze-X: Adaptive, Affective, Multimodal Interface for Single-User Office Scenarios
This paper describes an intelligent system that we developed to support affective multimodal human-computer interaction (AMM-HCI) where the user’s actions and emotions are modele...
Ludo Maat, Maja Pantic
INTERACT 2003
Facial Orientation During Multi-party Interaction with Information Kiosks
We hypothesize that the performance of multimodal perceptive user interfaces during multi-party interaction may be improved by using facial orientation of users as a cue for iden...
Ilse Bakx, Koen van Turnhout, Jacques M. B. Terken
VR 2000 (IEEE)
Multimodal Menu Presentation and Selection in Immersive Virtual Environments
Usability has become one of the key ingredients in making virtual reality (VR) systems work, and a big part of a usable VR system is in the design of effective interface/interacti...
Namgyu Kim, Gerard Jounghyun Kim, Chan-Mo Park, In...
CHI 2011 (ACM)
What Did I Miss? In-Meeting Review Using Multimodal Accelerated Instant Replay (AIR) Conferencing
People sometimes miss small parts of meetings and need to quickly catch up without disrupting the rest of the meeting. We developed an Accelerated Instant Replay (AIR) Conferencin...
Sasa Junuzovic, Kori Inkpen, Rajesh Hegde, Zhengyo...
OZCHI 2009 (ACM)
Doing things backwards: the OWL project
The OWL project is inspired by Arthur C. Clarke's Third Law: any sufficiently advanced technology is indistinguishable from magic. It consists of a s...
Danielle Wilde, Kristina Andersen