Sciweavers

971 search results - page 36 / 195
» Observing users in multimodal interaction
CHI
2007
ACM
Computer aided observations of complex mobile situations
Designing mobile and wearable applications is a challenge. The context of use is more important than ever and traditional methodologies for elicitation and specification reach the...
Tobias Klug
ISMAR
2003
IEEE
SenseShapes: Using Statistical Geometry for Object Selection in a Multimodal Augmented Reality System
We introduce a set of statistical geometric tools designed to identify the objects being manipulated through speech and gesture in a multimodal augmented reality system. SenseShap...
Alex Olwal, Hrvoje Benko, Steven Feiner
TASLP
2008
A Study in Efficiency and Modality Usage in Multimodal Form Filling Systems
The usage patterns of speech and visual input modes are investigated as a function of relative input mode efficiency for both desktop and personal digital assistant (PDA) working ...
Manolis Perakakis, Alexandros Potamianos
CA
1997
IEEE
Layered Modular Action Control for Communicative Humanoids
Face-to-face interaction between people is generally effortless and effective. We exchange glances, take turns speaking, and make facial and manual gestures to achieve the goals of ...
Kristinn R. Thórisson
ATAL
2010
Springer
How was your day?: a companion ECA
We demonstrate a "Companion" ECA that is able to provide advice and support to the user, taking into account emotions she expresses through dialogue. The integratio...
Marc Cavazza, Raul Santos de la Camara, Markku Tur...