Sciweavers

» Interacting with the Computer Using Gaze Gestures
AIHC
2007
Springer
14 years 2 months ago
Gaze-X: Adaptive, Affective, Multimodal Interface for Single-User Office Scenarios
This paper describes an intelligent system that we developed to support affective multimodal human-computer interaction (AMM-HCI) where the user’s actions and emotions are modele...
Ludo Maat, Maja Pantic
CHI
2009
ACM
14 years 9 months ago
MicroRolls: expanding touch-screen input vocabulary by distinguishing rolls vs. slides of the thumb
The input vocabulary for touch-screen interaction on handhelds is dramatically limited, especially when the thumb must be used. To enrich that vocabulary we propose to discrimina...
Anne Roudaut, Eric Lecolinet, Yves Guiard
MMSYS
2012
12 years 4 months ago
6DMG: a new 6D motion gesture database
Motion-based control is gaining popularity, and motion gestures form a complementary modality in human-computer interactions. To achieve more robust user-independent motion gestur...
Mingyu Chen, Ghassan Al-Regib, Biing-Hwang Juang
AUSAI
2005
Springer
14 years 2 months ago
Intelligent 3D Video Avatar for Immersive Telecommunication
Immersive telecommunication is a challenging new field that enables a user to share a virtual space with remote participants. The main objective is to offer rich communication moda...
Sang Yup Lee, Ig-Jae Kim, Sang Chul Ahn, Myo-Taeg ...
TSI
2010
13 years 3 months ago
Greta, an expressive and interactive conversational agent platform
This paper presents a generic, modular and interactive architecture for an embodied conversational agent called Greta. It is a 3D agent able to communicate with users using verbal and n...
Etienne de Sevin, Radoslaw Niewiadomski, Elisabett...