Sciweavers

89 search results - page 12 / 18
Search: Perceptual user interfaces using vision-based eye tracking
CHI 2007, ACM
GUIDe: gaze-enhanced UI design
The GUIDe (Gaze-enhanced User Interface Design) project in the HCI Group at Stanford University explores how gaze information can be effectively used as an augmented input in addi...
Manu Kumar, Terry Winograd
IUI 2006, ACM
Head gesture recognition in intelligent interfaces: the role of context in improving recognition
Acknowledging an interruption with a nod of the head is a natural and intuitive communication gesture that can be performed without significantly disturbing a primary interface ...
Louis-Philippe Morency, Trevor Darrell
ICMI 2005, Springer
Inferring body pose using speech content
Untethered multimodal interfaces are more attractive than tethered ones because they are more natural and expressive for interaction. Such interfaces usually require robust vision...
Sy Bor Wang, David Demirdjian
CHI 2004, ACM
ICARE: a component-based approach for the design and development of multimodal interfaces
Multimodal interactive systems support multiple interaction techniques such as the synergistic use of speech, gesture and eye gaze tracking. The flexibility they offer results in ...
Jullien Bouchet, Laurence Nigay
CHI 2010, ACM
Input precision for gaze-based graphical passwords
Click-based graphical passwords have been proposed as alternatives to text-based passwords, despite being potentially vulnerable to shoulder-surfing, where an attacker can learn p...
Alain Forget, Sonia Chiasson, Robert Biddle