Sciweavers

89 search results - page 9 / 18
» Perceptual user interfaces using vision-based eye tracking
OZCHI
2006
ACM
LookPoint: an evaluation of eye input for hands-free switching of input devices between multiple computers
We present LookPoint, a system that uses eye input for switching input between multiple computing devices. LookPoint uses an eye tracker to detect which screen the user is looking...
Connor Dickie, Jamie Hart, Roel Vertegaal, Alex Ei...
CHI
2009
ACM
Disambiguating ninja cursors with eye gaze
Ninja cursors aim to speed up target selection on large or multiple monitors. Several cursors are displayed on the screen with one of them selected as the active cursor. Eye track...
Kari-Jouko Räihä, Oleg Spakov
CHI
2005
ACM
WebGazeAnalyzer: a system for capturing and analyzing web reading behavior using eye gaze
Capturing and analyzing the detailed eye movements of a user while reading a web page can reveal much about the ways in which web reading occurs. The WebGazeAnalyzer system descri...
David Beymer, Daniel M. Russell
CHI
2010
ACM
Knowing where and when to look in a time-critical multimodal dual task
Human-computer systems intended for time-critical multitasking need to be designed with an understanding of how humans can coordinate and interleave perceptual, memory, and motor ...
Anthony J. Hornof, Yunfeng Zhang, Tim Halverson
CHI
2005
ACM
Combining head tracking and mouse input for a GUI on multiple monitors
The use of multiple LCD monitors is becoming popular as prices are reduced, but this creates problems for window management and switching between applications. For a single monito...
Mark Ashdown, Kenji Oka, Yoichi Sato