CHI 2007 · ACM

EyePoint: practical pointing and selection using gaze and keyboard

We present a practical technique for pointing and selection using a combination of eye gaze and keyboard triggers. EyePoint uses a two-step progressive refinement process, fluidly stitched together in a look-press-look-release action, which makes it possible to compensate for the accuracy limitations of current state-of-the-art eye gaze trackers. While research in gaze-based pointing has traditionally focused on disabled users, EyePoint makes gaze-based pointing effective and simple enough for able-bodied users to use in their everyday computing tasks. As the cost of eye gaze tracking devices decreases, such gaze-based techniques can become a viable alternative for users who choose not to use a mouse, depending on their abilities, tasks and preferences.

Author Keywords: Pointing and Selection, Eye Pointing, Eye Tracking, Gaze-enhanced User Interface Design.

ACM Classification Keywords: H5.2 User Interfaces: Input devices and strategies.
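The abstract's look-press-look-release flow can be sketched roughly as follows: the first gaze fixation plus a key press captures and magnifies the region around the coarse gaze estimate, and the second fixation plus the key release selects the refined point, which is mapped back to screen coordinates. This is only an illustrative sketch; the tracker and screen APIs (GazeTracker, get_gaze_point, show_magnified, click) and the magnification and region-size values are hypothetical placeholders, not the authors' implementation.

# Minimal sketch of the two-step progressive refinement (look-press-look-release),
# assuming a hypothetical gaze-tracker and screen-overlay API.

MAGNIFICATION = 4      # assumed enlargement factor for the refinement view
REGION_SIZE = 120      # assumed side length (px) of the region grabbed around the gaze point

def on_hotkey_press(tracker, screen):
    """First 'look': user fixates the target and presses the trigger key."""
    x, y = tracker.get_gaze_point()                      # coarse gaze estimate
    region = screen.grab(x - REGION_SIZE // 2, y - REGION_SIZE // 2,
                         REGION_SIZE, REGION_SIZE)
    screen.show_magnified(region, MAGNIFICATION)         # progressive refinement step
    return x, y                                          # remember the coarse origin

def on_hotkey_release(tracker, screen, origin):
    """Second 'look': user fixates the target in the magnified view, then releases."""
    mx, my = tracker.get_gaze_point()                    # gaze within the magnified view
    ox, oy = screen.magnified_origin()                   # top-left of the magnified overlay
    # Map the refined gaze point back to original screen coordinates.
    target_x = origin[0] - REGION_SIZE // 2 + (mx - ox) / MAGNIFICATION
    target_y = origin[1] - REGION_SIZE // 2 + (my - oy) / MAGNIFICATION
    screen.hide_magnified()
    screen.click(int(target_x), int(target_y))           # perform the selection

The key design point, as the abstract notes, is that magnifying around the first fixation lets an imprecise tracker estimate be corrected by a second, finer fixation, so the division by the magnification factor shrinks the residual gaze error accordingly.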
Manu Kumar, Andreas Paepcke, Terry Winograd
Added 30 Nov 2009
Updated 30 Nov 2009
Type Conference
Year 2007
Where CHI
Authors Manu Kumar, Andreas Paepcke, Terry Winograd