Sciweavers

CHI 2003, ACM

Hands on cooking: towards an attentive kitchen

To make human-computer interaction more transparent, different modes of communication need to be explored. We present eyeCOOK, a multimodal attentive cookbook that helps a non-expert computer user cook a meal. The user communicates using eye gaze and speech commands, and eyeCOOK responds visually and/or verbally, promoting communication through natural human input channels without physically encumbering the user. Our goal is to improve productivity and user satisfaction without creating additional demands on user attention. We describe how the user interacts with the eyeCOOK prototype and the role of this system in an Attentive Kitchen.

Keywords: Attentive User Interfaces, Gaze, Eye Tracking, Speech, Context-aware, Information Appliance, Sensors.
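The abstract describes combining two input channels: the user's eye gaze selects what is attended to, while a speech command says what to do with it. A minimal sketch of that dispatch idea, assuming hypothetical names (`Event`, `respond`) that are not from the paper and stand in for whatever the authors' prototype actually does:

```python
# Hypothetical sketch, not the authors' implementation: fuse a
# gaze-derived focus target with a recognized speech command and
# choose a visual or verbal response.
from dataclasses import dataclass

@dataclass
class Event:
    gaze_target: str   # e.g. the recipe step the user is looking at
    speech: str        # recognized speech command

def respond(event: Event) -> str:
    """Pick a response for one multimodal (gaze + speech) command."""
    if event.speech == "next step":
        return f"display: step after '{event.gaze_target}'"
    if event.speech == "read this":
        return f"speak: contents of '{event.gaze_target}'"
    return "display: help"

print(respond(Event("preheat oven", "read this")))
# speak: contents of 'preheat oven'
```

The point of the split is that speech carries the verb while gaze supplies the noun, so the user never has to name or point at on-screen content while their hands are busy cooking.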
Added: 01 Dec 2009
Updated: 01 Dec 2009
Type: Conference
Year: 2003
Where: CHI
Authors: Jeremy S. Bradbury, Jeffrey S. Shell, Craig B. Knowles