IUI 2005, ACM

A framework for designing intelligent task-oriented augmented reality user interfaces

A task-oriented space can benefit from an augmented reality interface that layers existing tools and surfaces with useful information to make cooking easier, safer, and more efficient. To serve experienced users as well as novices, augmented reality interfaces need to adapt modalities to the user’s expertise and allow for multiple ways to perform tasks. We present a framework for designing an intelligent user interface that informs and choreographs multiple tasks in a single space according to a model of tasks and users. A residential kitchen has been outfitted with systems to gather data from tools and surfaces and to project multi-modal interfaces back onto those tools and surfaces. Based on user evaluations of this augmented reality kitchen, we propose a system that tailors information modalities to the spatial and temporal qualities of the task and to the expertise, location, and progress of the user. The intelligent augmented reality user interface choreographs multiple t...
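The abstract describes choosing a presentation modality from a model of the task and the user (expertise, location, task urgency and spatial precision). The sketch below is not the authors' system; it is a minimal, hypothetical illustration of that idea, with invented names (Modality, Task, User, choose_modality) and made-up decision rules, shown only to make the framing concrete.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Modality(Enum):
    """Output channels an augmented kitchen surface might use (illustrative only)."""
    TEXT_OVERLAY = auto()      # step-by-step instructions projected as text
    GRAPHIC_OVERLAY = auto()   # icons or animations projected onto tools and surfaces
    AMBIENT_CUE = auto()       # subtle light/colour cues for experienced users


@dataclass
class Task:
    name: str
    spatially_precise: bool    # e.g. knife work benefits from on-surface graphics
    time_critical: bool        # e.g. a pot about to boil over needs an immediate cue


@dataclass
class User:
    expertise: float           # 0.0 = novice, 1.0 = expert
    near_task_surface: bool    # whether the user is at the relevant surface


def choose_modality(task: Task, user: User) -> Modality:
    """Pick a modality from simple task and user attributes (hypothetical rules)."""
    if task.time_critical:
        # Urgent states surface as ambient cues regardless of expertise.
        return Modality.AMBIENT_CUE
    if user.expertise < 0.5:
        # Novices get explicit instructions; graphics when the task is tied
        # to a specific spot on the counter, text otherwise.
        return Modality.GRAPHIC_OVERLAY if task.spatially_precise else Modality.TEXT_OVERLAY
    # Experts only need lightweight reminders.
    return Modality.AMBIENT_CUE if user.near_task_surface else Modality.TEXT_OVERLAY


if __name__ == "__main__":
    chopping = Task("chop vegetables", spatially_precise=True, time_critical=False)
    novice = User(expertise=0.2, near_task_surface=True)
    print(choose_modality(chopping, novice))  # Modality.GRAPHIC_OVERLAY
```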
Leonardo Bonanni, Chia-Hsun Lee, Ted Selker