Sciweavers

121 search results, page 8 of 25, for "Augmenting user interfaces with adaptive speech commands"
ACMDIS 2008 (ACM)
Exploring true multi-user multimodal interaction over a digital table
True multi-user, multimodal interaction over a digital table lets co-located people simultaneously gesture and speak commands to control an application. We explore this design spa...
Edward Tse, Saul Greenberg, Chia Shen, Clifton For...
HRI 2010 (ACM)
Multimodal interaction with an autonomous forklift
Abstract—We describe a multimodal framework for interacting with an autonomous robotic forklift. A key element enabling effective interaction is a wireless, handheld tablet with ...
Andrew Correa, Matthew R. Walter, Luke Fletcher, J...
IUI 2004 (ACM)
Speech and sketching for multimodal design
While sketches are commonly and effectively used in the early stages of design, some information is far more easily conveyed verbally than by sketching. In response, we have combi...
Aaron Adler, Randall Davis
CHI 2003 (ACM)
Hands on cooking: towards an attentive kitchen
To make human computer interaction more transparent, different modes of communication need to be explored. We present eyeCOOK, a multimodal attentive cookbook to help a non-expert...
Jeremy S. Bradbury, Jeffrey S. Shell, Craig B. Kno...
PERCOM 2004 (ACM)
An Augmented Virtual Reality Interface for Assistive Monitoring of Smart Spaces
Large sensor networks in applications such as surveillance and virtual classrooms have to deal with the explosion of sensor information. Coherent presentation of data coming from...
Shichao Ou, Deepak R. Karuppiah, Andrew H. Fagg, E...