True multi-user, multimodal interaction over a digital table lets co-located people simultaneously gesture and speak commands to control an application. We explore this design spa...
Edward Tse, Saul Greenberg, Chia Shen, Clifton For...
Abstract—We describe a multimodal framework for interacting with an autonomous robotic forklift. A key element enabling effective interaction is a wireless, handheld tablet with ...
Andrew Correa, Matthew R. Walter, Luke Fletcher, J...
While sketches are commonly and effectively used in the early stages of design, some information is far more easily conveyed verbally than by sketching. In response, we have combi...
To make human-computer interaction more transparent, different modes of communication need to be explored. We present eyeCOOK, a multimodal attentive cookbook to help a non-expert...
Jeremy S. Bradbury, Jeffrey S. Shell, Craig B. Kno...
Large sensor networks in applications such as surveillance and virtual classrooms have to deal with the explosion of sensor information. Coherent presentation of data coming from...
Shichao Ou, Deepak R. Karuppiah, Andrew H. Fagg, E...