Sciweavers

Search results for "Multimodal human-computer interaction: A survey" (354 results, page 25 of 71)
HRI 2010 (ACM)
Multimodal interaction with an autonomous forklift
Abstract—We describe a multimodal framework for interacting with an autonomous robotic forklift. A key element enabling effective interaction is a wireless, handheld tablet with ...
Andrew Correa, Matthew R. Walter, Luke Fletcher, J...
INTERACT 2003
Usability Professionals' Personal Interest in Basic HCI theory
Abstract: This paper proposes a way to identify professional knowledge in a heterogeneous HCI (Human-Computer Interaction) community of usability professionals, designers and resear...
Torkil Clemmensen
ICRA 2009 (IEEE)
Generating Robot/Agent backchannels during a storytelling experiment
Abstract—This work presents the development of a real-time framework for the research of Multimodal Feedback of Robots/Talking Agents in the context of Human Robot Interaction (H...
Samer Al Moubayed, Malek Baklouti, Mohamed Chetoua...
CHI 2010 (ACM)
Knowing where and when to look in a time-critical multimodal dual task
Human-computer systems intended for time-critical multitasking need to be designed with an understanding of how humans can coordinate and interleave perceptual, memory, and motor ...
Anthony J. Hornof, Yunfeng Zhang, Tim Halverson
NORDICHI 2006 (ACM)
Tac-tiles: multimodal pie charts for visually impaired users
Tac-tiles is an accessible interface that allows visually impaired users to browse graphical information using tactile and audio feedback. The system uses a graphics tablet which ...
Steven A. Wall, Stephen A. Brewster