Sciweavers

587 search results » A Gesture Interface for Human-Robot-Interaction
CHI 2003 (ACM)
Developing a car gesture interface for use as a secondary task
Existing gesture-interface research has centered on controlling the user’s primary task. This paper explores the use of gestures to control secondary tasks while the user is focused...
Micah Alpern, Katie Minardo
CHI 2008 (ACM)
The see-Puck: a platform for exploring human-robot relationships
We present the see-Puck, a round display module that extends an open robot platform, the e-Puck. It holds 148 LEDs (light emitting diodes) to enable the presentation of eye-catchi...
Mattias Jacobsson, Johan Bodin, Lars Erik Holmquis...
ICMI 2004 (Springer)
Towards integrated microplanning of language and iconic gesture for multimodal output
When talking about spatial domains, humans frequently accompany their explanations with iconic gestures to depict what they are referring to. For example, when giving directions, ...
Stefan Kopp, Paul Tepper, Justine Cassell
NIME 2005 (Springer)
Scrubber: An Interface for Friction-induced Sounds
The Scrubber is a general controller for friction-induced sound. By allowing the user to engage in familiar gestures and feel actual friction, the synthesized sound gains an evoca...
Georg Essl, M. Sile O'Modhrain
CHI 2009 (ACM)
How do people talk with a robot?: an analysis of human-robot dialogues in the real world
This paper reports the preliminary results of a human-robot dialogue analysis in the real world with the goal of understanding users' interaction patterns. We analyzed the dia...
Min Kyung Lee, Maxim Makatchev