Sciweavers

62 search results - page 8 / 13
» A Salience-Driven Approach to Speech Recognition for Human-R...
NIPS
2007
EEG-Based Brain-Computer Interaction: Improved Accuracy by Automatic Single-Trial Error Detection
Brain-computer interfaces (BCIs), like any other interaction modality based on physiological signals and body channels (e.g., muscular activity, speech, and gestures), are prone to e...
Pierre W. Ferrez, José del R. Millán
INTERACT
2003
Designing and Prototyping Multimodal Commands
Designing and implementing multimodal applications that take advantage of several recognition-based interaction techniques (e.g., speech and gesture recognition) is a diffi...
Marie-Luce Bourguet
HRI
2009
ACM
An affective guide robot in a shopping mall
To explore possible robot tasks in daily life, we developed a guide robot for a shopping mall and conducted a field trial with it. The robot was designed to interact naturally wit...
Takayuki Kanda, Masahiro Shiomi, Zenta Miyashita, ...
SIGIR
2008
ACM
Spoken content retrieval: Searching spontaneous conversational speech
The second workshop on Searching Spontaneous Conversational Speech (SSCS 2008) was held in Singapore on July 24, 2008, in conjunction with the 31st Annual International ACM SIGIR C...
Joachim Köhler, Martha Larson, Franciska de J...
ISER
2004
Springer
Interactive Multi-Modal Robot Programming
As robots enter the human environment and come in contact with inexperienced users, they need to be able to interact with users in a multi-modal fashion—keyboard and mouse are n...
Soshi Iba, Christiaan J. J. Paredis, Pradeep K. Kh...