Sciweavers

» Interactive Speech Understanding
ISER 2004 (Springer)
Interactive Multi-Modal Robot Programming
As robots enter the human environment and come in contact with inexperienced users, they need to be able to interact with users in a multi-modal fashion—keyboard and mouse are n...
Soshi Iba, Christiaan J. J. Paredis, Pradeep K. Kh...
HCI 2009
Using 3D Touch Interaction for a Multimodal Zoomable User Interface
Touchscreens are becoming the preferred input device in a growing number of applications, and they are increasingly being introduced into the automotive domain...
Florian Laquai, Markus Ablaßmeier, Tony Poit...
TMM 2011
MIMiC: Multimodal Interactive Motion Controller
Abstract—We introduce a new algorithm for real-time interactive motion control and demonstrate its application to motion-capture data, pre-recorded videos and HCI. Firstly, a da...
Dumebi Okwechime, Eng-Jon Ong, Richard Bowden
CVPR 2007 (IEEE)
DigiTable: an interactive multiuser table for collocated and remote collaboration enabling remote gesture visualization
We present DIGITABLE, an experimental platform that we hope lessens the gap between co-present and distant interaction. DIGITABLE combines a multiuser tactile interactive tabletop, ...
Francois Coldefy, Stéphane Louis Dit Picard
NIPS 2007
EEG-Based Brain-Computer Interaction: Improved Accuracy by Automatic Single-Trial Error Detection
Brain-computer interfaces (BCIs), like any other interaction modality based on physiological signals and body channels (e.g., muscular activity, speech and gestures), are prone to e...
Pierre W. Ferrez, José del R. Millán