Sciweavers

232 search results for » Modeling and Predicting Quality in Spoken Human-Computer Int...
HRI 2007 (ACM)
Elements of a spoken language programming interface for robots
In many settings, such as home care or mobile environments, demands on users' attention, users' anticipated level of formal training, or other on-site conditions will...
Tim Miller, Andrew Exley, William Schuler
CHI 1995 (ACM)
Integrating multiple cues for spoken language understanding
As spoken language interfaces for real-world systems become a practical possibility, it has become apparent that such interfaces will need to draw on a variety of cues from divers...
Karen Ward, David G. Novick
HRI 2010 (ACM)
Robust spoken instruction understanding for HRI
Natural human-robot interaction requires different and more robust models of natural language understanding (NLU) than non-embodied NLU systems. In particular, architectures are require...
Rehj Cantrell, Matthias Scheutz, Paul W. Schermerh...
CHI 2007 (ACM)
Keystroke-level model for advanced mobile phone interaction
The design of applications for mobile devices requires a different quality assessment than that used for desktop applications. Of the many aspects that have to be taken into acco...
Paul Holleis, Friederike Otto, Heinrich Hussmann, ...
INTETAIN 2005 (Springer)
Grounding Emotions in Human-Machine Conversational Systems
In this paper we investigate the role of user emotions in human-machine goal-oriented conversations. There has been a growing interest in predicting emotions from acted and non-act...
Giuseppe Riccardi, Dilek Z. Hakkani-Tür