Sciweavers

54 search results - page 6 / 11
» Distributed speech processing in miPad's multimodal user int...
CICLING
2004
Springer
A Modal Logic Framework for Human-Computer Spoken Interaction
Abstract. One major goal of human-computer interfaces is to simplify the communication task. Traditionally, users have been restricted to the language of computers for this task. W...
Luis Villaseñor Pineda, Manuel Montes-y-Gómez...
ATAL
2009
Springer
Increasing the expressiveness of virtual agents: autonomous generation of speech and gesture for spatial description tasks
Embodied conversational agents are required to be able to express themselves convincingly and autonomously. Based on an empirical study on spatial descriptions of landmarks in dire...
Kirsten Bergmann, Stefan Kopp
CHI
2009
ACM
A biologically inspired approach to learning multimodal commands and feedback for human-robot interaction
In this paper we describe a method to enable a robot to learn how a user gives commands and feedback to it by speech, prosody and touch. We propose a biologically inspired approac...
Anja Austermann, Seiji Yamada
AUIC
2004
IEEE
Wearable Microphone Array as User Interface
We are at present enabled with machine-empowered technologies. The future certainly looks towards human-empowered technologies, which should enable mobile users with natural w...
Yong Xu, Mingjiang Yang, Yanxin Yan, Jianfeng Chen
CHI
2003
ACM
XWand: UI for intelligent spaces
The XWand is a novel wireless sensor package that enables styles of natural interaction with intelligent environments. For example, a user may point the wand at a device and contr...
Andrew Wilson, Steven A. Shafer