This paper presents a tactile language for controlling a robot through its artificial skin. This language greatly improves multimodal human-robot communication by ...
Adding expressive haptic feedback to mobile devices has great potential to improve their usability, particularly in multitasking situations where one’s visual attention is requi...
Rock Leung, Karon E. MacLean, Martin Bue Bertelsen...
We address the problem of automatic interpretation of nonexaggerated human facial and body behaviours captured in video. We illustrate our approach by three examples. (1) We intro...
This paper reports on automatic prediction of dialogue acts and address types in three-party conversations. Dialogue acts and address types are predicted simultaneously on our frame...
In this paper we introduce a system that automatically adds different types of non-verbal behavior to a given dialogue script between two virtual embodied agents. It allows us to ...
Werner Breitfuss, Helmut Prendinger, Mitsuru Ishiz...
The rapid development of large interactive wall displays has been accompanied by research on methods that allow people to interact with the display at a distance. The basic method...
In the context of natural multimodal dialogue systems, we address the challenging issue of the definition of cooperative answers in an appropriate multimodal form. Highlighting th...
Meriam Horchani, Benjamin Caron, Laurence Nigay, F...
Automated analysis of human affective behavior has attracted increasing attention from researchers in psychology, computer science, linguistics, neuroscience, and related discipli...
Zhihong Zeng, Maja Pantic, Glenn I. Roisman, Thoma...