This paper addresses the problem of discovering conversational group dynamics from nonverbal cues extracted from thin slices of interaction. We first propose and analyze a novel t...
Affect sensitivity is of the utmost importance for a robot companion to be able to display socially intelligent behaviour, a key requirement for sustaining long-term interactions ...
During multiparty meetings, participants can use non-verbal modalities such as hand gestures to make reference to the shared environment. Therefore, one hypothesis is that incorpo...
This article proposes an evaluation framework to benchmark the performance of multimodal fusion engines. The paper first introduces different concepts and techniques associated wi...
This article introduces HephaisTK, a toolkit for rapid prototyping of multimodal interfaces. After briefly discussing the state of the art, the architecture traits of the toolkit ...
In a typical group meeting involving discussion and collaboration, people look at one another, at shared information resources such as presentation material, and also at nothing i...
Multimodal interfaces combining natural modalities such as speech and touch with dynamic graphical user interfaces can make it easier and more effective for users to interact wit...