We are currently developing an adaptive, emotional, and expressive interface agent that learns when and how to notify users about self-assigned tasks and events. In this paper, ...
We examine the utility of multiple types of turn-level and contextual linguistic features for automatically predicting student emotions in human-human spoken tutoring dialogues. W...
Human-centred services are increasingly common in the market of mobile devices. However, affect-aware services are still scarce. In turn, the recognition of secondary emotio...
At the t2i Lab we focus on tangible user interfaces (TUIs) to advance and improve the user experience in computer supported learning and problem solving. By directly interacting w...
This paper describes recognition of the emotions of an unknown person during natural walking. As gait data is redundant, high-dimensional, and variable, effective feature extraction is ...