We analyze a corpus of referring expressions collected from user interactions with a multimodal travel guide application. The analysis suggests that, in dramatic contrast to norma...
How can an adaptive intelligent interface decide what particular action to perform in a given situation, as a function of perceived properties of the user and the situation? Ide...
At the Human-Computer Interaction Lab (HCILab) at UNC Charlotte, we investigate novel ways for people to interact with computers, and through computers with their environments. Ou...
Abstract. In this paper, we present a hand posture recognition system (covering both configuration and position) that we designed as part of a gestural man-machine interface. After a simple image pre...
RADAR is a multiagent system with a mixed-initiative user interface designed to help office workers cope with email overload. RADAR agents observe experts to learn models of their...
Aaron Steinfeld, Andrew Faulring, Asim Smailagic, ...