In this paper, we describe a robot that interacts with humans in a crowded conference environment. The robot detects faces, determines the shirt color of onlooking conference attendees, and reacts with a combination of speech, musical, and movement responses. It continuously updates an internal emotional state, modeled after findings from human psychology research. Using empirically determined mapping functions, the robot's position in this emotion space is translated into a particular set of sound and movement responses. We successfully demonstrated this system at the AAAI '05 Open Interaction Event, showing the potential of emotional modeling to improve human-robot interaction.
Geoffrey A. Hollinger, Yavor Georgiev, Anthony Man