EmotionFace is a software interface for visually displaying the self-reported emotion expressed by music. Viewed in reverse, it is a facial expression whose auditory counterpart is the time-synchronized, associated music. The present instantiation of the software uses a simple schematic face whose eyes and mouth move according to a parabolic model: the smiling and frowning of the mouth represent valence (happiness and sadness), and the degree of eye opening represents arousal. Continuous emotional responses to music collected in previous research have been used to test and calibrate EmotionFace. The interface provides an alternative to presenting data on a two-dimensional emotion space, the same space used to collect emotional responses to music. These synthesized facial expressions make the emotion data expressed by music easier for a human observer to process and may offer a more natural interface between the human and computer.
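The parabolic mapping described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes valence and arousal are normalized to [-1, 1], that mouth curvature scales linearly with valence (upward parabola for positive valence, downward for negative), and that eye aperture scales linearly with arousal. The function name and scaling constants are hypothetical.

```python
import numpy as np

def face_frame(valence: float, arousal: float, n: int = 50):
    """Compute schematic face geometry for one animation frame.

    valence, arousal: assumed normalized to [-1, 1] (an assumption,
    not taken from the paper).
    Returns mouth x/y coordinates and an eye-opening height.
    """
    x = np.linspace(-1.0, 1.0, n)
    # Parabolic mouth: curvature follows valence.
    # valence > 0 -> center higher than corners (smile);
    # valence < 0 -> center lower than corners (frown).
    mouth_y = valence * (1.0 - x**2) * 0.5 - 0.6
    # Eye opening grows with arousal (illustrative linear scaling).
    eye_opening = 0.1 + 0.2 * (arousal + 1.0) / 2.0
    return x, mouth_y, eye_opening
```

Fed a stream of time-synchronized (valence, arousal) samples, such a function would yield one face pose per sample, which a rendering loop could draw in step with the music.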