In this paper, we report on our efforts in developing affective character-based interfaces, i.e., interfaces that recognize and measure a user's affective information and address user affect by employing embodied characters. In particular, we describe the Empathic Companion, an animated interface agent that accompanies the user in the setting of a virtual job interview. This interface application takes a user's physiological data (skin conductance and electromyography) in real time, interprets them as emotions, and addresses the user's affective states in the form of empathic feedback. The Empathic Companion is conceived as an educational agent that supports job seekers in preparing for a job interview. We also present results from an exploratory study that aimed to evaluate the impact of the Empathic Companion by measuring users' skin conductance and heart rate. While an overall positive effect of the Empathic Companion could not be shown, the outcome of the experiment su...
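
As a rough illustration of the "interprets them as emotions" step, the sketch below maps baseline-normalized skin conductance (read here as an arousal cue) and electromyography (read here as a valence cue) onto coarse emotion labels. The thresholds, label names, and normalization scheme are illustrative assumptions for exposition only, not the decision model actually used by the Empathic Companion.

```python
# Minimal sketch (assumed thresholds and labels, not the system's actual model):
# classify a physiological reading on a coarse arousal-valence grid.

from dataclasses import dataclass


@dataclass
class Baseline:
    skin_conductance: float  # microsiemens, resting average
    emg: float               # microvolts, resting average


def interpret_emotion(skin_conductance: float, emg: float, base: Baseline) -> str:
    """Map one reading to a coarse emotion label.

    Skin conductance above baseline is read as high arousal;
    EMG (muscle tension) above baseline is read as negative valence.
    """
    aroused = skin_conductance > 1.2 * base.skin_conductance  # hypothetical threshold
    negative = emg > 1.2 * base.emg                            # hypothetical threshold

    if aroused and negative:
        return "frustrated"   # high arousal, negative valence
    if aroused:
        return "engaged"      # high arousal, non-negative valence
    if negative:
        return "displeased"   # low arousal, negative valence
    return "relaxed"          # low arousal, non-negative valence


if __name__ == "__main__":
    base = Baseline(skin_conductance=2.0, emg=5.0)
    print(interpret_emotion(skin_conductance=3.1, emg=4.8, base=base))  # -> "engaged"
```

In a real-time setting, such a classification would be computed over short signal windows and handed to the agent's feedback component, which selects an empathic response; the windowing and response selection are likewise beyond this simplified sketch.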