This paper describes an attempt to reveal the user's intention from dialogue acts, thereby improving the effectiveness of natural interfaces to pedagogical agents. It focuses on cases where the intention is unclear from the dialogue context or utterance structure, but where it may still be identified from the user's emotional state. The recognition of emotions is based on physiological user input. Our initial user study gave promising results that support our hypothesis that physiological evidence of emotions could be used to disambiguate dialogue acts. This paper presents our approach to the integration of natural language and emotions, as well as our first empirical results, which may be used to endow interactive agents with emotional capabilities.

Categories and Subject Descriptors
H.5.2 [Information Systems]: Information Interfaces and Presentation—User Interfaces; I.2.7 [Computing Methodologies]: Artificial Intelligence—Natural Language Processing

Genera...