The most expressive way humans display emotions is through facial expressions. Humans detect and interpret faces and facial expressions in a scene with little or no effort, yet developing an automated system that accomplishes this task is rather difficult. Several related problems are involved: detecting an image segment as a face, extracting and tracking facial features, extracting facial expression information, and classifying the expression (e.g., into emotion categories). In this paper, we present our fully integrated system, which performs these operations accurately and in real time and represents a major step toward our aim of achieving humanlike interaction between man and machine.
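To make the pipeline concrete, the minimal sketch below outlines the four stages as successive operations on a video frame. All names used here (Face, detect_face, track_features, extract_expression_info, classify_expression, process_frame) are hypothetical placeholders introduced only for illustration, and the stage bodies are left as stubs; the paper itself does not prescribe this interface.

```python
# Illustrative sketch of the four-stage pipeline described above.
# Names and signatures are hypothetical; stage bodies are stubs.
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class Face:
    bounding_box: Tuple[int, int, int, int]   # (x, y, width, height) of the detected face region
    landmarks: List[Tuple[int, int]]          # tracked facial feature points


def detect_face(frame) -> Optional[Face]:
    """Stage 1: detect an image segment containing a face."""
    ...


def track_features(face: Face, frame) -> Face:
    """Stage 2: extract and track facial feature points over time."""
    ...


def extract_expression_info(face: Face) -> dict:
    """Stage 3: derive facial expression information from the tracked features."""
    ...


def classify_expression(info: dict) -> str:
    """Stage 4: classify the expression, e.g., into an emotion category."""
    ...


def process_frame(frame) -> Optional[str]:
    """Run the full pipeline on a single video frame."""
    face = detect_face(frame)
    if face is None:
        return None
    face = track_features(face, frame)
    info = extract_expression_info(face)
    return classify_expression(info)
```

In a real-time setting, process_frame would be invoked on each incoming camera frame, with the tracking stage carrying state between frames so that features need not be re-detected from scratch every time.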