In this paper we present an approach to lip-motion analysis that can be used in conjunction with a face-recognition-based person authentication system to prevent spoofing attacks that use static photographs. This work focuses on robustly tracking lips in grayscale images, which may be captured in the visible-light or near-infrared spectrum. We first present a method for locating the two lip corners in a face image, and then extract suitable features from the mouth region to classify mouth states (visemes). The system achieves a classification accuracy above 85%. The temporal changes in the detected viseme classes can then be used to detect an impostor.
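The liveness-check idea above can be sketched very simply: a static photograph produces an (almost) constant viseme sequence over time, whereas a live speaker produces frequent mouth-state changes. The following minimal Python sketch illustrates this, not the paper's actual implementation; the viseme labels and the transition threshold `min_transitions` are hypothetical choices for illustration only.

```python
# Minimal sketch (assumed, not the paper's implementation): liveness check
# based on temporal changes in a sequence of classified viseme labels.
# A static photograph yields a nearly constant sequence, so we count
# frame-to-frame label transitions and require a minimum count.

def count_viseme_transitions(visemes):
    """Number of frame-to-frame changes in the viseme label sequence."""
    return sum(1 for a, b in zip(visemes, visemes[1:]) if a != b)

def is_live(visemes, min_transitions=3):
    """Accept as live only if the mouth changes state often enough.

    min_transitions is a hypothetical threshold chosen for illustration.
    """
    return count_viseme_transitions(visemes) >= min_transitions

# A photo attack: the same viseme in every frame.
photo_attack = ["closed"] * 30
# A live speaker: the viseme class varies as the mouth moves.
live_speech = ["closed", "open", "rounded", "open", "closed", "open"]
```

In practice the per-frame viseme labels would come from the classifier described in the paper; the sketch only shows how their temporal variation separates a static photograph from a live face.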