In this paper, we investigate the behavior of Gabor responses at automatically located facial feature points for face recognition. In our approach, a set of feature points is selected on facial landmarks, where the features of interest are the eyes, nose, mouth, and the left and right contours of the face. As a preprocessing step, all images in the database are normalized so that the eyes lie at fixed, preset coordinates, and integral projection is then used to automatically locate the remaining landmarks, such as the nose and mouth regions, in the face images. Finally, Gabor responses are extracted at the resulting feature locations. The extracted features are evaluated for recognition performance using a neural network classifier trained with backpropagation, whose input is a similarity vector computed over the corresponding feature points of two faces.
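The two core operations of the pipeline above, integral projection for landmark localization and Gabor response extraction at feature points, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes grayscale, eye-normalized images, uses a real-valued (cosine-carrier) Gabor kernel, and all function names and parameter values are illustrative.

```python
import numpy as np

def gabor_kernel(size=21, wavelength=8.0, theta=0.0, sigma=4.0, gamma=0.5):
    """Real-valued Gabor kernel: Gaussian envelope times cosine carrier.
    Parameter values here are illustrative defaults, not the paper's."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates by theta so the carrier oscillates along x_t.
    x_t = x * np.cos(theta) + y * np.sin(theta)
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_t**2 + (gamma * y_t)**2) / (2.0 * sigma**2))
    carrier = np.cos(2.0 * np.pi * x_t / wavelength)
    return envelope * carrier

def integral_projection_rows(img):
    """Horizontal integral projection: mean intensity of each row.
    Dark bands (eyes, mouth) show up as local minima of this curve."""
    return img.mean(axis=1)

def gabor_response_at(img, point, kernels):
    """Magnitudes of the Gabor responses at one (row, col) feature point,
    one value per kernel; the point must lie at least half a kernel
    away from the image border."""
    r, c = point
    half = kernels[0].shape[0] // 2
    patch = img[r - half:r + half + 1, c - half:c + half + 1]
    return np.array([abs(np.sum(patch * k)) for k in kernels])

# Example: a bank of four orientations applied at a single feature point.
img = np.random.default_rng(0).random((64, 64))  # stand-in for a face image
kernels = [gabor_kernel(theta=t) for t in (0.0, np.pi/4, np.pi/2, 3*np.pi/4)]
response = gabor_response_at(img, (32, 32), kernels)  # one magnitude per kernel
```

In practice, the responses collected at all feature points of two faces would be compared (e.g., by normalized correlation) to form the similarity vector fed to the classifier.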