In this paper we present a simple but effective method for matching two uncalibrated images. Feature points are first extracted from each image using a fast multiscale corner detector, and each feature point is assigned a dominant orientation. Correspondences between feature points are then established using a multilevel matching strategy. Normalized cross-correlation serves as the similarity measure between two feature points during matching, and the correlation window is aligned with the dominant orientation of the feature point to achieve rotation invariance. Experimental results on real images demonstrate that our method is effective for matching two images with large rotations and significant scale changes.
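
As a rough illustration of the rotation-compensated correlation step described above, the following Python/NumPy sketch (not from the paper; the function names `oriented_patch` and `ncc` and the window size are illustrative assumptions) samples a correlation window aligned with each feature point's dominant orientation and compares two such windows with normalized cross-correlation.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def oriented_patch(image, center, angle, size=15):
    """Sample a square window around `center` (row, col), with the sampling
    grid rotated by `angle` (radians) so the patch is expressed in the
    feature point's local orientation frame."""
    half = size // 2
    ys, xs = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    c, s = np.cos(angle), np.sin(angle)
    # Rotate the sampling grid by the dominant orientation before reading pixels.
    rx = c * xs - s * ys + center[1]
    ry = s * xs + c * ys + center[0]
    return map_coordinates(image, [ry, rx], order=1, mode='nearest')

def ncc(patch_a, patch_b):
    """Normalized cross-correlation between two equally sized patches."""
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

# Usage (hypothetical inputs): compare two feature points p1, p2 given their
# dominant orientations theta1, theta2 in images img1, img2.
# score = ncc(oriented_patch(img1, p1, theta1), oriented_patch(img2, p2, theta2))
```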