This paper proposes an efficient and robust background compensation method for pan-tilt-zoom cameras. The proposed method approximates the relation between consecutive images by a three-parameter similarity transformation that is separable into the horizontal and vertical axes, and extracts and matches 1-D features, namely the local minima and maxima of the intensity projection profiles along each axis. These correspondences are used to estimate the transformation parameters with an outlier rejection scheme that remains efficient even when a large fraction of the correspondences are outliers. Experimental results show that, compared with previous methods, the proposed method is more robust to blurring and to the proportion of the image area occupied by moving objects, while dramatically reducing computational cost.
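
The following is a minimal sketch of the projection-profile idea, not the paper's implementation: it computes horizontal and vertical intensity projection profiles, takes their local extrema as 1-D features, and fits a 1-D scale-plus-translation model per axis. The exhaustive hypothesize-and-verify loop is an assumed stand-in for the paper's outlier rejection scheme, and all function names are hypothetical.

```python
import numpy as np

def projection_profiles(image):
    """Column-wise and row-wise mean intensity profiles of a grayscale image."""
    return image.mean(axis=0), image.mean(axis=1)  # horizontal, vertical profiles

def local_extrema(profile):
    """Indices of strict local minima and maxima of a 1-D profile (interior points only)."""
    d_prev = profile[1:-1] - profile[:-2]
    d_next = profile[1:-1] - profile[2:]
    return np.nonzero(d_prev * d_next > 0)[0] + 1  # same sign => peak or valley

def fit_axis(x_ref, x_cur, tol=2.0):
    """Robustly estimate x_cur ~ s * x_ref + t from two 1-D feature sets by
    exhaustively hypothesizing correspondences and keeping the model with the
    most inliers (a generic consensus scheme, not the paper's specific one)."""
    best, best_inliers = (1.0, 0.0), -1
    for i in range(len(x_ref)):
        for j in range(i + 1, len(x_ref)):
            for k in range(len(x_cur)):
                for l in range(len(x_cur)):
                    if k == l:
                        continue
                    s = (x_cur[l] - x_cur[k]) / (x_ref[j] - x_ref[i])
                    if s <= 0:  # PTZ zoom cannot mirror the image
                        continue
                    t = x_cur[k] - s * x_ref[i]
                    pred = s * x_ref + t
                    # count reference features that land near some current feature
                    inliers = np.sum(np.min(np.abs(pred[:, None] - x_cur[None, :]), axis=1) < tol)
                    if inliers > best_inliers:
                        best, best_inliers = (s, t), inliers
    return best

# Example usage on two synthetic frames related by a small horizontal shift:
ref = np.tile(np.sin(np.linspace(0, 8 * np.pi, 320)), (240, 1))
cur = np.roll(ref, 5, axis=1)  # crude stand-in for a 5-pixel pan
h_ref, _ = projection_profiles(ref)
h_cur, _ = projection_profiles(cur)
s, t = fit_axis(local_extrema(h_ref).astype(float), local_extrema(h_cur).astype(float))
print(f"estimated scale ~ {s:.2f}, translation ~ {t:.1f} px")
```

Because the model is separable, the same 1-D fit would be applied independently to the vertical profiles; in this sketch the shared scale of the two axes is not enforced, whereas the three-parameter similarity of the paper couples them.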