In this paper, we present a novel technique for estimating large camera displacements using stereo images. The relative transformation between two stereo image pairs is estimated with a hybrid registration algorithm that combines the robustness of multi-scale feature tracking for large movements with the accuracy of 3D normal flow constraints. Our hybrid technique takes advantage of the depth information available from the stereo camera, which makes it less sensitive to lighting variations. We tested the accuracy of our hybrid algorithm on real stereo sequences and showed that our technique handles displacements of up to 150 cm and rotations of up to 20 degrees between images. Our algorithm runs
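
The sketch below illustrates, under our own assumptions, the kind of coarse alignment stage the abstract describes: features are tracked between frames with a multi-scale (pyramidal) KLT tracker, lifted to 3D using the stereo depth map, and aligned with a closed-form rigid-body fit. The authors' 3D normal flow refinement is not reproduced here, and the function names, parameters, and use of OpenCV are illustrative assumptions rather than the paper's implementation.

```python
# Illustrative sketch only: coarse frame-to-frame rigid-motion estimate from
# tracked stereo features. The paper's 3D normal flow refinement is not shown;
# names, parameters, and the use of OpenCV are assumptions.
import cv2
import numpy as np

def backproject(pts, depth, fx, fy, cx, cy):
    """Lift 2D pixel coordinates to 3D camera coordinates using the depth map."""
    u, v = pts[:, 0], pts[:, 1]
    z = depth[v.astype(int), u.astype(int)]
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=1)

def coarse_stereo_registration(img0, img1, depth0, depth1, fx, fy, cx, cy):
    """Track features with a pyramidal KLT tracker, then estimate the rigid
    transform (R, t) between the resulting 3D point sets via SVD (Kabsch)."""
    # Multi-scale feature tracking: robust to large image motion.
    p0 = cv2.goodFeaturesToTrack(img0, maxCorners=500, qualityLevel=0.01,
                                 minDistance=7)
    p1, status, _ = cv2.calcOpticalFlowPyrLK(img0, img1, p0, None,
                                             winSize=(21, 21), maxLevel=4)
    ok = status.ravel() == 1
    q0, q1 = p0[ok].reshape(-1, 2), p1[ok].reshape(-1, 2)

    # Discard tracks that left the image, then keep points with valid depth.
    h, w = img1.shape[:2]
    inb = (q1[:, 0] >= 0) & (q1[:, 0] < w) & (q1[:, 1] >= 0) & (q1[:, 1] < h)
    q0, q1 = q0[inb], q1[inb]
    P = backproject(q0, depth0, fx, fy, cx, cy)
    Q = backproject(q1, depth1, fx, fy, cx, cy)
    valid = (P[:, 2] > 0) & (Q[:, 2] > 0)
    P, Q = P[valid], Q[valid]

    # Kabsch alignment: R, t minimizing ||R P + t - Q||^2.
    cP, cQ = P.mean(0), Q.mean(0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # correct a possible reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cQ - R @ cP
    return R, t
```

In this sketch, the coarse estimate would then serve as the initialization for a refinement step such as the 3D normal flow constraints mentioned above.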