Most vision-based algorithms for motion and localization estimation require matching interest points across a pair of images. Once feature correspondences are established, camera motion and localization can be estimated using epipolar geometry. However, feature matching remains a challenging problem, for instance because of time constraints or image variability. In several robotic applications, the camera rotation may be known thanks to a gyroscope or another orientation sensor. Therefore, in this paper, we aim to answer the following question: can the knowledge of rotation from a gyroscope be used to improve feature matching? To analyze this new approach to camera and gyroscope data fusion, we proceed in two steps. First, we rotationally align the images using the rotation information from the gyroscope. Second, we compare the quality of feature matching in the original and rotationally aligned images. Experimental results on a real catadioptric sequence show that gyroscope data permit an improvement in feature matching.
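
A minimal sketch of the two-step procedure, assuming a standard perspective camera with known intrinsics `K` (the catadioptric case treated in the paper requires a projection-specific warp instead): a pure inter-frame rotation `R` from the gyroscope induces the infinite homography `H = K R K⁻¹`, which can be used to rotationally align one image to the other before matching. The ORB-plus-ratio-test match count below is only an illustrative quality proxy, not the paper's evaluation metric; `R_gyro` and the image names are hypothetical.

```python
import cv2
import numpy as np

def rotational_align(img, R, K):
    """Warp img by the homography H = K R K^-1 induced by a pure rotation R.

    Assumes a perspective camera with intrinsics K and that R is already
    expressed in the camera frame (camera-IMU calibration applied).
    """
    H = K @ R @ np.linalg.inv(K)
    h, w = img.shape[:2]
    return cv2.warpPerspective(img, H, (w, h))

def match_quality(img1, img2, ratio=0.75):
    """Count ORB matches passing Lowe's ratio test, as a simple proxy
    for feature-matching quality."""
    orb = cv2.ORB_create()
    _, d1 = orb.detectAndCompute(img1, None)
    _, d2 = orb.detectAndCompute(img2, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    good = 0
    for pair in matcher.knnMatch(d1, d2, k=2):
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good += 1
    return good

# Hypothetical usage: compare matching before and after rotational alignment.
# before = match_quality(img_a, img_b)
# after  = match_quality(rotational_align(img_a, R_gyro, K), img_b)
```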