Clark F. Olson, Larry Matthies, Marcel Schoppers,

Several methods for computing observer motion from monocular and stereo image sequences have been proposed. However, accurate positioning over long distances requires a higher level of robustness than previously achieved. This paper describes several mechanisms for improving robustness in the context of a maximum-likelihood stereo egomotion method. We demonstrate that even a robust system accumulates error that grows super-linearly with the distance traveled, owing to increasing orientation errors. When an absolute orientation sensor is incorporated, however, the error growth is reduced to linear in the distance traveled, and it grows much more slowly in practice. Our experiments, including a trial with 210 stereo pairs, indicate that these techniques can achieve errors below 1% of the distance traveled. The method has been implemented to run on-board a prototype Mars rover.
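The error-growth claim can be illustrated with a toy dead-reckoning simulation. This is a hypothetical sketch, not the paper's maximum-likelihood estimator: when each step's heading error is integrated (relative orientation only), position drift grows super-linearly with distance; when heading is re-measured against an absolute reference at each step (as with an absolute orientation sensor), the orientation error stays bounded and drift grows roughly linearly. The function name `simulate` and all parameter values are illustrative assumptions.

```python
import math
import random


def simulate(steps, step_len=1.0, heading_sigma=0.01, absolute_heading=False):
    """Toy dead-reckoning model (illustrative only, not the paper's method).

    Drives a robot along a nominally straight path of `steps` unit moves,
    corrupting the heading with Gaussian noise of std `heading_sigma` (rad),
    and returns the final position error relative to the true endpoint.
    """
    random.seed(0)  # fixed seed so both sensor models see identical noise
    x = y = theta = 0.0       # estimated pose
    true_x = true_y = 0.0     # true pose (straight line along +x)
    for _ in range(steps):
        err = random.gauss(0.0, heading_sigma)
        if absolute_heading:
            # Absolute orientation sensor: heading error is re-measured
            # each step and therefore does not accumulate.
            theta = err
        else:
            # Relative (integrated) orientation: heading error is a random
            # walk, so position drift compounds super-linearly.
            theta += err
        x += step_len * math.cos(theta)
        y += step_len * math.sin(theta)
        true_x += step_len
    return math.hypot(x - true_x, y - true_y)


if __name__ == "__main__":
    for n in (500, 1000, 2000):
        rel = simulate(n, absolute_heading=False)
        ab = simulate(n, absolute_heading=True)
        print(f"{n} steps: integrated-heading drift {rel:.2f} m, "
              f"absolute-heading drift {ab:.2f} m")
```

Running the loop shows the integrated-heading drift outpacing the absolute-heading drift by a widening margin as the path lengthens, which is the qualitative behavior the abstract describes.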