Abstract—In this paper, we present the analysis and experimental validation of a vision-aided inertial navigation algorithm for planetary landing applications. The system employs tight integration of inertial and visual feature measurements to compute accurate estimates of the lander’s terrain-relative position, attitude, and velocity in real time. Two types of features are considered: mapped landmarks, i.e., features whose global 3D positions can be determined from a surface map, and opportunistic features, i.e., features that can be tracked in consecutive images, but whose 3D positions are not known. Both types of features are processed in an extended Kalman filter (EKF) estimator and are optimally fused with measurements from an inertial measurement unit (IMU). Results from a sounding rocket test, covering the dynamic profile of typical planetary landing scenarios, show estimation errors of magnitude 0.16 m/s in velocity and 6.4 m in position at touchdown. These results vastly...
Anastasios I. Mourikis, Nikolas Trawny, Stergios I. Roumeliotis
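To make the fusion idea in the abstract concrete, below is a minimal sketch (not the paper's implementation) of an EKF that propagates the lander state with IMU data and corrects it using a mapped landmark whose global 3D position is known. Attitude and bias estimation, opportunistic-feature tracking, and the actual camera projection model are all omitted; every function and variable name here is illustrative.

```python
import numpy as np

def propagate(x, P, accel, dt, Q):
    """IMU propagation step for a simplified state x = [position(3), velocity(3)]."""
    F = np.eye(6)
    F[0:3, 3:6] = dt * np.eye(3)      # position integrates velocity
    x = F @ x
    x[0:3] += 0.5 * dt**2 * accel      # constant-acceleration integration
    x[3:6] += dt * accel
    P = F @ P @ F.T + Q                # covariance propagation
    return x, P

def update_mapped_landmark(x, P, z, landmark_global, R):
    """EKF update with a relative-position measurement of a mapped landmark:
    z = p_landmark - p_lander + noise, hence H = [-I  0]."""
    H = np.hstack([-np.eye(3), np.zeros((3, 3))])
    r = z - (landmark_global - x[0:3])         # measurement residual
    S = H @ P @ H.T + R                        # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    x = x + K @ r
    P = (np.eye(6) - K @ H) @ P
    return x, P

# Toy usage: one IMU propagation step followed by one landmark correction.
x, P = np.zeros(6), np.eye(6)
x, P = propagate(x, P, accel=np.array([0.0, 0.0, -3.7]), dt=0.01,
                 Q=1e-4 * np.eye(6))
x, P = update_mapped_landmark(x, P, z=np.array([100.0, 50.0, 200.0]),
                              landmark_global=np.array([100.0, 50.0, 199.9]),
                              R=0.25 * np.eye(3))
```

In the paper's actual system the measurement model is the camera projection of the landmark, the state includes attitude and IMU biases, and opportunistic features constrain motion between consecutive images; this sketch only shows the generic propagate/update structure into which those measurements are fused.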