Abstract—In this paper, we present a novel approach to robust visual servoing. Our method removes the feature tracking step from the typical visual servoing pipeline: no feature correspondences are required to derive the control signal. This is achieved by modeling the image features as a Mixture of Gaussians in both the current and desired images. Using Lyapunov theory, a control signal is derived that minimizes a distance function between the two Gaussian mixtures. The distance function has a closed form, and its gradient can be computed efficiently and used to control the system. For simplicity, we first consider the 2D motion case. The general case is then obtained by introducing the depth distribution of the features, allowing control of all six degrees of freedom. Experiments conducted within a simulation framework validate the proposed method.
A. H. Abdul Hafez, Supreeth Achar, C. V. Jawahar
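As a rough illustration of a closed-form distance between two feature mixtures, the sketch below computes an L2 distance between Gaussian mixtures built from the current and desired feature sets. The choice of L2 distance, the isotropic covariances, and the names `gauss_integral` and `gmm_l2_distance` are assumptions made for this example only; they are not taken from the paper.

```python
import numpy as np

def gauss_integral(mu1, cov1, mu2, cov2):
    """Closed-form integral of a product of two Gaussians:
    int N(x; mu1, cov1) * N(x; mu2, cov2) dx = N(mu1; mu2, cov1 + cov2)."""
    d = mu1 - mu2
    cov = cov1 + cov2
    dim = len(mu1)
    norm = 1.0 / np.sqrt(((2.0 * np.pi) ** dim) * np.linalg.det(cov))
    return norm * np.exp(-0.5 * d @ np.linalg.solve(cov, d))

def gmm_l2_distance(mus_a, covs_a, mus_b, covs_b, w_a=None, w_b=None):
    """Squared L2 distance between two Gaussian mixtures f and g,
    D = int (f - g)^2 dx, expanded into pairwise Gaussian integrals."""
    n, m = len(mus_a), len(mus_b)
    w_a = np.full(n, 1.0 / n) if w_a is None else w_a
    w_b = np.full(m, 1.0 / m) if w_b is None else w_b
    dist = 0.0
    # f*f term
    for i in range(n):
        for j in range(n):
            dist += w_a[i] * w_a[j] * gauss_integral(mus_a[i], covs_a[i], mus_a[j], covs_a[j])
    # g*g term
    for i in range(m):
        for j in range(m):
            dist += w_b[i] * w_b[j] * gauss_integral(mus_b[i], covs_b[i], mus_b[j], covs_b[j])
    # cross term
    for i in range(n):
        for j in range(m):
            dist -= 2.0 * w_a[i] * w_b[j] * gauss_integral(mus_a[i], covs_a[i], mus_b[j], covs_b[j])
    return dist

if __name__ == "__main__":
    # Hypothetical example: each image feature contributes one Gaussian in image space.
    current = np.array([[10.0, 12.0], [40.0, 45.0], [70.0, 20.0]])
    desired = np.array([[12.0, 14.0], [42.0, 44.0], [69.0, 22.0]])
    covs = [np.eye(2) * 4.0 for _ in range(3)]  # fixed isotropic 2x2 covariances
    print(gmm_l2_distance(current, covs, desired, covs))
```

In a servoing loop, the gradient of such a distance with respect to the camera velocity would be evaluated numerically or analytically at each iteration and used as the control signal; the sketch above only shows that the distance itself is available in closed form without feature correspondences.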