IROS 2009 (IEEE)

Self-location from monocular uncalibrated vision using reference omniviews

Abstract — In this paper we present a novel approach to indoor self-localization using reference omnidirectional images. We require only one omnidirectional image of the whole scene, stored in the robot's memory, and a conventional uncalibrated on-board camera. We match the omnidirectional image against the conventional images captured by the on-board camera and compute the hybrid epipolar geometry using lifted coordinates and robust techniques. We then map the epipole in the reference omnidirectional image to the ground plane through a homography, also expressed in lifted coordinates, which yields the position of the robot on the ground plane together with its uncertainty. We perform experiments with simulated and real data to show the feasibility of this new self-localization approach.
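The abstract's central tool is the lifting of image points to higher-dimensional coordinates so that the mixed omnidirectional/perspective ("hybrid") epipolar constraint becomes linear and can be estimated from point matches. The sketch below illustrates this idea under stated assumptions: a 6D Veronese lifting of homogeneous points and a 6×3 hybrid fundamental matrix F satisfying lift(q)ᵀ F p = 0, estimated by a plain DLT on synthetic correspondences. The names `lift` and `estimate_hybrid_F` are illustrative, not from the paper, and the robust (e.g. RANSAC) layer the authors mention is omitted.

```python
import numpy as np

def lift(p):
    """Veronese lifting of a homogeneous image point (x, y, z) to
    6D lifted coordinates (x^2, xy, y^2, xz, yz, z^2)."""
    x, y, z = p
    return np.array([x * x, x * y, y * y, x * z, y * z, z * z])

def estimate_hybrid_F(omni_pts, persp_pts):
    """DLT estimate of a 6x3 hybrid fundamental matrix F from
    correspondences (q, p): each pair gives one linear equation
    lift(q)^T F p = 0 in the 18 entries of F."""
    A = np.array([np.outer(lift(q), p).ravel()
                  for q, p in zip(omni_pts, persp_pts)])
    # The solution is the right singular vector of the smallest
    # singular value (the null space of the stacked constraints).
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(6, 3)

# Synthetic demo: 17 correspondences give 17 equations in 18
# unknowns, so an exact null-space solution exists.
rng = np.random.default_rng(0)
omni = rng.standard_normal((17, 3))
persp = rng.standard_normal((17, 3))
F = estimate_hybrid_F(omni, persp)
residuals = [abs(lift(q) @ F @ p) for q, p in zip(omni, persp)]
```

With real matches the system is overdetermined and contaminated by outliers, which is why the paper wraps this linear core in robust estimation before mapping the resulting epipole to the ground plane.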
Luis Puig, José Jesús Guerrero
Added 24 May 2010
Updated 24 May 2010
Type Conference
Year 2009
Where IROS
Authors Luis Puig, José Jesús Guerrero