
GI 2009, Springer

A Framework for Multiple Radar and Multiple 2D/3D Camera Fusion

Abstract: In this paper we present a framework for the fusion of radar and image information. In the case considered here, we combine information from multiple close-range radars into a single fused radar measurement using the overlap regions of the individual radars. This step is performed automatically with a feature-based matching technique. Additionally, we use multiple 2D/3D cameras that provide (color) image and distance information. We show how to fuse these heterogeneous sensors in the context of airport runway surveillance. A possible application is the automatic detection of small objects (e.g. screws) on airfields. We outline how to build an adaptive background model of the runway scene from the fused sensor information; unwanted objects on the airfield can then be detected by change detection.
Marek Schikora, Benedikt Romba
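
The detection step described in the abstract (an adaptive background model over the fused measurements, followed by change detection) can be illustrated with a minimal sketch. The code below is not the authors' implementation: it assumes the fused radar/camera data has already been resampled onto a common 2D grid, and the names and parameters (update_background, detect_changes, ALPHA, THRESHOLD) are illustrative placeholders.

# Sketch only: running-average background model with thresholded change detection
# on a fused measurement grid. Parameter values are assumptions, not from the paper.
import numpy as np

ALPHA = 0.05      # assumed learning rate for the running-average background
THRESHOLD = 0.2   # assumed change threshold (normalized measurement units)

def update_background(background: np.ndarray, frame: np.ndarray) -> np.ndarray:
    # Exponential moving average: slowly adapt the background to the scene.
    return (1.0 - ALPHA) * background + ALPHA * frame

def detect_changes(background: np.ndarray, frame: np.ndarray) -> np.ndarray:
    # Flag grid cells whose deviation from the background exceeds the threshold.
    return np.abs(frame - background) > THRESHOLD

# Toy usage: a stream of fused measurement grids (e.g. 200 x 200 cells).
rng = np.random.default_rng(0)
background = rng.random((200, 200))
for _ in range(10):
    frame = background + 0.01 * rng.standard_normal((200, 200))
    frame[50, 60] += 0.5                       # simulate a small object appearing
    mask = detect_changes(background, frame)   # candidate object cells
    background = update_background(background, frame)

A practical system would likely also gate the background update so that cells flagged as objects are not gradually absorbed into the background model.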