Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XL-1/W2, 223-228, 2013
https://doi.org/10.5194/isprsarchives-XL-1-W2-223-2013
© Author(s) 2013. This work is distributed under
the Creative Commons Attribution 3.0 License.
 
16 Aug 2013
IMPROVED UAV-BORNE 3D MAPPING BY FUSING OPTICAL AND LASERSCANNER DATA
B. Jutzi1, M. Weinmann1, and J. Meidow2
1Institute of Photogrammetry and Remote Sensing, Karlsruhe Institute of Technology (KIT), Germany
2Fraunhofer Institute of Optronics, System Technologies and Image Exploitation (IOSB), Germany
Keywords: UAV, multi-pulse laserscanning, sensor calibration, self-localization, data fusion

Abstract. In this paper, a new method for fusing optical and laserscanner data is presented for improved UAV-borne 3D mapping. We propose to equip an unmanned aerial vehicle (UAV) with a small platform carrying two sensors: a standard low-cost digital camera and a lightweight Hokuyo UTM-30LX-EW laserscanning device (210 g without cable). Initially, a calibration of the devices is carried out. This involves a geometric camera calibration and the estimation of the position and orientation offset between the two sensors by lever-arm and bore-sight calibration. Subsequently, feature tracking is performed through the image sequence by considering extracted interest points as well as the projected 3D laser points. These 2D results are fused with the measured laser distances and fed into a bundle adjustment in order to achieve Simultaneous Localization and Mapping (SLAM). It is demonstrated that fusing optical and laserscanner data improves the precision of the pose estimation.
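The projection of 3D laser points into the camera image, as described above, relies on the calibrated lever-arm offset and bore-sight rotation between the two sensors. A minimal sketch of this step is given below; the function name, the intrinsic matrix, and all numeric values are illustrative assumptions, not values from the paper.

```python
import numpy as np

def project_laser_point(p_laser, R_bs, t_la, K):
    """Project a 3D point measured in the laserscanner frame into the image.

    p_laser : 3-vector, point in the scanner frame
    R_bs    : 3x3 bore-sight rotation (scanner -> camera), from calibration
    t_la    : 3-vector lever-arm offset (scanner origin in the camera frame)
    K       : 3x3 camera intrinsic matrix, from geometric camera calibration
    """
    p_cam = R_bs @ p_laser + t_la   # rigid-body transform into the camera frame
    u = K @ p_cam                   # pinhole projection
    return u[:2] / u[2]             # normalize homogeneous coords to pixels

# Toy example (illustrative values only)
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R_bs = np.eye(3)                    # identity bore-sight rotation for this toy case
t_la = np.array([0.05, 0.0, 0.0])   # 5 cm lever arm along the camera x-axis
p_laser = np.array([1.0, 0.0, 4.0]) # point 4 m in front of the scanner
print(project_laser_point(p_laser, R_bs, t_la, K))  # -> [530. 240.]
```

The resulting 2D positions can then be tracked alongside image interest points and, together with the measured laser distances, fed into the bundle adjustment.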


Citation: Jutzi, B., Weinmann, M., and Meidow, J.: IMPROVED UAV-BORNE 3D MAPPING BY FUSING OPTICAL AND LASERSCANNER DATA, Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XL-1/W2, 223-228, https://doi.org/10.5194/isprsarchives-XL-1-W2-223-2013, 2013.
