Volume XLII-2/W6
Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLII-2/W6, 13-19, 2017
https://doi.org/10.5194/isprs-archives-XLII-2-W6-13-2017
© Author(s) 2017. This work is distributed under
the Creative Commons Attribution 4.0 License.

23 Aug 2017

IMPLEMENTATION OF A REAL-TIME STACKING ALGORITHM IN A PHOTOGRAMMETRIC DIGITAL CAMERA FOR UAVS

A. Audi1,2, M. Pierrot-Deseilligny1,3, C. Meynard1, and C. Thom1
  • 1IGN, LaSTIG, LOEMI, 73 Avenue de Paris, 94160 Saint-Mandé, France
  • 2Université Paris-Est, 6-8 Avenue Blaise Pascal, 77420 Champs-sur-Marne, France
  • 3ENSG, 6-8 Avenue Blaise Pascal, 77420 Champs-sur-Marne, France

Keywords: UAVs, image stacking, real-time, hardware/software co-design, FPGA, image processing

Abstract. In recent years, unmanned aerial vehicles (UAVs) have become an interesting tool for aerial photography and photogrammetry activities. In this context, some applications (such as cloudy-sky surveys, narrow-spectral imagery and night-vision imagery) require a long exposure time, where one of the main problems is the motion blur caused by the erratic camera movements during image acquisition. This paper describes an automatic real-time stacking algorithm which produces a final composite image of high photogrammetric quality, with an equivalent long exposure time, from several images acquired with short exposure times.

Our method is inspired by feature-based image registration techniques. The algorithm is implemented on the light-weight IGN camera, which has an IMU sensor and a SoC/FPGA. To obtain the correct parameters for the resampling of images, the presented method accurately estimates the geometrical relation between the first and the Nth image, taking into account the internal parameters and the distortion of the camera. Features are detected in the first image by the FAST detector, then homologous points in the other images are obtained by template matching aided by the IMU sensor. The SoC/FPGA in the camera is used to speed up time-consuming parts of the algorithm, such as feature detection and image resampling, in order to achieve real-time performance, since we want to write only the resulting final image to save bandwidth on the storage device. The paper includes a detailed description of the implemented algorithm, a resource usage summary, the resulting processing times, the resulting images, as well as block diagrams of the described architecture. The stacked images obtained on real surveys show no visible degradation. Timing results demonstrate that our algorithm can be used in real time, since its processing time is less than the time needed to write an image to the storage device. An interesting by-product of this algorithm is the 3D rotation between poses, estimated by a photogrammetric method, which can be used to recalibrate the gyrometers of the IMU in real time.
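The registration-and-stacking loop summarized above can be illustrated with a deliberately simplified, translation-only sketch in pure Python. This is not the paper's implementation: the camera estimates a full homography (from the 3D rotation, the internal parameters and the distortion) and resamples on the FPGA, whereas here the geometry is reduced to a 2D pixel shift, images are plain nested lists, and all function names are illustrative assumptions.

```python
# Translation-only sketch of feature-based registration and stacking.
# Assumptions (not from the paper): images are lists of rows of grey
# values, and inter-frame motion is a pure 2D shift.

def ssd(a, b):
    """Sum of squared differences between two equal-size patches."""
    return sum((pa - pb) ** 2
               for ra, rb in zip(a, b)
               for pa, pb in zip(ra, rb))

def match_shift(ref, img, y, x, size, predicted=(0, 0), radius=2):
    """Template matching: locate the patch of `ref` at (y, x) inside `img`.

    `predicted` plays the role of the IMU-predicted offset: the SSD
    search is restricted to a small window around it, as in the paper.
    Returns the (dy, dx) shift with the lowest matching cost.
    """
    ref_patch = [row[x:x + size] for row in ref[y:y + size]]
    best_cost, best = None, predicted
    py, px = predicted
    for dy in range(py - radius, py + radius + 1):
        for dx in range(px - radius, px + radius + 1):
            patch = [row[x + dx:x + dx + size]
                     for row in img[y + dy:y + dy + size]]
            cost = ssd(ref_patch, patch)
            if best_cost is None or cost < best_cost:
                best_cost, best = cost, (dy, dx)
    return best

def stack(frames, shifts, h, w):
    """Shift every frame back onto the first one and average them."""
    out = [[0.0] * w for _ in range(h)]
    for img, (dy, dx) in zip(frames, shifts):
        for yy in range(h):
            for xx in range(w):
                out[yy][xx] += img[yy + dy][xx + dx]
    n = len(frames)
    return [[v / n for v in row] for row in out]
```

In the actual camera, the per-feature shifts instead feed the estimation of a homography between poses, and the resampling and accumulation run on the FPGA so that only the final averaged image is written to storage.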