The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLIII-B2-2020, 111–118, 2020
https://doi.org/10.5194/isprs-archives-XLIII-B2-2020-111-2020

12 Aug 2020

MINDFLOW BASED DENSE MATCHING BETWEEN TIR AND RGB IMAGES

J. Zhu1, Z. Ye1,2, Y. Xu1, L. Hoegner1, and U. Stilla1
  • 1Photogrammetry and Remote Sensing, Technische Universitaet Muenchen, 80290 Munich, Germany
  • 2College of Surveying and Geo-Informatics, Tongji University, 1239 Siping Road, Shanghai 200092, China

Keywords: TIR image, optical image, image registration, dense matching, MINDflow

Abstract. Image registration is a fundamental task in photogrammetry and remote sensing, which aims to find the alignment between different images. Recently, the registration of images from different sensors has become a hot topic. Registered images from different sensors offer complementary information, which helps with tasks such as segmentation, classification, and even emergency analysis. In this paper, we propose a registration strategy that calculates the dominant orientation difference and then achieves dense alignment of thermal infrared (TIR) and RGB images with MINDflow. Firstly, the orientation difference between TIR and RGB images is calculated by finding the dominant image orientations based on phase congruency. Then, the modality independent neighbourhood descriptor (MIND) combined with a global optical flow algorithm is adopted as MINDflow for dense matching. Our method is tested on image sets containing TIR and RGB images captured separately in the same construction site areas. The results show that it achieves optimal results with significant features preserved, even under dramatic radiometric differences between TIR and RGB images. Compared with other descriptors, our method is more robust and better preserves the features of objects in the images.
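To make the descriptor step concrete, the following is a minimal sketch of the modality independent neighbourhood descriptor (MIND) underlying MINDflow: per-pixel patch distances to a small displacement neighbourhood, converted into an exponential self-similarity and normalised. This is an illustrative NumPy/SciPy implementation, not the authors' code; the function name, the 4-neighbourhood search region, and the parameters `radius` and the variance clamp are assumptions for the sketch.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def mind_descriptor(img, radius=2):
    """Illustrative MIND sketch: for each pixel, compare the local patch
    with patches at a few small displacements, and map the patch
    distances through exp(-D / V) with a local variance estimate V."""
    img = img.astype(np.float64)
    # Minimal 4-neighbourhood of displacements (a common search region)
    offsets = [(0, 1), (0, -1), (1, 0), (-1, 0)]
    size = 2 * radius + 1
    dists = []
    for dy, dx in offsets:
        shifted = np.roll(img, shift=(dy, dx), axis=(0, 1))
        # Patch sum-of-squared-differences via a box filter
        dists.append(uniform_filter((img - shifted) ** 2, size=size))
    dists = np.stack(dists, axis=-1)            # (H, W, 4)
    # Local variance estimate: mean patch distance, clamped for stability
    v = np.clip(dists.mean(axis=-1, keepdims=True), 1e-6, None)
    mind = np.exp(-dists / v)
    # Normalise so the largest component per pixel equals 1
    mind /= mind.max(axis=-1, keepdims=True)
    return mind
```

Because MIND encodes local self-similarity rather than raw intensity, descriptors computed on a TIR image and an RGB image of the same scene can be compared directly, which is what allows the subsequent global optical flow optimisation to match across the large radiometric gap.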