The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLII-4/W18, 331–334, 2019

18 Oct 2019


M. Eslami and M. Saadatseresht
  • School of Surveying and Geospatial Engineering, College of Engineering, University of Tehran, Tehran, Iran

Keywords: Fine Calibration, Point Cloud, Photogrammetric Images, Coarse to Fine Registration, Terrestrial Laser Scanner, Close-Range Imagery

Abstract. Laser-scanner point clouds and photogrammetric imagery are complementary data sources for many applications and services. Misalignment between imagery and point cloud data is a common problem that leads to inaccurate products and procedures. In this paper, a novel strategy is proposed for coarse-to-fine registration between close-range imagery and terrestrial laser scanner point cloud data. First, tie points are extracted and matched across the photogrammetric imagery, and preprocessing is applied to eliminate non-robust tie points. Then, for every tie point, two neighboring pixels are selected and matched in all overlapping images. Next, the coarse interior orientation parameters (IOPs) and exterior orientation parameters (EOPs) of the images are used to reconstruct object-space points for each tie point and its two neighboring pixels. The nearest neighbors of these object-space photogrammetric points are then found in the point cloud. These three point-cloud points define a plane and its normal vector. Theoretically, every object-space tie point should lie on this plane, and this condition is used as a constraint equation alongside the collinearity equations to fine-register the photogrammetric image network. The attained root mean square error (RMSE) on check points is less than 2.3 pixels, which demonstrates the accuracy, completeness and robustness of the proposed method.
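The plane condition at the heart of the fine-registration step can be sketched as follows. This is a minimal NumPy illustration with hypothetical coordinates, not the paper's implementation: it fits a plane through three point-cloud points (the nearest neighbors of a reconstructed tie point and its two neighboring pixels) and evaluates the signed point-to-plane residual that the adjustment drives to zero alongside the collinearity equations.

```python
import numpy as np

def plane_from_points(p1, p2, p3):
    """Fit a plane through three non-collinear 3-D points.

    Returns a unit normal n and offset d such that every point x on the
    plane satisfies n . x + d = 0.
    """
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)   # normal = cross product of two edge vectors
    n = n / np.linalg.norm(n)        # normalize (assumes the points are not collinear)
    d = -np.dot(n, p1)
    return n, d

def point_plane_residual(x, n, d):
    """Signed distance of point x from the plane (n, d).

    In the paper's formulation, this residual should vanish for each
    object-space tie point, giving one condition equation per point.
    """
    return float(np.dot(n, np.asarray(x, dtype=float)) + d)

# Hypothetical example: three point-cloud neighbors spanning the z = 0 plane.
n, d = plane_from_points([0, 0, 0], [1, 0, 0], [0, 1, 0])
r_on = point_plane_residual([0.5, 0.5, 0.0], n, d)   # tie point on the plane
r_off = point_plane_residual([0.5, 0.5, 0.2], n, d)  # misaligned tie point
```

In a full adjustment, residuals like `r_off` would be linearized with respect to the image EOPs (and optionally IOPs) and minimized jointly with the collinearity equations.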