The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Articles | Volume XLIII-B2-2021
Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLIII-B2-2021, 511–518, 2021
https://doi.org/10.5194/isprs-archives-XLIII-B2-2021-511-2021

28 Jun 2021

REVIEW ON PHOTOGRAMMETRIC SURFACE INSPECTION IN AUTOMOTIVE PRODUCTION

M. Hödel, L. Hoegner, and U. Stilla
  • Photogrammetry and Remote Sensing, Technical University of Munich, Germany

Keywords: Automotive, Surface Inspection, Surface Defect Detection, Photogrammetry, Machine Vision, Deep Learning

Abstract. When purchasing a premium car for a substantial sum, first impressions count. Key to that first impression is a flawless exterior appearance, something self-evident to the customer, but a far greater challenge for production than one might initially assume. Fortunately, photogrammetric technologies and evaluation methods are enabling an ever greater degree of oversight in the form of comprehensive quality data at different automotive production stages, namely stamping, welding, painting and finishing. A drawback lies in the challenging production environment, which complicates the inline integration of certain technologies. In recent years, machine vision and deep learning have been applied to photogrammetric surface inspection with ever increasing success. Given comprehensive surface quality information throughout the entire production chain, production parameters can be tuned ever more tightly in a data-driven fashion, leading to a sustainable increase in quality. This paper provides a review of current and potential contributions of photogrammetry to this end, discussing several recent advances in research along the way. Particular emphasis is placed on early production stages, as well as on the application of machine vision and deep learning to this challenging task. An outline of further research conducted by the authors concludes this paper.