The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLII-1, 93–99, 2018

26 Sep 2018


J. C. K. Chow^1,2,3, I. Detchev^4, K. D. Ang^3,5, K. Morin^6, K. Mahadevan^7, and N. Louie^3
  • 1Department of Medicine, Cumming School of Medicine, University of Calgary, Calgary, Alberta, Canada
  • 2School of Earth and Planetary Sciences, Faculty of Science and Engineering, Curtin University, Perth, WA, Australia
  • 3Department of Research and Development, Vusion Technologies, Calgary, Alberta, Canada
  • 4Department of Geomatics Engineering, Schulich School of Engineering, University of Calgary, Calgary, Alberta, Canada
  • 5Department of Computer Science, Faculty of Science, University of Calgary, Calgary, Alberta, Canada
  • 6Leica Geosystems, Heerbrugg, Canton of St. Gallen, Switzerland
  • 7Department of Electrical and Computer Engineering, Faculty of Engineering, University of Alberta, Edmonton, Alberta, Canada

Keywords: Robot Vision, Omnidirectional Camera, Fisheye Lens, Mobile Robotics, UAV, Calibration, Machine Learning

Abstract. Visual perception is regularly used by humans and robots for navigation. By either implicitly or explicitly mapping the environment, ego-motion can be determined and a path of actions can be planned. The processes of mapping and navigation are delicately intertwined; therefore, improving one can often lead to an improvement of the other. Both processes are sensitive to the interior orientation parameters of the camera system, and mathematically modelling these systematic errors can often improve the precision and accuracy of the overall solution. This paper presents an automatic camera calibration method suitable for any lens, without prior knowledge about the sensor. Statistical inference is performed to map the environment and localize the camera simultaneously. K-nearest neighbour regression is used to model the geometric distortions of the images. A normal-angle lens Nikon camera and a wide-angle lens GoPro camera were calibrated using the proposed method, as well as the conventional bundle adjustment with self-calibration method (for comparison). Results showed that the mapping error was reduced from an average of 14.9 mm to 1.2 mm (i.e. a 92 % improvement) and from 66.6 mm to 1.5 mm (i.e. a 98 % improvement) using the proposed method for the Nikon and GoPro cameras, respectively. In contrast, the conventional approach achieved an average 3D error of 0.9 mm (i.e. a 94 % improvement) and 6 mm (i.e. a 91 % improvement) for the Nikon and GoPro cameras, respectively. Thus, the proposed method performs more consistently, irrespective of the lens/sensor used: it yields results that are comparable to the conventional approach for normal-angle lens cameras, and it has the additional benefit of improving calibration results for wide-angle lens cameras.
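The abstract's core idea of modelling lens distortion non-parametrically with k-nearest neighbour regression can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: it assumes a set of calibration image points with known residual distortions, and predicts the distortion at a new image location by averaging the residuals of the k closest calibration points (the paper's choice of k, distance weighting, and feature space are not specified here).

```python
import numpy as np

def knn_distortion(train_xy, train_residuals, query_xy, k=5):
    """Predict 2D image-distortion corrections by k-nearest-neighbour
    regression.

    train_xy        : (N, 2) image coordinates of calibration points
    train_residuals : (N, 2) measured distortion residuals at those points
    query_xy        : (M, 2) image coordinates to correct
    Returns an (M, 2) array of predicted residuals (unweighted mean of
    the k nearest calibration points).
    """
    preds = np.empty((len(query_xy), train_residuals.shape[1]))
    for i, q in enumerate(query_xy):
        # Euclidean distance from the query point to every calibration point
        d = np.linalg.norm(train_xy - q, axis=1)
        # Indices of the k nearest neighbours
        idx = np.argsort(d)[:k]
        preds[i] = train_residuals[idx].mean(axis=0)
    return preds

# Hypothetical usage: a synthetic linear "distortion" field over the image plane
rng = np.random.default_rng(0)
train = rng.uniform(-1.0, 1.0, size=(2000, 2))
residuals = 0.1 * train  # toy distortion: residual grows with distance from centre
correction = knn_distortion(train, residuals, np.array([[0.5, 0.5]]), k=10)
```

Because the regression is purely data-driven, the same code applies unchanged to a normal-angle or a fisheye lens, which mirrors the paper's claim that the method needs no prior lens model.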