The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLIII-B1-2020, 549–555, 2020
06 Aug 2020


A. Masiero1,2, H. Perakis3, J. Gabela4, C. Toth5, V. Gikas3, G. Retscher6, S. Goel7, A. Kealy8, Z. Koppányi9, W. Błaszczak-Bak10, Y. Li11, and D. Grejner-Brzezinska12
  • 1Dept. of Civil and Environmental Engineering, University of Florence, Italy
  • 2Interdept. Research Center of Geomatics (CIRGEO), University of Padova, Italy
  • 3School of Rural and Surveying Engineering, National Technical University of Athens, Athens, Greece
  • 4Department of Electrical and Electronic Engineering, The University of Melbourne, Australia
  • 5Department of Civil, Environmental and Geodetic Engineering, The Ohio State University, Columbus, Ohio, USA
  • 6Department of Geodesy and Geoinformation, TU Wien-Vienna University of Technology, Vienna, Austria
  • 7Department of Civil Engineering, Indian Institute of Technology, Kanpur, India
  • 8Department of Geospatial Science, RMIT University, Melbourne, Australia
  • 9Leica Geosystems, Heerbrugg, Switzerland
  • 10Institute of Geodesy of the University of Warmia and Mazury, Olsztyn, Poland
  • 11SMART Infrastructure Facility, University of Wollongong, Australia
  • 12College of Engineering, The Ohio State University, Columbus, Ohio, USA

Keywords: Indoor positioning, Sensor fusion, UWB, SLAM, Deep Learning

Abstract. The increasing demand for reliable indoor navigation systems is leading the research community to investigate various approaches to obtain effective solutions usable with mobile devices. Among the recently proposed strategies, Ultra-Wide Band (UWB) positioning systems are worth mentioning because of their good performance in a wide range of operating conditions. However, this performance can be significantly degraded by large UWB range errors, mostly caused by non-line-of-sight (NLOS) measurements. This paper considers the integration of UWB with vision to support navigation and mapping applications. In particular, this work compares positioning results obtained with a simultaneous localization and mapping (SLAM) algorithm, exploiting a standard and a Time-of-Flight (ToF) camera, with those obtained with UWB alone, and then with the integration of UWB and vision. For the latter, a deep learning-based recognition approach was developed to detect UWB devices in camera frames. This information is both introduced in the navigation algorithm and used to detect NLOS UWB measurements. Integrating this information reduced the positioning error by 20% in the considered case study.
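As an illustration of the NLOS-rejection idea described in the abstract, the sketch below estimates a 2D position from UWB ranges via Gauss-Newton least squares, discarding ranges flagged as NLOS (e.g. by a camera-based detector). The function `trilaterate` and the boolean NLOS flags are illustrative assumptions, not the authors' implementation, which combines such gating with a full SLAM pipeline.

```python
import numpy as np

def trilaterate(anchors, ranges, nlos_flags, p0=None, iters=20):
    """Gauss-Newton position fix from UWB ranges, skipping NLOS-flagged anchors.

    anchors: iterable of (x, y) anchor coordinates
    ranges: measured ranges to each anchor
    nlos_flags: True for ranges to be discarded as NLOS
    """
    anchors = np.asarray(anchors, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    keep = ~np.asarray(nlos_flags, dtype=bool)
    A, r = anchors[keep], ranges[keep]          # keep only LOS measurements
    p = A.mean(axis=0) if p0 is None else np.asarray(p0, dtype=float)
    for _ in range(iters):
        d = np.linalg.norm(A - p, axis=1)       # predicted ranges at current estimate
        J = (p - A) / d[:, None]                # Jacobian of range w.r.t. position
        dp, *_ = np.linalg.lstsq(J, r - d, rcond=None)
        p = p + dp
        if np.linalg.norm(dp) < 1e-6:
            break
    return p
```

With three or more line-of-sight anchors, the 2D fix is recovered even when one range carries a large NLOS bias, as long as that range is correctly flagged and excluded.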