The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLIII-B2-2021, 399–404, 2021
https://doi.org/10.5194/isprs-archives-XLIII-B2-2021-399-2021

28 Jun 2021

REAL-TIME SEMANTIC SLAM WITH DCNN-BASED FEATURE POINT DETECTION, MATCHING AND DENSE POINT CLOUD AGGREGATION

B. Vishnyakov, I. Sgibnev, V. Sheverdin, A. Sorokin, P. Masalov, K. Kazakhmedov, and S. Arseev
  • FGUP «State Research Institute of Aviation Systems», 7, Viktorenko Street, Moscow, 125319, Russia

Keywords: multi-sensor platform, autonomous vehicle, SLAM, CNN, dynamic scene analysis, semantic segmentation, data fusion, dynamic scene reconstruction, dense point cloud, feature point, stereo disparity estimation

Abstract. In this paper we present a semantic SLAM method based on a bundle of deep convolutional neural networks. It provides real-time dense semantic scene reconstruction for the autonomous driving system of an off-road robotic vehicle. Most state-of-the-art neural networks require large computing resources that exceed the capabilities of many robotic platforms. We propose an architecture for 3D semantic scene reconstruction that builds on recent progress in computer vision by integrating SuperPoint, SuperGlue, Bi3D, DeepLabV3+, RTM3D and an additional module, with pre-processing, inference and post-processing operations performed on the GPU. We also updated our simulated dataset for semantic segmentation and added disparity images.
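As a rough illustration of the pipeline the abstract describes, the sketch below chains stand-in stages for the networks (SuperPoint for keypoint detection, SuperGlue for matching, Bi3D for stereo disparity, DeepLabV3+ for segmentation) into a per-frame loop that accumulates a semantic point cloud. All function names, signatures, and data shapes here are hypothetical placeholders, not the authors' implementation; the real system runs these stages as GPU inference.

```python
# Hypothetical sketch of a per-frame semantic SLAM pipeline.
# Each stage function stands in for a network from the paper; the
# returned values are dummies that only illustrate the data flow.

from dataclasses import dataclass, field

@dataclass
class Frame:
    stereo_pair: tuple                     # (left image, right image)
    keypoints: list = field(default_factory=list)
    matches: list = field(default_factory=list)
    disparity: object = None               # per-pixel disparity map
    semantics: object = None               # per-pixel class map

def detect_keypoints(frame):               # SuperPoint stand-in
    frame.keypoints = [(10, 12), (40, 7)]  # dummy (u, v) detections
    return frame

def match_keypoints(prev, frame):          # SuperGlue stand-in
    if prev is not None:
        frame.matches = list(zip(prev.keypoints, frame.keypoints))
    return frame

def estimate_disparity(frame):             # Bi3D stand-in
    frame.disparity = "disparity-map"
    return frame

def segment(frame):                        # DeepLabV3+ stand-in
    frame.semantics = "class-map"
    return frame

def aggregate(cloud, frame):
    # Fuse disparity (depth) and semantic labels at detected points
    # into the growing dense semantic point cloud.
    for (u, v) in frame.keypoints:
        cloud.append((u, v, frame.disparity, frame.semantics))
    return cloud

def run(frames):
    cloud, prev = [], None
    for f in frames:
        f = detect_keypoints(f)
        f = match_keypoints(prev, f)
        f = estimate_disparity(f)
        f = segment(f)
        cloud = aggregate(cloud, f)
        prev = f
    return cloud
```

The sequential structure is only for clarity; in a real-time system the stages would run as batched GPU inference, with matching linking each frame to its predecessor for pose tracking.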