The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLIII-B1-2022, 235–240, 2022
https://doi.org/10.5194/isprs-archives-XLIII-B1-2022-235-2022
30 May 2022

A LOW-COST VISUAL RADAR-BASED ODOMETRY FRAMEWORK WITH MMWAVE RADAR AND MONOCULAR CAMERA

Y.-E. Lu1, S. Tsai1, M.-L. Tsai2, and K.-W. Chiang1
  • 1Dept. of Geomatics, National Cheng Kung University, Tainan, Taiwan
  • 2High Definition Map Research Center, Dept. of Geomatics, National Cheng Kung University, Tainan, Taiwan

Keywords: Visual Odometry, Radar Odometry, Millimeter Wave Radar, Simultaneous Localization and Mapping, Positioning

Abstract. Low-cost navigation and positioning systems for autonomous vehicles are among the most active research areas. Determining a vehicle's position within a lane is critical for achieving high levels of automation. Vehicle navigation and positioning rely heavily on the Global Navigation Satellite System (GNSS) in open-sky scenarios. However, GNSS signals are easily degraded in environments such as urban canyons, where multi-path effects and Non-Line-of-Sight (NLOS) reception corrupt the measurements. To perform robustly in such complex scenarios, sensor fusion is the most common solution. This paper presents a radar visual odometry framework that compensates for the unknown scale factor of a monocular camera and the poor angular resolution of radar. The framework exploits the complementary characteristics of the camera and radar sensors. The results show that the proposed framework can estimate general 2D motion in an indoor environment and correct the unknown scale factor of monocular visual odometry in a real-world setting.
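The core idea of the scale correction can be illustrated with a minimal sketch: monocular visual odometry yields translations that are correct only up to an unknown scale, while radar provides metric range-rate/translation measurements, so a single scale factor can be estimated by aligning the two. The function below is a hypothetical least-squares helper under this assumption, not the paper's exact formulation.

```python
import numpy as np

def estimate_scale(vo_translations, radar_translations):
    """Least-squares scale s minimizing sum (s*|t_vo| - |t_radar|)^2,
    aligning scale-ambiguous monocular-VO step lengths with metric
    step lengths derived from radar. Hypothetical helper for illustration."""
    vo = np.linalg.norm(vo_translations, axis=1)      # unit-less VO step lengths
    radar = np.linalg.norm(radar_translations, axis=1)  # metric step lengths
    return float(vo @ radar / (vo @ vo))

# Toy example: VO steps in arbitrary units; radar observes the same motion
# in metres (here synthesized with a true scale of 2.5 plus small noise).
vo_t = np.array([[0.10, 0.00], [0.10, 0.02], [0.09, 0.00]])
radar_t = 2.5 * vo_t + np.random.default_rng(0).normal(0, 0.001, vo_t.shape)
s = estimate_scale(vo_t, radar_t)
metric_vo_t = s * vo_t  # VO trajectory rescaled to metric units
```

In practice the paper's framework would fold such a constraint into the full 2D motion estimation rather than solving it in isolation, but the sketch shows why the two sensors are complementary: radar supplies the metric scale the monocular camera lacks.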