The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLIII-B1-2021, 215–220, 2021
https://doi.org/10.5194/isprs-archives-XLIII-B1-2021-215-2021

28 Jun 2021

IMAGE-BASED ORIENTATION DETERMINATION OF MOBILE SENSOR PLATFORMS

O. Hasler and S. Nebiker
  • Institute of Geomatics Engineering, FHNW University of Applied Sciences and Arts Northwestern Switzerland, Muttenz, Switzerland

Keywords: orientation estimation, image-based, panorama images, robotics, mobile sensor platforms

Abstract. Estimating the pose of a mobile robotic platform is a challenging task, especially when the pose must be expressed in a global or local reference frame and when the estimation has to be performed while the platform is moving. While the position of a platform can be measured directly with modern tachymetry or with a global navigation satellite system (GNSS), the absolute platform orientation is harder to derive. Most often, only the relative orientation is estimated with sensors mounted on the robotic platform, such as an IMU, one or more cameras, a laser scanner, or a combination of these; a sensor fusion of the relative orientation and the absolute position is then performed. In this work, an additional approach is presented: first, an image-based relative pose estimation is performed on frames from a panoramic camera using a state-of-the-art visual odometry implementation. Secondly, the position of the platform in a reference system is estimated using motorized tachymetry. Lastly, the absolute orientation is calculated using a visual marker placed in the space in which the robotic platform is moving. The marker can be detected in the camera frame, and since the position of this marker is known in the reference system, the absolute pose can be estimated. To improve the absolute pose estimation, a sensor fusion is conducted. Results with a Lego model train as the mobile platform show that the absolute-pose trajectories, calculated independently with four different markers, deviate by less than 0.66 degrees 50% of the time and that the average difference is less than 1.17 degrees. The implementation is based on the popular Robot Operating System (ROS).
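The core geometric idea of the third step can be illustrated with a minimal sketch: if the marker's position is known in the reference frame and the platform's position is supplied by the tachymeter, the bearing to the marker observed in the camera frame fixes the platform's absolute heading, and comparing it with the visual-odometry heading yields the constant offset between the VO frame and the reference frame. This is a simplified 2D (yaw-only) illustration with hypothetical function and parameter names, not the authors' actual ROS implementation, which fuses full 3D poses.

```python
import numpy as np


def absolute_yaw(marker_bearing_cam, vo_yaw, platform_pos_ref, marker_pos_ref):
    """Sketch of absolute heading recovery from a single marker sighting.

    marker_bearing_cam : bearing of the detected marker in the camera frame [rad]
    vo_yaw             : platform yaw from visual odometry (relative frame) [rad]
    platform_pos_ref   : platform position from the tachymeter, (x, y) in the
                         reference frame
    marker_pos_ref     : known marker position, (x, y) in the reference frame

    Returns (absolute yaw, VO-to-reference yaw offset), both in radians.
    """
    # Bearing from the platform to the marker, expressed in the reference frame
    d = np.asarray(marker_pos_ref, float) - np.asarray(platform_pos_ref, float)
    bearing_ref = np.arctan2(d[1], d[0])

    # The camera sees the marker at marker_bearing_cam relative to its own
    # heading, so the absolute heading is the reference bearing minus that.
    yaw_abs = bearing_ref - marker_bearing_cam

    # Constant rotation between the visual-odometry frame and the reference
    # frame; wrapped to (-pi, pi] so repeated sightings can be averaged/fused.
    offset = yaw_abs - vo_yaw
    offset = np.arctan2(np.sin(offset), np.cos(offset))
    return yaw_abs, offset


# Example: platform at the origin, marker at (1, 1) -> reference bearing 45 deg.
# Marker seen straight ahead, so the absolute yaw is 45 deg; with a VO yaw of
# 10 deg, the frame offset is 35 deg.
yaw_abs, offset = absolute_yaw(0.0, np.deg2rad(10.0), (0.0, 0.0), (1.0, 1.0))
print(np.rad2deg(yaw_abs), np.rad2deg(offset))
```

In a fused system, this offset would be estimated once per marker sighting and filtered over time, rather than applied from a single detection.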