The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLVI-3/W1-2022, 169–175, 2022
https://doi.org/10.5194/isprs-archives-XLVI-3-W1-2022-169-2022
22 Apr 2022

INDOOR ATTITUDE ESTIMATION USING EQUIPPED GYROSCOPES AND DEPTH SENSORS

Q. Shi1,2, Z. Song1, Z. Xiao2, S. Chen2, and F. Wang3
  • 1Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China
  • 2Orbbec Inc., Shenzhen 518062, China
  • 3Oxin Inc., Shenzhen 518000, China

Keywords: Attitude estimation, Indoor localization, Sensor fusion, Depth sensor, Error-state Kalman filter, Manhattan world

Abstract. Attitude estimation is central to a wide range of applications such as robotics, virtual reality, and mobile smart devices. With the development of sensor technologies, these devices are often equipped with gyroscopes and depth sensors. In this paper, we propose a novel method that fuses gyroscope and depth information for drift-free and robust attitude estimation in structured indoor environments. Our method relies on the depth information and the Manhattan world assumption to estimate the absolute orientation, which is then fused to correct the accumulated error of the gyroscope-determined attitude. We first apply the mean shift algorithm on the unit sphere to align the surface normals from the depth measurements with the orthogonal planar structures of the Manhattan world; the resulting orientation estimates are therefore drift-free and absolute with respect to the Manhattan world. We then fuse these orientation estimates with the gyroscope measurements via an error-state Kalman filter to further improve the accuracy and robustness of the attitude estimation. We validate our method on public datasets, demonstrating its robustness and accuracy for attitude estimation.
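To illustrate the first step described in the abstract, the following is a minimal sketch of mean shift on the unit sphere: starting from a seed direction, it repeatedly takes a kernel-weighted mean of the surface normals (weighted by angular distance) and re-projects the result onto the sphere, converging toward a dominant normal direction of the scene. The function name, the Gaussian kernel, and the bandwidth value are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def mean_shift_on_sphere(normals, seed, bandwidth_deg=10.0, iters=30):
    """Shift a seed direction toward a local mode of unit surface normals.

    Illustrative sketch (not the paper's exact algorithm): weights come
    from a Gaussian kernel on the angular distance between each normal
    and the current mode estimate; the weighted mean is re-normalized
    back onto the unit sphere after every iteration.
    """
    m = seed / np.linalg.norm(seed)          # current mode estimate on S^2
    bw = np.deg2rad(bandwidth_deg)           # kernel bandwidth in radians
    for _ in range(iters):
        # angular distance of every normal to the current mode
        cos_ang = np.clip(normals @ m, -1.0, 1.0)
        ang = np.arccos(cos_ang)
        w = np.exp(-0.5 * (ang / bw) ** 2)   # Gaussian kernel weights
        m_new = (w[:, None] * normals).sum(axis=0)
        norm = np.linalg.norm(m_new)
        if norm < 1e-12:                     # degenerate: no support nearby
            break
        m_new /= norm                        # project back onto the sphere
        if np.dot(m_new, m) > 1.0 - 1e-10:   # converged
            m = m_new
            break
        m = m_new
    return m
```

In a Manhattan world, running this procedure from several seeds would recover up to three mutually orthogonal dominant directions, from which an absolute orientation can be assembled.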