Volume XL-4/W5
Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XL-4/W5, 113-118, 2015
https://doi.org/10.5194/isprsarchives-XL-4-W5-113-2015
© Author(s) 2015. This work is distributed under
the Creative Commons Attribution 3.0 License.

  11 May 2015

USE OF ASSISTED PHOTOGRAMMETRY FOR INDOOR AND OUTDOOR NAVIGATION PURPOSES

D. Pagliari, N. E. Cazzaniga, and L. Pinto
  • DICA-Dept. of Civil and Environmental Engineering, Politecnico di Milano, Milan, Italy

Keywords: outdoor, indoor, navigation, photogrammetry, Kinect, urban maps, GNSS, depth images

Abstract. Nowadays, the number of devices and applications that require navigation solutions is continuously growing. Consider, for instance, the increasing demand for mapping information or the development of applications based on users' location. In some cases an approximate solution (e.g. at room level) may be sufficient, but in most cases a better solution is required.

The navigation problem has long been solved using Global Navigation Satellite Systems (GNSS). However, GNSS can be useless in obstructed areas, such as urban canyons or inside buildings. An interesting low-cost alternative is photogrammetry, assisted by additional information to scale the photogrammetric problem and to recover a solution even in situations that are critical for image-based methods (e.g. poorly textured surfaces). In this paper, the use of assisted photogrammetry has been tested in both outdoor and indoor scenarios. The outdoor navigation problem has been addressed by developing a positioning system that uses Ground Control Points extracted from urban maps as constraints, together with tie points automatically extracted from the images acquired during the survey. The proposed approach has been tested under different scenarios, recovering the followed trajectory with an accuracy of 0.20 m.
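As a generic illustration of how Ground Control Points can constrain an image-based positioning problem (a textbook sketch, not the authors' implementation), the classic Direct Linear Transform estimates a camera projection matrix from known 3D GCP coordinates and their measured image coordinates:

```python
import numpy as np

def dlt_resection(X, x):
    """Estimate a 3x4 projection matrix P from >= 6 Ground Control Points.

    X: (N, 3) object-space coordinates of the GCPs
    x: (N, 2) corresponding image coordinates
    Returns P (up to scale) minimizing the algebraic DLT error.
    """
    A = []
    for (Xw, Yw, Zw), (u, v) in zip(X, x):
        Xh = [Xw, Yw, Zw, 1.0]  # homogeneous object point
        # Two equations per correspondence (standard DLT rows)
        A.append([0.0] * 4 + [-c for c in Xh] + [v * c for c in Xh])
        A.append(Xh + [0.0] * 4 + [-u * c for c in Xh])
    # Solution is the right singular vector of the smallest singular value
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)
```

In practice a photogrammetric survey would refine such an estimate in a bundle adjustment together with the automatically extracted tie points; the DLT only shows how GCPs pin down the camera pose and scale.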

For indoor navigation, a solution has been devised that integrates the data delivered by the Microsoft Kinect, identifying interesting features on the RGB images and re-projecting them onto the point clouds generated from the delivered depth maps. These points have then been used to estimate the rotation matrix between subsequent point clouds and, consequently, to recover the trajectory with an error of a few centimetres.
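The alignment of matched 3D points between subsequent point clouds is commonly solved in closed form with the SVD-based Kabsch/Procrustes method. The sketch below is a generic version of that step (it is not taken from the paper; point matching and outlier rejection are assumed to have been done already):

```python
import numpy as np

def estimate_rigid_transform(P, Q):
    """Estimate rotation R and translation t such that Q ~ R @ P + t.

    P, Q: (3, N) arrays of matched 3D points from two subsequent clouds.
    Uses the SVD-based Kabsch/Procrustes closed-form solution.
    """
    cp = P.mean(axis=1, keepdims=True)  # centroid of first cloud
    cq = Q.mean(axis=1, keepdims=True)  # centroid of second cloud
    H = (P - cp) @ (Q - cq).T           # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

Chaining the estimated transforms frame by frame yields the sensor trajectory; in a real pipeline the frame-to-frame drift would additionally be controlled with robust matching and loop closure.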