Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XL-4, 181-186, 2014
http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-4/181/2014/
doi:10.5194/isprsarchives-XL-4-181-2014
© Author(s) 2014. This work is distributed
under the Creative Commons Attribution 3.0 License.
 
23 Apr 2014
Panoramic rendering-based polygon extraction from indoor mobile LiDAR data
M. Nakagawa1, K. Kataoka1, T. Yamamoto1, M. Shiozaki2, and T. Ohhashi2
1Dept. of Civil Engineering, Shibaura Institute of Technology, Tokyo, Japan
2Nikon Trimble Co., Ltd., Tokyo, Japan
Keywords: Indoor mobile mapping, Point-cloud, Point-based rendering, Point cloud clustering, 3D polygon extraction

Abstract. In this paper, we propose a method for panoramic point-cloud rendering-based polygon extraction from indoor mobile LiDAR data. Our aim was to improve region-based point-cloud clustering for modeling after point-cloud registration. First, we propose a point-cloud clustering methodology for polygon extraction on a panoramic range image generated with point-based rendering from a massive point cloud. Next, we describe an experiment conducted to verify our methodology with an indoor mobile mapping system in an indoor environment. The experiment involved wall-surface extraction using a point cloud rendered from 64 viewpoints over a wide indoor area. Finally, we confirmed that our proposed methodology could achieve polygon extraction through point-cloud clustering in a complex indoor environment.
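The abstract describes a two-stage pipeline: first render a massive point cloud into a panoramic range image from a chosen viewpoint, then cluster regions on that image to extract wall-surface polygons. As a rough illustration of the rendering stage only, the following is a minimal Python/NumPy sketch of equirectangular range-image generation from a point cloud; the function name, image resolution, and projection details are assumptions for illustration, not the authors' implementation, and the clustering stage evaluated in the paper is not reproduced here.

import numpy as np

def render_panoramic_range_image(points_xyz, viewpoint, width=1024, height=512):
    """Project a point cloud onto an equirectangular (panoramic) range image
    centred at `viewpoint`, keeping the nearest point per pixel (z-buffering).
    NOTE: illustrative sketch only; not the method described in the paper."""
    rel = points_xyz - viewpoint                              # vectors from viewpoint to points
    r = np.linalg.norm(rel, axis=1)                           # range per point
    azimuth = np.arctan2(rel[:, 1], rel[:, 0])                # [-pi, pi]
    elevation = np.arcsin(np.clip(rel[:, 2] / np.maximum(r, 1e-9), -1.0, 1.0))  # [-pi/2, pi/2]

    # Map spherical angles to pixel coordinates (equirectangular projection).
    col = ((azimuth + np.pi) / (2 * np.pi) * width).astype(int) % width
    row = ((np.pi / 2 - elevation) / np.pi * height).astype(int).clip(0, height - 1)

    # Z-buffer: keep the smallest range falling into each pixel.
    flat = row * width + col
    range_flat = np.full(width * height, np.inf)
    np.minimum.at(range_flat, flat, r)                        # unbuffered per-pixel minimum
    return range_flat.reshape(height, width)

# Example use (hypothetical data): render from one of many viewpoints.
# points = np.loadtxt("scan.xyz")                             # (N, 3) registered point cloud
# range_img = render_panoramic_range_image(points, np.array([0.0, 0.0, 1.5]))

Polygon extraction would then operate on such a range image, for example by region growing over range or normal discontinuities, which stands in for the point-cloud clustering step the paper evaluates.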


Citation: Nakagawa, M., Kataoka, K., Yamamoto, T., Shiozaki, M., and Ohhashi, T.: Panoramic rendering-based polygon extraction from indoor mobile LiDAR data, Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XL-4, 181-186, doi:10.5194/isprsarchives-XL-4-181-2014, 2014.
