POINT CLOUD TRANSFORMATION USING SENSOR CALIBRATION INFORMATION FOR MAP DATA ADJUSTMENT
- 1 Dept. of Civil & Environmental Engineering, Yonsei University, 50 Yonsei-ro, Seoul, Korea
- 2 Stryx Co., CTO, 50-1 Yonsei-ro, Seoul, Korea
Keywords: Point cloud, Mobile mapping system, Calibration, High-precision map, LiDAR, Bird's eye view
Abstract. To operate autonomous vehicles and unmanned delivery vehicles, it is important to accurately determine the location of the device itself. However, since these devices mainly operate in urban areas, there are limits to obtaining location information from GNSS alone. It is therefore necessary to calibrate the device's own location by measuring reference points provided by an existing high-precision map of the region. Point-cloud-based, multi-dimensional high-precision maps of infrastructure such as roads are acquired in advance using high-performance LiDAR and GNSS devices, and they provide reference points for autonomous driving or map updating. Because such high-performance surveying equipment is expensive, it is difficult to mount on autonomous or unmanned vehicles for commercialization. Autonomous vehicles and unmanned delivery vehicles therefore operate with relatively low-performance LiDAR and GNSS, so it is often impossible to measure reference points accurately, which directly reduces the accuracy of the device's location information. To compensate for this, this study proposes a point interpolation method that extracts GCP information from sparse point cloud maps acquired with low-performance LiDAR. The proposed method uses the calibration parameters between the point cloud and the image data acquired from the device. Images generally provide higher resolution than point clouds, even with low-end cameras, so the position of point coordinates relative to a reference point can be measured relatively accurately from the image and the projected point cloud. The data acquisition platform is an MMS vehicle that provides a panoramic image from four DSLR cameras and a point cloud from a Velodyne VLP-16. The researchers first conducted a reference point survey on features such as road signs.
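The projection of LiDAR points into the image using camera calibration parameters can be sketched as follows. This is a minimal illustration of the standard pinhole model, not the paper's implementation; the intrinsic matrix `K` and extrinsics `R`, `t` are placeholder values, not the actual MMS calibration.

```python
import numpy as np

# Illustrative calibration parameters (NOT from the paper):
# K is the camera intrinsic matrix; R, t map the LiDAR frame to the camera frame.
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                      # assumed extrinsic rotation
t = np.array([0.0, -0.1, 0.2])    # assumed extrinsic translation (metres)

def project_points(points_lidar):
    """Project Nx3 LiDAR points to pixel coordinates via the pinhole model."""
    pts_cam = points_lidar @ R.T + t      # LiDAR frame -> camera frame
    pts_cam = pts_cam[pts_cam[:, 2] > 0]  # keep only points in front of the camera
    uvw = pts_cam @ K.T                   # apply intrinsics
    return uvw[:, :2] / uvw[:, 2:3]       # perspective divide -> (u, v) pixels

pts = np.array([[0.0, 0.0, 5.0],
                [1.0, 0.5, 4.0]])
uv = project_points(pts)
```

A point on the optical axis projects near the principal point (640, 360); points measured this way can then be matched against surveyed reference points.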
The panoramic image containing the road sign was transformed into a bird's eye view, and the point cloud was projected onto the bird's-eye-view image. The reference point coordinates, which were not captured by the point cloud, were obtained from the shape of the road sign in the bird's-eye-view image, and their accuracy was compared with the surveyed data.
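One simple way to recover 3D coordinates for a reference point that the sparse point cloud missed is to interpolate from nearby projected points. The sketch below uses inverse-distance weighting over the k nearest projected LiDAR points around a queried pixel (e.g. a road-sign corner); the function and parameter names are illustrative, not the paper's actual interpolation scheme.

```python
import numpy as np

def interpolate_gcp(query_uv, proj_uv, points_xyz, k=4):
    """Estimate 3D coordinates at an image pixel by inverse-distance
    weighting of the k nearest projected LiDAR points.

    query_uv   : (2,)  pixel where the reference point was identified
    proj_uv    : (N,2) pixel coordinates of projected LiDAR points
    points_xyz : (N,3) corresponding 3D coordinates of those points
    """
    dist = np.linalg.norm(proj_uv - query_uv, axis=1)
    idx = np.argsort(dist)[:k]              # k nearest projected points
    w = 1.0 / (dist[idx] + 1e-9)            # inverse-distance weights
    w /= w.sum()
    return (points_xyz[idx] * w[:, None]).sum(axis=0)

# Toy example: four projected points around the queried pixel (1, 1).
proj = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0], [2.0, 2.0]])
xyz = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0],
                [0.0, 2.0, 0.0], [2.0, 2.0, 0.0]])
gcp = interpolate_gcp(np.array([1.0, 1.0]), proj, xyz)
```

With the queried pixel equidistant from all four neighbours, the estimate reduces to their mean, (1, 1, 0); in practice the weights favour the closest returns around the sign.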