DETERMINATION OF UAS TRAJECTORY IN A KNOWN ENVIRONMENT FROM FPV VIDEO
Geomatics Engineering, GeoICT Lab, Department of Earth and Space Science and Engineering, Lassonde School of Engineering, York University, Toronto, Ontario, M3J 1P3, Canada
Keywords: Unmanned Aerial Systems, First-Person View (FPV) Video, Georeferencing, Map-based Navigation
Abstract. This paper presents a novel self-localization method for Unmanned Aerial Systems (UAS). The algorithm automatically establishes correspondences between the First-Person View (FPV) video streamed from a UAS flying in a structured urban environment and a 3D model of that environment. The resulting camera pose provides a precise navigation solution in this densely built environment. First, vertical line features are extracted from a streamed FPV video frame; this is feasible because the camera is kept approximately level by a gimbal system. These features are then matched against vertical line features extracted from a synthetic image rendered from the 3D model. A space resection is performed to estimate the exterior orientation parameters (EOPs) of the frame. The features are then tracked into the next frame, followed by an incremental triangulation. The main contribution of this paper lies in automating this process, as an FPV video sequence typically consists of thousands of frames. Accuracies of the estimated position and orientation parameters of the video camera, together with validation checks of these estimates, are presented. Future work includes real-time testing to determine latency and reliability, and the use of a multi-directional field of view for the FPV video camera.
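The abstract's first step, keeping only near-vertical line segments because the gimbal holds the camera approximately level, can be illustrated with a minimal sketch. This is not the paper's implementation: the segment format, the `max_tilt_deg` tolerance, and the function name are assumptions chosen for illustration; in practice the segments would come from a line detector run on each video frame.

```python
import math

def vertical_line_filter(segments, max_tilt_deg=5.0):
    """Keep segments whose direction is within max_tilt_deg of the image vertical.

    segments: list of (x1, y1, x2, y2) endpoints in pixel coordinates.
    Assumes a roughly leveled camera, so vertical building edges in the
    scene project to near-vertical line segments in the image.
    """
    keep = []
    for x1, y1, x2, y2 in segments:
        dx, dy = x2 - x1, y2 - y1
        if dx == 0 and dy == 0:
            continue  # degenerate zero-length segment
        # Angle (degrees) between the segment and the image's vertical axis.
        tilt = math.degrees(math.atan2(abs(dx), abs(dy)))
        if tilt <= max_tilt_deg:
            keep.append((x1, y1, x2, y2))
    return keep

# Example: one exactly vertical, one slightly tilted, one horizontal segment.
segs = [(100, 10, 100, 200), (50, 10, 53, 200), (10, 50, 200, 50)]
print(vertical_line_filter(segs))  # keeps the first two, drops the horizontal one
```

The same filtering would be applied both to the FPV frame and to the synthetic image of the 3D model, so that only comparable vertical features enter the matching stage.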