SYNTHETIC VISION SYSTEM CALIBRATION FOR CONFORMAL PROJECTION ON THE PILOT'S HEAD-UP DISPLAY
Keywords: camera calibration, external orientation, head-up display, synthetic vision
Abstract. The crew's situational awareness is critical to flight safety. A head-up display (HUD) presents all required flight information in front of the pilot, overlaid on the outside view visible through the cockpit's front window; the device was created to reduce the pilot's informational overload. While computer graphics such as scales and a digital terrain model can easily be rendered on such a display, aligning sensor data correctly on the HUD is challenging. The main difficulty is the parallax between the pilot's eyes and the position of the camera. This paper focuses on the development of an online calibration algorithm for conformal projection of 3D terrain and runway models on the pilot's HUD. The aim of our algorithm is to align the objects visible through the cockpit glass with their projections on the HUD. To improve the projection accuracy, we use an additional optical sensor installed on the aircraft, combining classical photogrammetric techniques with modern deep learning approaches. First, we use an object detection neural network to find the runway area and align the runway projection with its actual location. Second, we re-project the sensor's image onto the 3D terrain model to eliminate errors caused by the parallax. To evaluate the algorithm, we developed an environment simulator, with which we prepared a large training dataset of 2,000 images from video sequences representing aircraft motion during takeoff, landing, and taxiing. The evaluation results are encouraging and demonstrate, both qualitatively and quantitatively, that the proposed algorithm achieves precise alignment of 3D models projected on a HUD.
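The parallax correction described in the abstract can be illustrated with a minimal sketch: a pixel from the aircraft's sensor is back-projected onto the terrain, and the recovered 3D point can then be re-projected into a second camera placed at the pilot's eye position. This sketch assumes a pinhole camera model and locally flat terrain; the function names and the flat-plane simplification are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def project(K, R, t, X):
    """Project a 3D world point X (shape (3,)) to pixel coordinates
    using the pinhole model x ~ K (R X + t)."""
    x = K @ (R @ X + t)
    return x[:2] / x[2]

def backproject_to_plane(K, R, t, pixel, plane_z=0.0):
    """Cast a ray from the camera through `pixel` and intersect it with
    the horizontal terrain plane z = plane_z (flat-terrain assumption)."""
    C = -R.T @ t                       # camera center in world coordinates
    d = R.T @ np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    s = (plane_z - C[2]) / d[2]        # scale along the ray to reach the plane
    return C + s * d

# Illustrative setup: sensor camera 10 m from the ground plane, identity rotation.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0,   0.0,   1.0]])
R = np.eye(3)
t_sensor = np.array([0.0, 0.0, 10.0])   # sensor camera pose (world-to-camera)

runway_corner = np.array([1.0, 2.0, 0.0])          # a point on the terrain
px = project(K, R, t_sensor, runway_corner)        # where the sensor sees it
recovered = backproject_to_plane(K, R, t_sensor, px)  # re-projected onto terrain

# Re-project the recovered 3D point into a camera at the pilot's eye,
# offset from the sensor by a small baseline; this yields the
# parallax-corrected position for drawing the symbol on the HUD.
t_eye = np.array([-0.5, 0.0, 10.0])
hud_px = project(K, R, t_eye, recovered)
```

The key point of the sketch is that drawing the sensor pixel directly on the HUD would misplace the symbol by the parallax offset, whereas routing it through the 3D terrain intersection puts it where the pilot actually sees the object.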