DLRAD – A FIRST LOOK AT THE NEW VISION AND MAPPING BENCHMARK DATASET FOR AUTONOMOUS DRIVING
- 1 German Aerospace Center (DLR), Remote Sensing Technology Institute, Wessling, Germany
- 2 German Aerospace Center (DLR), Institute of Transportation Systems, Braunschweig, Germany
Keywords: Airborne camera, Vehicle sensors, Benchmark dataset, Autonomous driving, Sensor fusion
Abstract. DLRAD, a new vision and mapping benchmark dataset for autonomous driving, is under development for the validation of intelligent driving algorithms. Stationary, mobile, and airborne sensors simultaneously monitored the environment around a reference vehicle driving on urban, suburban, and rural roads in and around the city of Braunschweig, Germany. Airborne images were acquired with the DLR 4k sensor system mounted on a helicopter. The DLR research car FASCarE is equipped with the latest sensor technology, such as front/rear radar, ultrasound and laser sensors, optical single and stereo cameras, and GNSS/IMU. Additionally, stationary terrestrial sensors such as induction loops, optical mono and stereo cameras, radar, and laser scanners monitored defined sections of the route from the ground. Meanwhile, the helicopter with the 4k sensor system followed the reference car, keeping it in the central nadir view at all times. The next crucial step in the construction of the DLRAD benchmark dataset is the annotation of all objects in the reference dataset.
The DLRAD benchmark dataset enables a wide variety of validation tasks and opens a broad field of possibilities for the development, training, and validation of machine learning algorithms in the context of autonomous driving. In this paper, we present details of the sensor configurations and the acquisition campaign, which took place from 18 to 20 July 2017 in Braunschweig, Germany. We also show a first analysis of the data, including its completeness and geometric quality. The dataset will be published as soon as the coregistration and annotations are complete.