The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLIII-B1-2022, 257–262, 2022
https://doi.org/10.5194/isprs-archives-XLIII-B1-2022-257-2022
30 May 2022

AUTOMOTIVE RADAR BASED LEAN DETECTION OF VEHICLES

A. Moussa1,2 and N. El-Sheimy1
  • 1Dept. of Geomatics Engineering, University of Calgary, Calgary, T2N 1N4, Canada
  • 2Department of Electrical Engineering, Port-Said University, Port Said, Egypt

Keywords: Vehicle, Detection, Automotive Radar

Abstract. One of the most critical capabilities of autonomous vehicles is the detection of active road objects such as vehicles and pedestrians. Detecting these objects supports the vehicle's navigation planning and manoeuvre decision-making, enabling safe and efficient navigation. Deep Convolutional Neural Networks (CNNs) have recently become one of the state-of-the-art approaches to solving detection problems, particularly in the autonomous vehicle domain. However, deep CNNs typically employ many processing layers with a large number of kernels per layer to detect the target classes, which also demands powerful hardware units. In this research, we present a tailored lean detection strategy for vehicle detection using radar observations. The proposed method employs a compact set of convolutions, pixel classification, and a customized selection of kernels and kernel sizes to provide an efficient technique that greatly decreases the detection burden and enables real-time processing on average processing units. The convolution window sizes and the pixel classifiers are trained on a training dataset. Finally, the pixel-classified grids are processed to identify the vehicles' bounding boxes. Experimental datasets were collected using medium-range radar sensors mounted on top of a vehicle to evaluate the suggested approach; the Intersection over Union (IoU) values of the test scenes' detections range from 0.51 to 0.78.
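The abstract's final step, turning a pixel-classified grid into vehicle bounding boxes, and the reported IoU metric can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the grouping rule (4-connected components over vehicle-class pixels) and the box format `(row_min, col_min, row_max, col_max)` are assumptions made for the example.

```python
def extract_boxes(grid):
    """Group vehicle-class pixels (value 1) of a classified grid into
    4-connected components and return one bounding box per component,
    as (row_min, col_min, row_max, col_max). Assumed grouping rule."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    boxes = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and not seen[r][c]:
                # Flood-fill this component, tracking its extents.
                stack = [(r, c)]
                seen[r][c] = True
                r0 = r1 = r
                c0 = c1 = c
                while stack:
                    y, x = stack.pop()
                    r0, r1 = min(r0, y), max(r1, y)
                    c0, c1 = min(c0, x), max(c1, x)
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                boxes.append((r0, c0, r1, c1))
    return boxes


def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes
    given as (x_min, y_min, x_max, y_max)."""
    ix_min = max(box_a[0], box_b[0])
    iy_min = max(box_a[1], box_b[1])
    ix_max = min(box_a[2], box_b[2])
    iy_max = min(box_a[3], box_b[3])
    inter = max(0.0, ix_max - ix_min) * max(0.0, iy_max - iy_min)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

For instance, a detection `(0, 0, 2, 2)` against a ground-truth box `(1, 1, 3, 3)` overlaps in a unit square, giving an IoU of 1/7, well below the 0.51 to 0.78 range the paper reports for its test scenes.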