Publication Details
Overview
 
 
Kheireddine Aziz, E. De Greef, Maxim Rykunov, Andre Bourdoux, Hichem Sahli
 

2020 IEEE Radar Conference (RadarConf20)

Contribution To Book Anthology

Abstract 

This paper presents a radar and camera sensor fusion framework as a vulnerable road user (VRU) perception system that can automatically detect, track and classify different targets on the road. The first module of the system performs a spatio-temporal alignment, on a common plane, of the detections provided by the radar signal processing and video processing modules. The second module is dedicated to data association of the aligned detections. A centralized fusion algorithm takes the current set of aligned detections (locations and labels) from both sensors as input and performs multi-object tracking with a joint probabilistic data association filter (JPDAF) built on a Kalman filter. The proposed radar/camera fusion system is experimentally evaluated on multi-object tracking scenarios. The experimental results demonstrate its reliability and effectiveness compared to a single-sensor system.
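
The tracking stage described in the abstract pairs a Kalman filter with joint probabilistic data association. As a rough, simplified illustration of that idea (not the authors' implementation), the sketch below shows a single-track, PDA-style update in Python: the motion model, gating threshold, detection probability and clutter density are illustrative assumptions, and the paper's JPDAF additionally resolves associations across multiple tracks jointly.

```python
import numpy as np

def predict(x, P, F, Q):
    """Standard Kalman prediction step (e.g. a constant-velocity motion model)."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def pda_update(x, P, zs, H, R, p_detect=0.9, clutter_density=1e-3, gate=9.21):
    """Simplified single-track probabilistic data association update.

    zs: candidate measurements for this frame (one row per detection, e.g. the
    spatio-temporally aligned radar/camera detections on the common plane).
    Didactic sketch only, not the JPDAF used in the paper.
    """
    S = H @ P @ H.T + R                      # innovation covariance
    S_inv = np.linalg.inv(S)
    K = P @ H.T @ S_inv                      # Kalman gain

    # Gate measurements by Mahalanobis distance (chi-square threshold).
    innovations, likelihoods = [], []
    for z in zs:
        v = z - H @ x
        d2 = float(v @ S_inv @ v)
        if d2 < gate:
            norm = 1.0 / np.sqrt(np.linalg.det(2 * np.pi * S))
            innovations.append(v)
            likelihoods.append(p_detect * norm * np.exp(-0.5 * d2))

    if not innovations:
        return x, P                           # nothing associated this frame

    # Association probabilities, including a "missed detection" hypothesis.
    miss = (1.0 - p_detect) * clutter_density
    weights = np.array([miss] + likelihoods)
    weights /= weights.sum()
    beta0, betas = weights[0], weights[1:]

    # Combined innovation and Kalman-style update weighted by the betas.
    v_comb = sum(b * v for b, v in zip(betas, innovations))
    x_new = x + K @ v_comb
    P_upd = P - K @ S @ K.T
    # Spread-of-innovations term accounts for association uncertainty.
    spread = sum(b * np.outer(v, v) for b, v in zip(betas, innovations)) - np.outer(v_comb, v_comb)
    P_new = beta0 * P + (1 - beta0) * P_upd + K @ spread @ K.T
    return x_new, P_new
```

In the pipeline described above, the measurements fed to such an update would be the aligned and associated detections produced by the first two modules, with one filter maintained per tracked VRU.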
