This paper presents a radar and camera sensor fusion framework as a vulnerable road user (VRU) perception system that can automatically detect, track and classify different targets on the road. The first module of the system performs a spatial-temporal alignment, on a common plane, of detections provided by the radar signal processing and video processing modules. The second module is dedicated to data association of the aligned detections. A centralized fusion algorithm takes the current aligned detection set (locations and labels) as inputs from both sensors and performs multi-object tracking with a joint probabilistic data association (JPDAF) algorithm built on the Kalman filter. The proposed radar/camera fusion system is experimentally evaluated through multi-object tracking scenarios. The experimental results demonstrate its reliability and effectiveness compared to a single-sensor system.
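The tracking backbone described in the abstract pairs a data-association step with a Kalman filter. The paper itself uses JPDAF; as an illustrative sketch only, the single-target constant-velocity Kalman filter below shows the predict/update cycle that would underlie it. All matrices, noise levels, and the class name are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

# Illustrative constant-velocity Kalman filter for 2D target tracking.
# The paper's tracker runs JPDAF on top of such a filter; this sketch
# shows only the per-track predict/update backbone. Model and noise
# parameters are assumed values, not taken from the paper.
class KalmanCV2D:
    def __init__(self, x0, y0, dt=0.1):
        # State vector: [x, y, vx, vy]
        self.x = np.array([x0, y0, 0.0, 0.0])
        self.P = np.eye(4) * 10.0                  # initial state uncertainty
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], float)  # constant-velocity motion model
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], float)   # only position is observed
        self.Q = np.eye(4) * 0.01                  # process noise (assumed)
        self.R = np.eye(2) * 0.5                   # measurement noise (assumed)

    def predict(self):
        # Propagate state and covariance one time step forward.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        # z: an aligned 2D detection (e.g. a fused radar/camera position).
        y = z - self.H @ self.x                   # innovation
        S = self.H @ self.P @ self.H.T + self.R   # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

In a JPDAF tracker, the innovation covariance `S` computed in `update` would also drive the gating and association-probability weighting across all detections, rather than a single measurement being applied per track as shown here.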
Aziz, K, De Greef, E, Rykunov, M, Bourdoux, A & Sahli, H 2020, Radar-camera Fusion for Road Target Classification. In 2020 IEEE Radar Conference (RadarConf20)., 9266510, IEEE, pp. 1-6. https://doi.org/10.1109/RadarConf2043947.2020.9266510
@inproceedings{b984dd6555aa47379bdaf865c9dd718e,
title = "Radar-camera Fusion for Road Target Classification",
abstract = "This paper presents a radar and camera sensor fusion framework as a vulnerable road user (VRU) perception system that can automatically detect, track and classify different targets on the road. The first module of the system performs a spatial-temporal alignment, on a common plane, of detections provided by the radar signal processing and video processing modules. The second module is dedicated to data association of the aligned detections. A centralized fusion algorithm takes the current aligned detection set (locations and labels) as inputs from both sensors and performs multi-object tracking with a joint probabilistic data association (JPDAF) algorithm built on the Kalman filter. The proposed radar/camera fusion system is experimentally evaluated through multi-object tracking scenarios. The experimental results demonstrate its reliability and effectiveness compared to a single-sensor system.",
author = "Kheireddine Aziz and {De Greef}, E. and Maxim Rykunov and Andr{\'e} Bourdoux and Hichem Sahli",
year = "2020",
month = sep,
day = "21",
doi = "10.1109/RadarConf2043947.2020.9266510",
language = "English",
pages = "1--6",
booktitle = "2020 IEEE Radar Conference (RadarConf20)",
publisher = "IEEE",
}