Weather-Adaptive Multi-View 4D Radar and Camera Fusion for Robust Autonomous Driving Perception

2025-2026/II.
Dr. Chang Liu

Autonomous vehicles require perception systems that are reliable under all environmental conditions. Traditional LiDAR-camera fusion pipelines often degrade severely in adverse weather such as snow, rain, or heavy fog. The emergence of 4D radar sensors, which are robust to weather disturbances and provide velocity-aware 3D point clouds, enables multi-view fusion strategies for challenging conditions.

This lab focuses on enhancing multi-view perception pipelines by integrating 4D radar, aiming to improve robustness in adverse weather. Students will first implement a multi-view fusion baseline using cameras, and then extend it with 4D radar data to develop a weather-adaptive perception model that dynamically adjusts sensor contributions according to environmental conditions.

Objectives

Implement a multi-view camera fusion baseline for 3D object detection under adverse weather using the nuScenes dataset.
Integrate 4D radar into the multi-view fusion pipeline using TJ4DRadSet or K-Radar datasets.
Develop a weather-adaptive perception model that dynamically prioritizes sensors based on environmental conditions (fog, rain, snow).
Benchmark the real-time performance and robustness of the multi-view fusion model across different weather scenarios.
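The weather-adaptive idea in the third objective can be sketched as a simple gated fusion: per-sensor weights are computed from an environmental condition estimate and used to blend camera and radar features. This is a minimal, hypothetical illustration, not the model students will build; in a real pipeline the weights would be predicted by a learned gating network from sensor features, and the function and parameter names (`adaptive_fusion`, `visibility`) are invented for this sketch.

```python
import numpy as np

def adaptive_fusion(cam_feat, radar_feat, visibility):
    """Blend camera and radar features with weather-dependent weights.

    visibility in [0, 1]: 1.0 ~ clear weather, 0.0 ~ dense fog/snow.
    The camera weight grows with visibility; the radar weight grows as
    visibility drops, reflecting radar's robustness to adverse weather.
    """
    logits = np.array([visibility, 1.0 - visibility])
    weights = np.exp(logits) / np.exp(logits).sum()  # softmax over sensors
    fused = weights[0] * cam_feat + weights[1] * radar_feat
    return fused, weights

# Toy features: each sensor contributes a distinct direction.
cam = np.array([1.0, 0.0])
radar = np.array([0.0, 1.0])

_, w_clear = adaptive_fusion(cam, radar, visibility=1.0)  # camera favored
_, w_fog = adaptive_fusion(cam, radar, visibility=0.0)    # radar favored
```

In clear weather the camera branch dominates the fused feature, while in fog the radar branch does; a learned gating network generalizes this scalar `visibility` cue to high-dimensional condition estimates.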

Note: Applicants will use publicly available datasets and will receive technical support from SZTAKI.

Supervisor at the department: Dr. Chang Liu, Assistant Professor

External supervisor: Prof. Tamás Szirányi, HUN-REN SZTAKI.
