Multi-Sensor Fusion Strategies for All-Weather Autonomous Navigation: A Systematic Review

Ravi Samikannu, Jueying Li

Abstract


Autonomous navigation in all-weather conditions requires robust sensor fusion strategies that maintain performance despite degraded visibility, precipitation, temperature extremes, and sensor-specific failure modes that vary with environmental conditions. This systematic review examines multi-sensor fusion architectures that enable reliable navigation across diverse weather scenarios, including fog, rain, snow, dust storms, and extreme temperatures. We analyze the complementary characteristics of the primary navigation sensors: cameras, which are vulnerable to low visibility but provide rich semantic information; LiDAR, which is affected by precipitation but offers precise ranging; radar, which penetrates adverse weather but has lower resolution; thermal cameras, which operate in darkness but are sensitive to temperature gradients; and inertial measurement units, which provide weather-independent dead reckoning but accumulate drift. The review systematically categorizes fusion strategies into centralized architectures that process raw sensor data jointly, decentralized approaches with independent sensor processing and decision-level fusion, and hierarchical frameworks combining both paradigms. We examine algorithmic approaches spanning Kalman filtering variants (EKF, UKF, IEKF) for tightly coupled integration, particle filters for non-Gaussian distributions, factor graph optimization for batch processing with loop closures, and deep learning-based fusion networks that learn optimal sensor weighting. Particular emphasis is placed on adaptive fusion strategies that dynamically adjust sensor contributions based on detected environmental conditions and individual sensor reliability metrics. The review analyzes weather-specific degradation models for each sensor modality: raindrop effects on camera images, LiDAR point cloud sparsification in fog, radar clutter from precipitation, and thermal imaging challenges in high-humidity conditions. We evaluate sensor selection and placement strategies that optimize coverage and redundancy while managing cost, weight, and power constraints. The paper examines fault detection and isolation mechanisms that identify sensor malfunctions or environmental degradation, enabling graceful degradation rather than catastrophic failure. Advanced techniques are reviewed, including evidential reasoning for uncertainty quantification, Dempster-Shafer theory for handling conflicting sensor information, and meta-learning approaches that rapidly adapt fusion parameters to novel weather conditions. Application-specific considerations are discussed across domains: autonomous ground vehicles navigating in rain and snow, UAVs operating in variable visibility, and marine vessels handling sea spray and fog. The review synthesizes validation methodologies, including controlled-environment testing, synthetic weather injection in simulation, and real-world dataset analysis across seasonal variations. We identify critical research gaps, including the scarcity of multi-modal datasets with ground truth in adverse weather, limited theoretical frameworks for fusion performance guarantees under degraded conditions, and insufficient attention to computational efficiency for embedded deployment. The paper concludes with architectural recommendations and a roadmap for developing truly all-weather autonomous navigation systems.
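
To make the adaptive-weighting idea concrete, the sketch below shows one of the simplest forms such a strategy can take: scalar inverse-variance fusion in which each sensor's nominal measurement noise is inflated by a detected weather-severity score, so that degraded modalities automatically contribute less to the fused estimate. This is a minimal illustration written for this summary, not an implementation from any of the surveyed works; the sensor names, noise figures, and inflation factors are assumed values chosen only to expose the mechanism.

import numpy as np

# Nominal (clear-weather) measurement noise standard deviation per range sensor, in metres.
# These figures are illustrative assumptions, not values reported in the reviewed literature.
NOMINAL_SIGMA = {"camera": 0.8, "lidar": 0.05, "radar": 0.3}

# Illustrative inflation factors: how strongly each sensor's noise grows with the
# detected weather-severity score (0 = clear, 1 = severe fog/precipitation).
WEATHER_INFLATION = {"camera": 8.0, "lidar": 10.0, "radar": 0.5}

def adaptive_fuse(measurements, severity):
    """Fuse scalar range measurements by inverse-variance weighting, with each
    sensor's variance inflated by the current weather-severity score
    (a stand-in for a per-sensor reliability metric)."""
    weights, values = [], []
    for sensor, z in measurements.items():
        sigma = NOMINAL_SIGMA[sensor] * (1.0 + WEATHER_INFLATION[sensor] * severity)
        weights.append(1.0 / sigma ** 2)  # information weight (inverse variance)
        values.append(z)
    weights = np.asarray(weights)
    fused = np.dot(weights, values) / weights.sum()
    fused_sigma = 1.0 / np.sqrt(weights.sum())
    return fused, fused_sigma

if __name__ == "__main__":
    ranges = {"camera": 25.9, "lidar": 25.1, "radar": 25.4}  # range to an obstacle, metres
    for severity in (0.0, 0.8):  # clear weather vs. dense fog
        est, sigma = adaptive_fuse(ranges, severity)
        print(f"severity={severity:.1f}: fused range {est:.2f} m (+/- {sigma:.2f} m)")

In a deployed system, the severity score would come from per-sensor reliability monitors or fault detection and isolation modules, and the reweighted covariances would feed a full state estimator (EKF/UKF, particle filter, or factor graph) rather than a scalar average; the sketch isolates only the reweighting step that the adaptive strategies surveyed here automate.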

Keywords


sensor fusion, all-weather navigation, autonomous vehicles, robust perception.

