A Conceptual Architecture for Resilient UAV Navigation in GNSS-Degraded Environments

Sangwoo Jeon, Vishnu Kumar Kaliappan

Abstract


Global Navigation Satellite System (GNSS) signals are often unreliable or unavailable in urban canyons, dense forests, or indoor environments, posing significant challenges for UAV navigation. This paper proposes a conceptual architecture for resilient UAV navigation that integrates multi-sensor fusion, adaptive filtering, and environment-aware localization strategies to maintain accurate state estimation under GNSS-degraded conditions. The architecture is modular, combining inertial measurement units (IMUs), vision-based odometry, LiDAR, and barometric altimeters within a hierarchical fusion framework. We introduce the concept of “sensor trust modulation,” in which the relative confidence in each sensor modality is dynamically adjusted based on environmental context and sensor health diagnostics. The architecture also incorporates a fault detection and isolation (FDI) layer to identify and mitigate sensor anomalies in real time. Theoretical foundations are laid for observability analysis in multi-sensor fusion systems, emphasizing the importance of persistent excitation and redundancy. By formalizing the interplay between sensor modalities and environmental constraints, this framework aims to guide the design of robust navigation systems capable of seamless operation across diverse mission profiles. The paper concludes with a discussion of the implications for autonomous mission planning and the potential for integration with emerging technologies such as event-based cameras and deep learning-based localization.
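The idea of sensor trust modulation can be illustrated with a minimal sketch. The code below is a hypothetical simplification, not the paper's implementation: it fuses scalar position estimates with normalized trust weights, and down-weights any sensor whose innovation (measurement residual, in sigma units) exceeds a gate, as an FDI layer might flag a degraded GNSS fix. The function names, thresholds, and the 1-D state are all illustrative assumptions; a real system would fuse full state vectors inside a filter such as an MSCKF or an optimization-based estimator.

```python
import numpy as np

def fuse_position(estimates, trust):
    """Fuse scalar position estimates using normalized trust weights
    (hypothetical stand-in for the hierarchical fusion framework)."""
    w = np.asarray(trust, dtype=float)
    w = w / w.sum()  # normalize so weights sum to one
    return float(np.dot(w, estimates))

def modulate_trust(trust, innovations, gate=3.0):
    """Halve the trust of sensors whose innovation exceeds the gate
    (likely fault or degraded context); slowly restore trust otherwise."""
    updated = []
    for t, nu in zip(trust, innovations):
        if abs(nu) > gate:
            updated.append(max(t * 0.5, 0.01))  # decay, but never to zero
        else:
            updated.append(min(t * 1.1, 1.0))   # recover toward full trust
    return updated

# Example: GNSS shows a large residual (urban canyon multipath),
# while visual-inertial odometry remains consistent.
trust = [1.0, 1.0]  # [gnss, vio]
trust = modulate_trust(trust, innovations=[5.2, 0.4])
pos = fuse_position([12.0, 10.0], trust)  # fused estimate leans toward VIO
```

After one modulation step the GNSS weight drops to half of the VIO weight, so the fused position is pulled toward the healthy modality; repeated gating would continue shrinking a faulty sensor's influence without ever hard-switching it off.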

References


Forster, Christian, Matia Pizzoli, and Davide Scaramuzza. “SVO: Fast Semi-Direct Monocular Visual Odometry”. In 2014 IEEE International Conference on Robotics and Automation (ICRA), 15–22, 2014. https://doi.org/10.1109/ICRA.2014.6906584.

Huang, Wei, and Xiaoxin Su. “Design of a Fault Detection and Isolation System for Intelligent Vehicle Navigation System”. International Journal of Navigation and Observation 2015, no. 1 (2015): 279086. https://doi.org/10.1155/2015/279086.

Hwang, Inseok, Sungwan Kim, Youdan Kim, and Chze Eng Seah. “A Survey of Fault Detection, Isolation, and Reconfiguration Methods”. IEEE Transactions on Control Systems Technology 18, no. 3 (2010): 636–53. https://doi.org/10.1109/TCST.2009.2026285.

Leutenegger, Stefan, Simon Lynen, Michael Bosse, Roland Siegwart, and Paul Furgale. “Keyframe-based Visual-Inertial Odometry using Nonlinear Optimization”. The International Journal of Robotics Research 34, no. 3 (2015): 314–34. https://doi.org/10.1177/0278364914554813.

Lynen, Simon, Markus W. Achtelik, Stephan Weiss, Margarita Chli, and Roland Siegwart. “A Robust and Modular Multi-Sensor Fusion Approach Applied to MAV Navigation”. In 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, 3923–29, 2013. https://doi.org/10.1109/IROS.2013.6696917.

Mourikis, Anastasios I., and Stergios I. Roumeliotis. “A Multi-State Constraint Kalman Filter for Vision-Aided Inertial Navigation”. In Proceedings 2007 IEEE International Conference on Robotics and Automation, 3565–72, 2007. https://doi.org/10.1109/ROBOT.2007.364024.

Scaramuzza, Davide, and Friedrich Fraundorfer. “Visual Odometry [Tutorial]”. IEEE Robotics & Automation Magazine 18, no. 4 (2011): 80–92. https://doi.org/10.1109/MRA.2011.943233.

Weiss, Stephan, Markus W. Achtelik, Simon Lynen, Margarita Chli, and Roland Siegwart. “Real-Time Onboard Visual-Inertial State Estimation and Self-Calibration of MAVs in Unknown Environments”. In 2012 IEEE International Conference on Robotics and Automation, 957–64, 2012. https://doi.org/10.1109/ICRA.2012.6225147.

Yang, Ling, Yong Li, Youlong Wu, and Chris Rizos. “An Enhanced MEMS-INS/GNSS Integrated System with Fault Detection and Exclusion Capability for Land Vehicle Navigation in Urban Areas”. GPS Solutions 18, no. 4 (1 October 2014): 593–603. https://doi.org/10.1007/s10291-013-0357-1.

Zhang, Ji, and Sanjiv Singh. “LOAM: Lidar Odometry and Mapping in Real-Time”. In Robotics: Science and Systems, 2:1–9. Berkeley, CA, 2014. https://doi.org/10.15607/RSS.2014.X.007.



This work is licensed under a Creative Commons Attribution 3.0 License.