Event-Based Vision for High-Speed Autonomous Navigation: Algorithms, Hardware, and Applications

S. Gnanamurthy, Ravi Samikannu

Abstract


Event-based vision sensors, inspired by biological retinas, asynchronously detect per-pixel brightness changes with microsecond temporal resolution and high dynamic range (HDR), offering transformative potential for high-speed autonomous navigation, where conventional frame-based cameras suffer from motion blur and limited frame rates. This paper comprehensively examines event-based vision algorithms, hardware platforms, and applications for unmanned systems operating in dynamic, high-speed scenarios. We systematically analyze the fundamental principles of event cameras, including asynchronous pixel-level operation, sparse output that encodes only brightness changes, and a logarithmic intensity response that enables operation across extreme lighting conditions. The review categorizes algorithmic approaches for processing event streams: frame-based methods that accumulate events into image-like representations for compatibility with conventional computer vision, event-based methods that process asynchronous data directly through spiking neural networks or time-surface representations, and hybrid approaches that combine both paradigms. We examine key perception tasks, including feature tracking that exploits high temporal resolution for robust correspondence, optical flow estimation from event timing information, object recognition through event-based convolutional networks, and simultaneous localization and mapping (SLAM) leveraging continuous motion information. Particular emphasis is placed on algorithms that exploit the distinctive advantages of event cameras: high-speed motion estimation for fast-moving platforms, HDR scene understanding in challenging lighting, and low-latency processing for reactive control. The paper analyzes hardware platforms spanning commercial event cameras (DVS, DAVIS, Prophesee), neuromorphic processors for efficient event processing (Loihi, TrueNorth), and hybrid systems combining event and frame-based sensors. Application-specific implementations are examined for high-speed UAV racing requiring rapid obstacle avoidance, space robotics with extreme lighting variations, and automotive scenarios with HDR requirements. We critically evaluate challenges including event noise in low-contrast scenes, calibration complexity, and the limited availability of labeled training data. The review identifies research gaps in end-to-end learning frameworks and in standardized benchmarks for event-based navigation.
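
To make the sensing model and the two processing paradigms above concrete, the minimal Python sketch below simulates the idealized event-generation rule (an event with polarity p fires at a pixel once the log-intensity change since the pixel's last event exceeds a contrast threshold C) and builds both representations: an accumulated event frame and an exponentially decayed time surface. All names and constants here (C, TAU, generate_events, and so on) are illustrative assumptions for this sketch, not part of any sensor SDK or of the surveyed methods.

```python
import numpy as np

# Illustrative constants (assumptions, not sensor defaults):
C = 0.2        # contrast threshold: event fires when |log I - log I_ref| >= C
TAU = 50e-3    # decay constant in seconds for the exponential time surface

def generate_events(log_intensity, timestamps):
    """Emit (t, x, y, polarity) events from an idealized, noise-free pixel model.

    log_intensity: (T, H, W) array of log-brightness samples
    timestamps:    (T,) array of sample times in seconds
    """
    T, H, W = log_intensity.shape
    ref = log_intensity[0].copy()           # per-pixel reference level
    events = []
    for k in range(1, T):
        diff = log_intensity[k] - ref
        ys, xs = np.nonzero(np.abs(diff) >= C)
        for x, y in zip(xs, ys):
            pol = 1 if diff[y, x] > 0 else -1
            events.append((timestamps[k], x, y, pol))
            ref[y, x] += pol * C            # step the reference by one threshold
    return events

def accumulate_frame(events, shape, t0, t1):
    """Frame-based representation: signed event counts over a time window."""
    frame = np.zeros(shape, dtype=np.int32)
    for t, x, y, pol in events:
        if t0 <= t < t1:
            frame[y, x] += pol
    return frame

def time_surface(events, shape, t_query):
    """Event-based representation: exponentially decayed map of last-event times."""
    last = np.full(shape, -np.inf)          # -inf marks pixels with no event yet
    for t, x, y, _ in events:
        if t <= t_query:
            last[y, x] = max(last[y, x], t)
    return np.exp((last - t_query) / TAU)   # in (0, 1]; exactly 0 where no event

# Usage: a bright vertical bar sweeping across a synthetic 32x32 scene.
ts = np.linspace(0.0, 0.1, 20)
scene = np.zeros((20, 32, 32))
for k in range(20):
    scene[k, :, int(k * 1.5) % 32] = 1.0    # one bright column per sample
ev = generate_events(np.log1p(scene), ts)
frame = accumulate_frame(ev, (32, 32), 0.0, 0.1)
surface = time_surface(ev, (32, 32), 0.1)
```

The accumulated frame trades temporal resolution for compatibility with conventional convolutional networks, while the time surface keeps relative event timing in a dense map; the time-surface trackers and HATS-style classifiers cited in the references build on the latter idea.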

Keywords


event-based vision, neuromorphic sensors, high-speed navigation, dynamic vision sensors.

References


Alzugaray, Ignacio, and Margarita Chli. “Asynchronous Corner Detection and Tracking for Event Cameras in Real Time”. IEEE Robotics and Automation Letters 3, no. 4 (2018): 3177–84. https://doi.org/10.1109/LRA.2018.2849882.

Brändli, Christian, Jonas Strubel, Susanne Keller, Davide Scaramuzza, and Tobi Delbruck. “ELiSeD — An Event-Based Line Segment Detector”. In 2016 Second International Conference on Event-Based Control, Communication, and Signal Processing (EBCCSP), 1–7, 2016. https://doi.org/10.1109/EBCCSP.2016.7605244.

Gehrig, Daniel, Antonio Loquercio, Konstantinos G. Derpanis, and Davide Scaramuzza. “End-to-End Learning of Representations for Asynchronous Event-Based Data”. In Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2019. Available at https://openaccess.thecvf.com/content_ICCV_2019/html/Gehrig_End-to-End_Learning_of_Representations_for_Asynchronous_Event-Based_Data_ICCV_2019_paper.html.

Kueng, Beat, Elias Mueggler, Guillermo Gallego, and Davide Scaramuzza. “Low-Latency Visual Odometry Using Event-Based Feature Tracks”. In 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 16–23, 2016. https://doi.org/10.1109/IROS.2016.7758089.

Mitrokhin, Anton, Chengxi Ye, Cornelia Fermüller, Yiannis Aloimonos, and Tobi Delbruck. “EV-IMO: Motion Segmentation Dataset and Learning Pipeline for Event Cameras”. In 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 6105–12, 2019. https://doi.org/10.1109/IROS40897.2019.8968520.

Mueggler, Elias, Guillermo Gallego, Henri Rebecq, and Davide Scaramuzza. “Continuous-Time Visual-Inertial Odometry for Event Cameras”. IEEE Transactions on Robotics 34, no. 6 (2018): 1425–40. https://doi.org/10.1109/TRO.2018.2858287.

Mueggler, Elias, Henri Rebecq, Guillermo Gallego, Tobi Delbruck, and Davide Scaramuzza. “The Event-Camera Dataset and Simulator: Event-Based Data for Pose Estimation, Visual Odometry, and SLAM”. The International Journal of Robotics Research 36, no. 2 (2017): 142–49. https://doi.org/10.1177/0278364917691115.

Mueggler, Elias, Guillermo Gallego, and Davide Scaramuzza. “Continuous-Time Trajectory Estimation for Event-Based Vision Sensors”. In Robotics: Science and Systems XI (RSS), Rome, 2015. https://doi.org/10.5167/uzh-125444.

Sironi, Amos, Manuele Brambilla, Nicolas Bourdis, Xavier Lagorce, and Ryad Benosman. “HATS: Histograms of Averaged Time Surfaces for Robust Event-Based Object Classification”. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2018. Available at https://openaccess.thecvf.com/content_cvpr_2018/html/Sironi_HATS_Histograms_of_CVPR_2018_paper.html.

Tedaldi, David, Guillermo Gallego, Elias Mueggler, and Davide Scaramuzza. “Feature Detection and Tracking with the Dynamic and Active-Pixel Vision Sensor (DAVIS)”. In 2016 Second International Conference on Event-Based Control, Communication, and Signal Processing (EBCCSP), 1–7, 2016. https://doi.org/10.1109/EBCCSP.2016.7605086.

Vidal, Antoni Rosinol, Henri Rebecq, Timo Horstschaefer, and Davide Scaramuzza. “Ultimate SLAM? Combining Events, Images, and IMU for Robust Visual SLAM in HDR and High-Speed Scenarios”. IEEE Robotics and Automation Letters 3, no. 2 (2018): 994–1001. https://doi.org/10.1109/LRA.2018.2793357.

Zhu, Alex, Liangzhe Yuan, Kenneth Chaney, and Kostas Daniilidis. “EV-FlowNet: Self-Supervised Optical Flow Estimation for Event-Based Cameras”. In Robotics: Science and Systems XIV (RSS), 2018. https://doi.org/10.15607/rss.2018.xiv.062.


This work is licensed under a Creative Commons Attribution 3.0 License.