Development of a Coaxial MAV with Real-Time Obstacle Avoidance Capability
Abstract
This paper discusses the development, implementation, and deployment of a computationally light yet robust real-time collision avoidance system on a hover-capable coaxial rotary-wing Micro Air Vehicle (MAV). The real-time capability of the algorithm is demonstrated by performing all computations required for processing the depth camera image onboard at 13 Hz, using a single core of a single-board computer with a 1.6 GHz quad-core processor. The primary sensor is a Microsoft Kinect, which has an operating range of 0.5–3.5 m. The algorithm uses the template-matching feature of the open-source library OpenCV to find a window through which a vehicle of specified dimensions and known speed can pass safely. Actuator control and active yaw stabilization are performed by NavStik, a micro navigation and control hardware module. Two test vehicles were built and equipped with this framework as a proof of concept.
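The core of the approach described above is to search the depth image for a rectangular region, sized to the vehicle's dimensions, in which every pixel reports sufficient clearance. The paper does this with OpenCV's template matching; the sketch below is a minimal numpy-only equivalent of that idea (a sliding free-space window over the binarized depth image, accelerated with an integral image). The function name, the 3.5 m clearance threshold, and the pixel-sized window are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def find_safe_window(depth_m, win_h, win_w, clearance_m=3.5):
    """Return (row, col) of the first window of size (win_h, win_w) in which
    every depth reading is at least clearance_m, or None if none exists.

    This is a numpy stand-in for matching an all-free 'template' against the
    binarized depth image, as cv2.matchTemplate would do in the real system.
    The 3.5 m default mirrors the Kinect's far operating limit (assumption).
    """
    free = (depth_m >= clearance_m).astype(np.float32)
    H, W = free.shape
    # Integral image: each candidate window is then tested in O(1).
    ii = np.zeros((H + 1, W + 1), dtype=np.float32)
    ii[1:, 1:] = np.cumsum(np.cumsum(free, axis=0), axis=1)
    target = win_h * win_w  # sum if every pixel in the window is free
    for r in range(H - win_h + 1):
        for c in range(W - win_w + 1):
            s = (ii[r + win_h, c + win_w] - ii[r, c + win_w]
                 - ii[r + win_h, c] + ii[r, c])
            if s == target:
                return (r, c)
    return None
```

In the flight system the chosen window's center would be converted to a steering command; here the function simply reports where a vehicle-sized opening exists in the current depth frame.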
