MinNav: Minimalist Navigation Using Optical Flow for Active Tiny Aerial Robots



Aniket Patil¹, Mandeep Singh¹, Uday Girish Maradana¹, Nitin J. Sanket¹


¹Perception and Autonomous Robotics Group (PeAR), Worcester Polytechnic Institute


Abstract


Navigation using a monocular camera is pivotal for autonomous operation of tiny aerial robots due to its balance of versatility, cost, and accuracy. In this paper, we introduce MinNav, a navigation stack based on optical flow and its uncertainty for flying through scenes that contain static obstacles, dynamic obstacles, and unknown-shaped gaps without any prior knowledge of the scene components or their locations and ordering. We further improve the success rate by exploiting the robot's active motion to explore the scene, detect obstacles, and navigate through them. We evaluate and demonstrate the proposed approach in extensive real-world experiments across environments containing static obstacles, dynamic obstacles, and unknown-shaped gaps, achieving an overall success rate of 70%. To the best of our knowledge, this is the first solution to tackle all the aforementioned navigation cases without prior knowledge using a monocular camera. Our approach performs on par with methods that rely on metric depth or relative monocular depth while requiring orders of magnitude less computation, and it can readily run onboard tiny aerial robots.
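The full navigation stack is described in the paper; purely as an illustration of the kind of signal it builds on, the sketch below computes dense optical flow between two consecutive frames and a forward-backward consistency error as a simple per-pixel uncertainty proxy. The choice of OpenCV's Farneback flow and the consistency check here are our own assumptions for illustration and are not the specific flow or uncertainty estimator used in MinNav.

```python
# Minimal sketch (illustrative only, not the MinNav implementation):
# dense optical flow between two grayscale frames plus a forward-backward
# consistency check as a crude per-pixel uncertainty proxy.
import cv2
import numpy as np

def flow_with_uncertainty(prev_gray, curr_gray):
    # Forward flow: prev -> curr
    fwd = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                       pyr_scale=0.5, levels=3, winsize=15,
                                       iterations=3, poly_n=5, poly_sigma=1.2,
                                       flags=0)
    # Backward flow: curr -> prev
    bwd = cv2.calcOpticalFlowFarneback(curr_gray, prev_gray, None,
                                       pyr_scale=0.5, levels=3, winsize=15,
                                       iterations=3, poly_n=5, poly_sigma=1.2,
                                       flags=0)

    h, w = prev_gray.shape
    xs, ys = np.meshgrid(np.arange(w), np.arange(h))
    # Sample the backward flow at the locations reached by the forward flow.
    map_x = (xs + fwd[..., 0]).astype(np.float32)
    map_y = (ys + fwd[..., 1]).astype(np.float32)
    bwd_warped = cv2.remap(bwd, map_x, map_y, interpolation=cv2.INTER_LINEAR)

    # Forward-backward error: large values flag unreliable flow (occlusions,
    # textureless regions) and can be treated as per-pixel uncertainty.
    uncertainty = np.linalg.norm(fwd + bwd_warped, axis=-1)
    return fwd, uncertainty
```

In practice, regions with high forward-backward error would be down-weighted or treated conservatively when deciding where the robot can safely fly.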


Figure: MinNav handles navigation in unstructured, in-the-wild scenes including static obstacles, dynamic obstacles, and unknown-shaped gaps without any prior knowledge of their locations or scene ordering. All of this is achieved with a monocular camera and an active strategy, using only onboard computation and sensing.


Resources


Paper · GitHub · Cite



