Hrishikesh Pawar*1 | Deepak Singh*1 | Nitin J. Sanket1
* Equal Contribution
1 Perception and Autonomous Robotics Group (PeAR), Worcester Polytechnic Institute

Navigation in cluttered, unstructured scenes is crucial for deploying aerial robots in humanitarian applications. To improve efficiency and extend operational time, we propose a biologically inspired approach based on passive computation. By exploiting wave physics, we extract depth cues from defocus instead of relying on costly explicit depth computation. We demonstrate this approach with a large-aperture lens that produces a shallow depth of field on a monocular event camera, enabling robust and parsimonious navigation through depth ordinality. The key idea is to optically "blur out" regions of disinterest, minimizing computational demands. In simulation experiments, our method achieved a 70% success rate with over 62x savings in computation compared to state-of-the-art techniques. Preliminary results on a real-world setup also show promise, highlighting the potential of defocus for event-based navigation of aerial robots.
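As a rough illustration of how a shallow depth of field can act as an optical filter, the minimal sketch below (not the paper's released code; the patch size, threshold, and function name are illustrative assumptions) accumulates events over a short window and keeps only patches with high event density, on the premise that in-focus structure near the focal plane triggers dense events while optically blurred regions trigger few, yielding an ordinal "in-focus vs. blurred-out" mask without explicit depth estimation.

```python
# Minimal sketch: ordinal depth cue from defocus on an event camera.
# Assumption: sharp (in-focus) regions generate dense events; defocused
# regions generate sparse events, so per-patch event counts give a coarse mask.
import numpy as np

def infocus_mask(events_xy, sensor_hw=(260, 346), patch=16, thresh_ratio=0.3):
    """events_xy: (N, 2) array of (x, y) pixel coordinates from a short time window."""
    h, w = sensor_hw
    counts = np.zeros((h // patch, w // patch), dtype=np.float32)
    for x, y in events_xy:
        counts[min(int(y) // patch, h // patch - 1),
               min(int(x) // patch, w // patch - 1)] += 1.0
    if counts.max() == 0:
        return np.zeros_like(counts, dtype=bool)
    # Patches whose event density exceeds a fraction of the peak are treated as in focus.
    return counts >= thresh_ratio * counts.max()

# Example usage with synthetic events clustered in one image region:
events = np.random.randint(0, 80, size=(500, 2))  # dense activity near the top-left
mask = infocus_mask(events)
```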