Blurring For Clarity: Passive Computation for Defocus-Driven Parsimonious Navigation using a Monocular Event Camera



Hrishikesh Pawar*1, Deepak Singh*1, Nitin J. Sanket1


* Equal Contribution

1 Perception and Autonomous Robotics Group (PeAR) at Worcester Polytechnic Institute


Abstract


Navigation in cluttered, unstructured scenes is crucial for deploying aerial robots in humanitarian applications. To enhance efficiency and extend operational time, we propose a biologically inspired method called passive computation. By exploiting wave physics, we extract depth cues from defocus instead of relying on costly explicit depth computation. We demonstrate this approach by pairing a large-aperture lens, which yields a shallow depth of field, with a monocular event camera, enabling robust and parsimonious navigation through depth ordinality. The key idea is to optically "blur out" regions of disinterest, minimizing computational demands. In simulation experiments, our method achieved a success rate of 70% with over 62x computation savings compared to state-of-the-art techniques. Preliminary results on a real setup also show promise, highlighting the potential of defocus for enhancing event-based navigation for aerial robots.
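No code accompanies this page, so the following is a minimal sketch of the core idea only, not the authors' implementation. With a shallow depth of field, in-focus structure produces sharp edges and therefore dense events, while the optically blurred background triggers few; a smoothed per-pixel event count can thus serve as a proxy sharpness map, giving a relative (ordinal) depth cue. The function name sharpness_map, the sensor resolution, the smoothing window, and the 0.5 threshold below are all illustrative assumptions.

import numpy as np
from scipy.ndimage import uniform_filter

def sharpness_map(events_xy, resolution=(260, 346), window=15):
    """Accumulate events into a per-pixel count image and smooth it.

    In-focus objects produce dense events (sharp edges); defocused
    background produces sparse events, so local event density acts
    as a proxy sharpness map (a relative depth cue).

    events_xy : (N, 2) integer array of event pixel coordinates (x, y).
    resolution: sensor (height, width); values here are illustrative.
    """
    h, w = resolution
    counts = np.zeros((h, w), dtype=np.float32)
    # Scatter-add one count per event at its (row, col) location.
    np.add.at(counts, (events_xy[:, 1], events_xy[:, 0]), 1.0)
    density = uniform_filter(counts, size=window)  # local event density
    if density.max() > 0:
        density /= density.max()  # normalize to [0, 1]
    return density

# Usage sketch: pixels above a threshold are treated as near the focus
# distance (potential obstacles); everything else is "blurred out".
events = np.random.randint(0, 260, size=(5000, 2))  # placeholder events
s_map = sharpness_map(events)
obstacle_mask = s_map > 0.5

Because the depth cue is produced optically by the lens, the only computation left is this cheap event accumulation and smoothing, which is where the parsimony of the approach comes from.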


Figure: Proposed approach in a real-world scene with a hand-held camera setup. Obstacles appear yellow while the background appears blue in the sharpness maps. The green frame (left) shows a tree positioned farther from the focus distance. The red frame (right) captures a tree positioned near the camera's focus distance. For each frame, the corresponding events and sharpness map (depth cues) are shown on either side of the image.


Resources


Paper




Perception and Autonomous Robotics Group
Worcester Polytechnic Institute
Copyright © 2024