| Deepak Singh*1 | Shreyas Khobragade*1 | Nitin J. Sanket1 |
* Equal Contribution
1 Perception and Autonomous Robotics Group (PeAR), Worcester Polytechnic Institute

Autonomous aerial navigation in absolute darkness is crucial for post-disaster search and rescue operations, where power outages often plunge disaster zones into complete darkness. Yet, due to resource constraints, tiny aerial robots, which are otherwise well suited to these operations, cannot navigate safely in the dark to find survivors. In this paper, we present an autonomous aerial robot that navigates in the dark by combining an Infra-Red (IR) monocular camera with a large-aperture coded lens and structured light, without external infrastructure such as GPS or motion capture. Our approach obtains depth-dependent defocus cues (each structured-light point appears as a pattern whose shape depends on depth), which act as a strong prior for our AsterNet deep depth estimation model. The model is trained in simulation on data generated with a simple optical model and transfers directly to the real world without any fine-tuning or retraining. AsterNet runs onboard the robot at 20 Hz on an NVIDIA Jetson Orin™ Nano. Furthermore, the network is robust to changes in the structured-light pattern and to the relative placement of the pattern emitter and IR camera, enabling simple and cost-effective construction. We evaluate and demonstrate our proposed navigation approach, AsterNav, which uses depth from AsterNet, in extensive real-world experiments with only onboard sensing and computation, including dark matte obstacles and thin ropes (diameter 6.25 mm), achieving an overall success rate of 95.5% with unknown object shapes, locations, and materials. To the best of our knowledge, this is the first work on monocular, structured-light-based quadrotor navigation in absolute darkness.
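The depth-dependent defocus cue described above can be illustrated with a minimal thin-lens sketch: a structured-light point imaged away from the focus plane spreads into a blur whose diameter grows with its distance from that plane. The snippet below is only an illustration under a simple thin-lens circle-of-confusion model; the function name and all optical parameters (focal length, aperture diameter, pixel pitch, focus distance) are placeholder assumptions, not the hardware or optical model used in the paper.

```python
import numpy as np

def blur_diameter_px(depth_m, focus_dist_m=1.0, focal_len_m=0.016,
                     aperture_diam_m=0.012, pixel_pitch_m=3.45e-6):
    """Thin-lens circle-of-confusion diameter (in pixels) for a point at depth_m.
    All default parameters are illustrative placeholders, not the paper's setup."""
    # Image distance of the in-focus plane (thin-lens equation: 1/f = 1/u + 1/v).
    v_focus = 1.0 / (1.0 / focal_len_m - 1.0 / focus_dist_m)
    # Image distance of a point at the queried depth.
    v_point = 1.0 / (1.0 / focal_len_m - 1.0 / depth_m)
    # Geometric blur circle on the sensor (similar triangles), converted to pixels.
    coc_m = aperture_diam_m * abs(v_point - v_focus) / v_point
    return coc_m / pixel_pitch_m

# A point on the focus plane is sharp; points nearer or farther blur progressively.
for d in [0.5, 1.0, 2.0, 4.0]:
    print(f"depth {d:3.1f} m -> blur ≈ {blur_diameter_px(d):6.1f} px")
```

With a coded aperture, this blur is not a plain disc but a depth-dependent copy of the aperture pattern, which is what gives the depth network its strong prior.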

| IEEE RA-L Paper |
@ARTICLE{11346995,
author={Singh, Deepak and Khobragade, Shreyas and Sanket, Nitin J.},
journal={IEEE Robotics and Automation Letters},
title={AsterNav: Autonomous Aerial Robot Navigation In Darkness Using Passive Computation},
year={2026},
volume={},
number={},
pages={1-8},
keywords={Apertures;Navigation;Cameras;Autonomous aerial vehicles;Sensors;Robot sensing systems;Lighting;Robot vision systems;Robots;Quadrotors;Aerial Systems: Perception and Autonomy;Vision-Based Navigation;Deep Learning for Visual Perception;Coded Aperture;Darkness;Low-light;Quadrotors;Deep Learning;},
doi={10.1109/LRA.2026.3653388}}
