Teaching Drones to Find Safe Routes After Natural Disasters
As part of my Master's in Robotics Engineering at Worcester Polytechnic Institute, I led our team's capstone project: an autonomous system that assists first responders after natural disasters by detecting road obstructions.
The system trained UAVs as agents that learned from camera, depth, and IMU data to explore road networks through curriculum-based reinforcement learning.
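The core idea of curriculum-based RL is to start the agent on easy scenarios and advance it to harder ones only once it performs reliably. As a minimal sketch of that progression logic (the class, stage names, and threshold values here are illustrative assumptions, not our actual training code):

```python
from collections import deque


class Curriculum:
    """Toy curriculum scheduler: advance to a harder stage once the
    rolling success rate over a window of episodes clears a threshold.
    Stage names and thresholds below are hypothetical examples."""

    def __init__(self, stages, window=50, threshold=0.8):
        self.stages = stages              # ordered easy -> hard
        self.idx = 0                      # current stage index
        self.window = window
        self.threshold = threshold
        self.results = deque(maxlen=window)

    @property
    def stage(self):
        return self.stages[self.idx]

    def record(self, success):
        """Log one episode outcome; promote the agent when it is ready."""
        self.results.append(1.0 if success else 0.0)
        window_full = len(self.results) == self.window
        rate = sum(self.results) / self.window if window_full else 0.0
        if window_full and rate >= self.threshold and self.idx < len(self.stages) - 1:
            self.idx += 1
            self.results.clear()          # re-measure on the new stage


curriculum = Curriculum(["straight_road", "turns", "debris_field"], window=5, threshold=0.8)
for _ in range(5):
    curriculum.record(True)               # five straight successes
print(curriculum.stage)                   # agent has been promoted to "turns"
```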
Simulation and Learning Architecture
We used AirSim to simulate quadcopter physics and sensor data. For high-level exploration, I wrote a Double DQN from scratch using a MobileNet backbone with pretrained weights. The input channels were expanded from 3 (RGB) to 7 to provide additional state information. A segmentation model identified environmental features including roads, power lines, and hazardous debris.
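Double DQN's key difference from vanilla DQN is in how bootstrap targets are formed: the online network selects the next action, while the target network evaluates it, which reduces the overestimation bias of a single maximizing network. A minimal NumPy sketch of that target computation (function and argument names are my own for illustration; our network itself was the MobileNet-backed model described above):

```python
import numpy as np


def double_dqn_targets(rewards, dones, next_q_online, next_q_target, gamma=0.99):
    """Compute Double DQN bootstrap targets for a batch of transitions.

    rewards:       (B,) immediate rewards
    dones:         (B,) 1.0 where the episode terminated, else 0.0
    next_q_online: (B, A) Q-values of next states from the ONLINE network
    next_q_target: (B, A) Q-values of next states from the TARGET network
    """
    # Online network picks the greedy next action...
    best_actions = np.argmax(next_q_online, axis=1)
    # ...but the target network supplies that action's value estimate.
    evaluated = next_q_target[np.arange(len(best_actions)), best_actions]
    # No bootstrap term for terminal transitions.
    return rewards + gamma * (1.0 - dones) * evaluated


targets = double_dqn_targets(
    rewards=np.array([1.0, 0.0]),
    dones=np.array([0.0, 1.0]),
    next_q_online=np.array([[1.0, 2.0], [3.0, 0.0]]),
    next_q_target=np.array([[0.5, 0.7], [0.9, 0.1]]),
    gamma=0.9,
)
print(targets)  # [1.63 0.  ]
```

The second transition is terminal, so its target is just the reward; the first bootstraps with the target network's value (0.7) of the action the online network preferred (index 1).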
You can see the drone mapping roads in the demo below:
Route Planning
Once debris was identified, the drone relayed the obstruction's GPS coordinates to first responders, and RRT* replanned the route around the newly mapped obstacles.
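RRT* grows a tree of collision-free waypoints toward the goal, choosing the cheapest nearby parent for each new node and rewiring neighbors through it when that shortens their path. A simplified 2D sketch with circular obstacles (our system planned over the segmented road map; the geometry and parameters here are toy assumptions, and descendant costs are not re-propagated after rewiring in this minimal version):

```python
import math
import random


def rrt_star(start, goal, obstacles, bounds, iters=2000, step=1.0,
             radius=2.5, goal_tol=1.0, seed=0):
    """Minimal 2D RRT*. obstacles: list of (cx, cy, r) circles.
    bounds: (xmin, xmax, ymin, ymax). Returns a waypoint list or None."""
    rng = random.Random(seed)
    nodes, parent, cost = [start], {0: None}, {0: 0.0}

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def collides(p):
        return any(dist(p, (cx, cy)) <= r for cx, cy, r in obstacles)

    def segment_free(a, b, n=10):
        # Sample points along the segment for a simple collision check.
        return all(not collides((a[0] + (b[0] - a[0]) * t / n,
                                 a[1] + (b[1] - a[1]) * t / n))
                   for t in range(n + 1))

    best_goal = None
    for _ in range(iters):
        sample = (rng.uniform(bounds[0], bounds[1]), rng.uniform(bounds[2], bounds[3]))
        nearest = min(range(len(nodes)), key=lambda i: dist(nodes[i], sample))
        d = dist(nodes[nearest], sample)
        if d == 0:
            continue
        # Steer from the nearest node toward the sample, at most `step` far.
        s = min(step, d) / d
        new = (nodes[nearest][0] + (sample[0] - nodes[nearest][0]) * s,
               nodes[nearest][1] + (sample[1] - nodes[nearest][1]) * s)
        if collides(new) or not segment_free(nodes[nearest], new):
            continue
        # Choose parent: lowest-cost collision-free neighbor within `radius`.
        near = [i for i in range(len(nodes))
                if dist(nodes[i], new) <= radius and segment_free(nodes[i], new)]
        best = min(near or [nearest], key=lambda i: cost[i] + dist(nodes[i], new))
        idx = len(nodes)
        nodes.append(new)
        parent[idx] = best
        cost[idx] = cost[best] + dist(nodes[best], new)
        # Rewire: reroute neighbors through the new node when cheaper.
        for i in near:
            c = cost[idx] + dist(nodes[i], new)
            if c < cost[i]:
                parent[i], cost[i] = idx, c
        if dist(new, goal) <= goal_tol:
            if best_goal is None or cost[idx] < cost[best_goal]:
                best_goal = idx

    if best_goal is None:
        return None
    path, i = [], best_goal
    while i is not None:
        path.append(nodes[i])
        i = parent[i]
    return path[::-1]
```

Replanning after new debris amounts to adding the reported obstacle circles and calling the planner again from the drone's current position.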
Acknowledgments
This work was advised by Professor John Nafziger and Collins Aerospace's Applied Research and Technology center in Cork, Ireland.
Prior Work
This project built on my earlier coursework in multi-agent RL for non-convex obstacle navigation in collective transport. That work used emergent local communication to coordinate swarms of minimalistic robots moving objects around obstacles. The curriculum-based RL techniques I developed there directly contributed to the success of this capstone.