In this project, we design a perception stack for the DJI Tello EDU quadcopter that enables it to autonomously navigate through irregular, unknown-shaped windows. The primary goal is to identify and fly through the largest gap in a wall. The quadcopter first maneuvers to a position from which the full gap is visible, then computes the optical flow across the window. This flow is post-processed to outline the largest gap's contour and pinpoint its center. With the center identified, visual servoing guides the quadcopter to align its image center with the gap's center, enabling a successful flight through the gap.
(See the full problem statement here: Project 4.)
- Install the NumPy, OpenCV, djitellopy, PyTorch, CUDA Toolkit, and Matplotlib libraries before running the code.
- Install all the library dependencies mentioned here.
- Turn the drone on and connect to it.
- To run the main code, run the `Wrapper.py` file after installing all dependencies. This saves the final output in the `Code` folder itself.
- In the `Code` folder:
python3 Wrapper.py --model=RAFT/models/raft-sintel.pth
- In our testing, the weights trained on the Sintel dataset gave the best results. To try other weights, change the weight file accordingly.
For a detailed description, see the report here.
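The visual-servoing step (aligning the image center with the detected gap center) reduces to a proportional controller on the pixel error. This is a minimal sketch; the gain, deadband, and function name are assumptions, not the values used in `Wrapper.py`:

```python
def servo_command(gap_center, frame_size, kp=0.25, deadband=20):
    """Proportional visual-servoing command (illustrative gains).

    Maps the pixel error between the image center and the gap center
    to (left/right, up/down) velocities in the -100..100 range that
    djitellopy's send_rc_control expects. Returns None when aligned.
    """
    cx, cy = gap_center
    w, h = frame_size
    ex, ey = cx - w // 2, cy - h // 2   # pixel error from image center
    if abs(ex) < deadband and abs(ey) < deadband:
        return None                      # aligned: ready to fly through
    clamp = lambda v: max(-100, min(100, int(v)))
    # Image y grows downward, so a positive ey means the gap is below.
    return clamp(kp * ex), clamp(-kp * ey)

print(servo_command((520, 300), (960, 720)))  # → (10, 15)
```

The returned pair would be passed to `Tello.send_rc_control` in a loop until the function returns `None`, at which point the drone flies forward through the gap.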
Testing the flow detection in blender simulation:
Case 1:
Case 2:
Gaussian splat of the real window in the lab:
Live demo runs:
Watch a high-quality video of demo run 1 on the real Tello drone here (link1 and link2).
Watch a high-quality video of demo run 2 on the real Tello drone here (link1 and link2).
Chaitanya Sriram Gaddipati - [email protected]
Shiva Surya Lolla - [email protected]
Ankit Talele - [email protected]