
Fully autonomous AI-powered drone


This repository aims to create a state-of-the-art, fully autonomous navigation and obstacle avoidance system for multi-rotor vehicles. Our approach is based on the novel idea of a fully end-to-end AI model that takes the sensor inputs and directly outputs the desired control commands for the drone. We are currently working on the code needed to train and run this approach.
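As a purely illustrative sketch of what such an end-to-end pilot could look like (this is not the model used in this repo, which is not published; the input shapes, layers and sizes below are assumptions for explanation only):

# Illustrative sketch only: a toy end-to-end pilot that maps sensor inputs straight
# to normalized stick commands. Shapes, layers and sizes are assumptions, not the real model.
from tensorflow.keras import layers, Model

image_in = layers.Input(shape=(120, 160, 3), name="camera")  # RGB frame
lidar_in = layers.Input(shape=(1,), name="lidar_distance")   # distance in metres
gps_in = layers.Input(shape=(4,), name="gps_target")         # e.g. offset/bearing to the waypoint

x = layers.Conv2D(16, 5, strides=2, activation="relu")(image_in)
x = layers.Conv2D(32, 5, strides=2, activation="relu")(x)
x = layers.GlobalAveragePooling2D()(x)
x = layers.Concatenate()([x, lidar_in, gps_in])
x = layers.Dense(64, activation="relu")(x)

# Outputs mimic joystick commands: roll, pitch, yaw, throttle in [-1, 1]
stick_commands = layers.Dense(4, activation="tanh", name="stick_commands")(x)

pilot = Model([image_in, lidar_in, gps_in], stick_commands)
pilot.compile(optimizer="adam", loss="mse")  # trained against recorded human pilot commands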

This project also contains a fully autonomous system for tracking a moving target using a camera and a LiDAR. The system uses an AI-based object detection model to detect the target. It has already been fully tested in real flights and is fully functional!

Autonomous Point To Point Flight Demo


This video shows the performance of our end-to-end AI pilot compared to a human pilot on a real flight. The blue dots represent the joystick commands from the human pilot and the red dots represent the joystick commands from our AI pilot.

Just take a look at how the AI pilot avoids trees just like the human pilot!

All tooling for gathering data, training, and validating AI-based pilots for Pixhawk-based multi-rotors is available in this repo.

The autonomous point-to-point navigation system based on end-to-end technology is now the property of MRR Drones (multirotorresearch.com). As MRR we are still dedicated to releasing awesome open-source software, but we also want to protect our software from being resold by others. That is why we will be working on an open-source version of our AI-based pilot that anyone can use for free. All tooling and methods for creating these models will stay private, but we will publish fully trained models for free!

Autonomous Person Following Flight Demo


This video shows the performance of our person-following system. An AI model detects the person, and PID controllers then generate the commands to follow them.

Hardware

Our quadcopter is fitted with a Jetson Nano that runs advanced AI models and algorithms in real time on the edge, on the quadcopter itself. This allows the quadcopter to operate fully autonomously without needing a data connection to offload heavy workloads. Thanks to our optimizations, the algorithms run within the limited computational budget of the Jetson Nano.

(Drone images)

Setup

To set up and build the project on an NVIDIA Jetson Nano, please follow the instructions in BUILD.md. These instructions should guide you to a fully operational Jetson Nano, including all the software required to start flying autonomously.

Scripts

Follow person

The follow-person script can autonomously follow a person using a live RGB camera feed and a MobileNet AI model. The model detects the person, and the center point of the detection is used to calculate the yaw commands for the drone. Roll commands are calculated using a TF-Luna solid-state LiDAR, which measures the distance between the person and the drone. The commands for both axes are generated by PID controllers.
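As a rough, simplified sketch of how these two PID loops could be wired together (not the actual code in follow_person_main.py; the gains, frame width and standoff distance below are made-up values):

# Simplified sketch of the follow-person control described above. The gains,
# frame width and standoff distance are illustrative assumptions, not values
# taken from follow_person_main.py.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

yaw_pid = PID(kp=0.01, ki=0.0, kd=0.002)   # keeps the detection horizontally centered
roll_pid = PID(kp=0.5, ki=0.0, kd=0.1)     # keeps the LiDAR distance at the setpoint

FRAME_WIDTH = 640       # pixels, depends on the camera
TARGET_DISTANCE = 8.0   # metres, made-up standoff distance

def compute_commands(bbox_center_x, lidar_distance_m, dt):
    # Yaw error: horizontal offset of the detected person from the image center.
    yaw_cmd = yaw_pid.update(bbox_center_x - FRAME_WIDTH / 2, dt)
    # Roll error: difference between the measured LiDAR distance and the setpoint.
    roll_cmd = roll_pid.update(lidar_distance_m - TARGET_DISTANCE, dt)
    return yaw_cmd, roll_cmd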

Usage

Ensure ArduCopter is in Guided mode, then execute the following command:

sudo python3 follow_person_main.py --mode=flight --debug_path=debug/flight1

For testing with the ArduPilot SITL simulator, set the mode to test by changing the --mode parameter from flight to test.
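For example, the same command as above with only the mode changed:

sudo python3 follow_person_main.py --mode=test --debug_path=debug/flight1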

Press Q to exit the script; the drone will then land automatically. Note: this currently does not work in real flight! Instead, switch ArduPilot to Loiter mode and land the drone manually.

Data plotter

A script was also made to plot the debug data from the PID controllers.

Usage

Execute the following command to plot the debug data from the PID controllers:

python3 data_plotter.py

Ensure the file name in the script points to the correct debug file; this can be done by editing the data_plotter.py script.
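For example, near the top of data_plotter.py you would set the path to the debug file you want to plot (the variable name below is only a placeholder; check the actual name used in the script):

file_name = "debug/flight1"  # placeholder name: point this at the debug file you want to plot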

Keras to TensorRT model

For faster execution, convert the Keras model to TensorRT (the models in this repo have already been converted!). Use the tf2onnx tool for this:

python -m tf2onnx.convert --saved-model tensorflow-model-path --output model.onnx
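The resulting ONNX model can then be built into a TensorRT engine directly on the Jetson Nano, for example with the trtexec tool that ships with TensorRT (paths and file names may differ on your setup):

/usr/src/tensorrt/bin/trtexec --onnx=model.onnx --saveEngine=model.trt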

Note

This project is still under heavy development. All code is experimental, so please use it with caution! We are not responsible for damage to people or property resulting from the use of these scripts.
