This work applies neural networks to drone control and evaluates their performance in such systems. Two applications are implemented: first, a line-following application, where the drone must follow a line through various circuits with the least possible error; and second, a gate-crossing application with mixed control, where the drone is teleoperated but, at the pilot's request, the neural network takes control of the drone and attempts to cross the gates that appear in its field of vision.
Drone control is usually implemented with classical programming algorithms, relying on cameras and various sensors that require intensive processing. As a result, the drone either needs a processing unit as payload to handle all the data, or the data must be processed at the ground station, which adds communication delay and potential errors that could compromise a system that must operate in real time.
This project has two packages: drone_platforms, which contains all the drone platforms, and drone_behaviors, which contains the programmed behaviors. The drone can perform two applications: line following and gate traversing.
We follow the standard repository layout for machine learning projects, together with the basic ROS 2 structure.
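For orientation, a layout along the following lines can be assumed; the directory names here are illustrative rather than copied verbatim from the repository:

```
.
├── drone_platforms/     # Aerostack2 platform launchers (simulation and real drones)
│   └── launch/
├── drone_behaviors/     # Line-following and gate-traversing behaviors
│   ├── launch/
│   └── models/          # Trained PilotNet / DeepPilot weights (hosted on Hugging Face)
└── dataset/             # Training data (hosted on Hugging Face)
```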
The models and the dataset occupy significant space, so we decided to use Hugging Face to host this data: one repository was created for the dataset and another for the models.
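As a sketch of how this data can be retrieved, the `huggingface_hub` library provides `snapshot_download`; note that the repository IDs below are placeholders, not the project's actual repository names:

```python
# Fetch the models and the dataset from Hugging Face.
# The repo_id values are placeholders for the project's actual repositories.
from huggingface_hub import snapshot_download

models_dir = snapshot_download(repo_id="USER/drone-models")
dataset_dir = snapshot_download(repo_id="USER/drone-dataset", repo_type="dataset")
print("Models at:", models_dir)
print("Dataset at:", dataset_dir)
```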
We used the Aerostack2 platforms so that all the programmed behaviors can be used on multiple drones. The platform launchers are in the drone_platforms package. This package was created to decouple the platform from the behaviors, allowing the developed software to be used on real drones.
- For launching the follow-line world:
# Launch default circuit
ros2 launch drone_sim_driver as2_sim_circuit.launch.py
# Launching with arguments
ros2 launch drone_sim_driver as2_sim_circuit.launch.py world:=/PATH_TO_WORLD/NAME.world yaw:=3.14
- For launching the gates world:
# Execute if you want to change to a random scenario
python3 generateGateWorld.py
ros2 launch drone_sim_driver as2_sim_gates.launch.py
Since this expert pilot will hand control over to the autopilot on demand, the following control configuration was chosen to teleoperate the drone:
#! First launch the platform
## out_dir = Path where all the execution data will be stored.
## net_dir = PilotNet model path
## deep_dir = DeepPilot model path
ros2 launch drone_behaviors remoteControl.launch.py out_dir:=PATH net_dir:=PILOT_NET_MODEL deep_dir:=DEEP_PILOT_MODEL
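The mixed-control idea described above can be pictured as a small multiplexer node: the pilot's velocity commands pass through by default, and the network's commands take over while a joystick button is held. This is only a minimal sketch; the topic names, message types, and button index are assumptions, not the actual interfaces of `drone_behaviors`:

```python
#!/usr/bin/env python3
# Sketch of a mixed-control multiplexer. Topic names ('joy', 'pilot/cmd_vel',
# 'network/cmd_vel', 'cmd_vel') and the button index are assumptions.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
from sensor_msgs.msg import Joy


class MixedControlMux(Node):
    def __init__(self):
        super().__init__('mixed_control_mux')
        self.network_active = False
        self.create_subscription(Joy, 'joy', self.joy_cb, 10)
        self.create_subscription(Twist, 'pilot/cmd_vel', self.pilot_cb, 10)
        self.create_subscription(Twist, 'network/cmd_vel', self.network_cb, 10)
        self.cmd_pub = self.create_publisher(Twist, 'cmd_vel', 10)

    def joy_cb(self, msg):
        # Button 0 (assumed) hands control to the neural network while pressed.
        self.network_active = bool(msg.buttons[0])

    def pilot_cb(self, msg):
        # Forward the human pilot's commands unless the network is in control.
        if not self.network_active:
            self.cmd_pub.publish(msg)

    def network_cb(self, msg):
        # Forward the network's commands only while the pilot requests it.
        if self.network_active:
            self.cmd_pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(MixedControlMux())


if __name__ == '__main__':
    main()
```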
Click the next link for the guide to the follow-line application.
To validate the imitation learning in this exercise, we collected a dataset with an algorithmic expert pilot. Click the next image to watch the demo video:
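As an illustration of how such an expert-pilot dataset can be recorded, the sketch below saves each camera frame together with the most recent velocity command. The topic names and on-disk layout are assumptions, not the project's actual recorder:

```python
#!/usr/bin/env python3
# Sketch of a dataset recorder: pairs each camera image with the latest
# expert command. Topics 'cmd_vel' and 'camera/image_raw' are assumptions.
import csv
import os

import cv2
import rclpy
from cv_bridge import CvBridge
from geometry_msgs.msg import Twist
from rclpy.node import Node
from sensor_msgs.msg import Image


class DatasetRecorder(Node):
    def __init__(self, out_dir='dataset'):
        super().__init__('dataset_recorder')
        os.makedirs(out_dir, exist_ok=True)
        self.out_dir = out_dir
        self.bridge = CvBridge()
        self.last_cmd = Twist()
        self.count = 0
        self.labels = open(f'{out_dir}/labels.csv', 'w', newline='')
        self.writer = csv.writer(self.labels)
        self.writer.writerow(['image', 'linear_x', 'angular_z'])
        self.create_subscription(Twist, 'cmd_vel', self.cmd_cb, 10)
        self.create_subscription(Image, 'camera/image_raw', self.img_cb, 10)

    def cmd_cb(self, msg):
        # Remember the most recent expert command.
        self.last_cmd = msg

    def img_cb(self, msg):
        # Save the frame and label it with the current command.
        frame = self.bridge.imgmsg_to_cv2(msg, 'bgr8')
        name = f'{self.count:06d}.png'
        cv2.imwrite(f'{self.out_dir}/{name}', frame)
        self.writer.writerow([name, self.last_cmd.linear.x, self.last_cmd.angular.z])
        self.count += 1


def main():
    rclpy.init()
    rclpy.spin(DatasetRecorder())


if __name__ == '__main__':
    main()
```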
In this application we collected two datasets, one of them more generic, for the PilotNet training. The results of this application were also successful:
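For reference, the sketch below shows the standard NVIDIA PilotNet architecture in PyTorch; the exact variant used in this project (input resolution, number of outputs) may differ:

```python
# Standard NVIDIA PilotNet in PyTorch: five conv layers followed by four
# fully connected layers. The two outputs here (e.g. linear and angular
# velocity) are an assumption about this project's regression targets.
import torch
import torch.nn as nn


class PilotNet(nn.Module):
    def __init__(self, num_outputs=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 24, 5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, 5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, 5, stride=2), nn.ReLU(),
            nn.Conv2d(48, 64, 3), nn.ReLU(),
            nn.Conv2d(64, 64, 3), nn.ReLU(),
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 1 * 18, 100), nn.ReLU(),
            nn.Linear(100, 50), nn.ReLU(),
            nn.Linear(50, 10), nn.ReLU(),
            nn.Linear(10, num_outputs),
        )

    def forward(self, x):
        # x: (N, 3, 66, 200) normalized image batch.
        return self.regressor(self.features(x))


# Quick shape check with the canonical 66x200 input.
print(PilotNet()(torch.zeros(1, 3, 66, 200)).shape)  # torch.Size([1, 2])
```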
In this video, you can see the drone maintaining a constant altitude:
In this video, you can see the drone adjusting its altitude: