1. System overview and software structure

System goal: SUAS Competition

At the time of writing, the official SUAS 2025 rules have not been released yet, so the following section is subject to change. The SUAS competition is designed to foster interest in Unmanned Aerial Systems (UAS), stimulate interest in UAS technologies and careers, and engage students in a challenging mission. A UAS is generally composed of a ground station and an Unmanned Aerial Vehicle (UAV, aka drone). The mission specifics change every year, but in general, the mission tasks require that:

  1. The UAV must be able to autonomously perform a lap in the airfield, staying within the mission boundary, and avoiding obstacles such as trees and the other UAV
    • There may be up to one other UAV in the airfield at the same time
    • The mission boundary is revealed within the rules, published on October 1st
    • The mission lap waypoints are revealed during mission setup time on the day of the competition (around June)
  2. The UAV must perform airdrops on specific targets autonomously
    • Targets are objects such as paper cutouts of different shapes or mannequins, found on the ground within a survey zone.
    • Like the mission boundary, the coordinates of the survey zone are revealed within the rules, published on October 1st.
    • The payload, which is a strobing beacon (exact details of this will be in the rules), must reach the target undamaged (lowered gently).
  3. There are also the following constraints
    • The UAV must perform a lap before performing the airdrop.
    • The UAV must stay within a height range at all times outside of the takeoff area
      • The height range is 50 to 400 feet Above Ground Level (AGL)
    • There are 4 airdrops to perform to get the maximum amount of points.
    • There is a time limit for the mission.
      • Going above that time limit will lead to penalties, but is allowed
      • In 2024's SUAS rules, the time limit for the mission was 30 mins
    • There is a limit to the number of people allowed to interact with the drone during the mission (Operators).
      • Having more operators will lead to penalties, but is allowed
      • In 2024's SUAS rules, the number of operators allowed without penalties was 2

There are other parts to the competition besides the mission, such as the flight review and the mission setup, which would take too long to cover here and are less relevant to the software team.

More information on the competition and the rules can be found on the SUAS website. I strongly recommend that you read through the rules, as they dive into the rest of the details and cover other critical parts that the points above don't mention. The site also keeps a log of past competitions, with YouTube playlists of each year's flight review videos.

Introduction to autonomous UASs

[Diagram: drone system]

A UAS is generally composed of 2 parts: the UAV (aka the drone) and the ground station.

  1. UAV
    • In general:
      • Flight-related sensors, such as GPS, airspeed sensors, etc.
      • Flight-related actuators, such as motors, servos, etc.
      • Other flight-related components such as batteries, Electronic Speed Controllers (ESCs), telemetry radio, RC receiver, etc.
      • Flight Controller, a microcontroller flashed with the autopilot of choice
        • The flight controller is the hardware, the autopilot is the software (also called the firmware)
        • The actuators and sensors mentioned above connect to the flight controller
        • The autopilot on the flight controller interprets sensor data which it uses to monitor the drone
          • It also sends telemetry information to the ground station through the telemetry radio, using the MAVLink protocol (see the pymavlink sketch after this list)
        • The autopilot on the flight controller then controls the actuators to satisfy its current goal (such as taking off, holding a position, going to a position, etc.)
        • We will use a CUAV Pixhawk as our flight controller
        • We will use PX4 as our autopilot
    • For autonomous flight:
      • Autonomous flight is achieved by having a computer onboard the UAV, called the offboard computer (since it is offboard from the flight controller's perspective) or a companion computer, which sends position commands to the flight controller.
      • Thus, the UAV can perform waypoint-to-waypoint navigation without the need for human input.
      • The offboard computer is connected to the flight controller to establish a communication link that allows it to monitor the state of the drone and to give the flight controller commands.
        • This is done through the TELEM2 port of the flight controller and the TX/RX pins of the companion computer for the PX4 autopilot. Here is a tutorial. (A sketch of commanding the flight controller over such a link follows this list.)
      • The companion computer is where our code will be running
      • By having the companion computer and the ground computer on the same Local Area Network (LAN), we can SSH into the companion computer to start our code, see debug messages, or do whatever else we need from the terminal
    • For our specific needs:
      • We need to do things such as avoiding obstacles, detecting targets and performing airdrops
      • This will require additional hardware connected to the companion computer, since these tasks do not relate directly to flying to a given waypoint.
      • This additional hardware may be cameras, lidar sensors, or an extra microcontroller whose task is to give the companion computer a higher-level interface for the airdrop mechanism.
  2. Ground Station
    • In general:
      • QGroundControl is a third-party program that shows standard UAV data in graphical form, things like the drone's position, the mission boundary, etc.
        • A ground telemetry module, connected to QGroundControl, establishes a communication link to the UAV's flight controller.
        • Pre-flight, it helps set up various flight controller settings, such as max pitch rate, maximum horizontal velocity, etc.
        • Pre-flight, it facilitates uploading a mission (waypoints, with a mission boundary) to the flight controller.
        • It shows the UAV's position, velocity, and other relevant information.
        • It sends commands to the UAV.
    • Information specific to our project, such as the current target or the current state of the airdrop controller (DROPPING or IDLE), will require custom software to display.
    • Ideally we would merge both general and use case-specific information into one GUI, but such a task is not necessarily required and may be more trouble than it is worth
      • If this is of interest to you, please let me know (Imad, the software lead)
    • We may also want to SSH into the offboard computer, to run whichever script starts the program and to get more control over it. This is possible as long as the ground computer is on the same LAN as the offboard computer.
    • A safety pilot is also an essential part of a ground station: a person capable of taking over control of the UAV and flying it safely. This is done through an RC transmitter.
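
To make the telemetry link above concrete, here is a minimal sketch of reading MAVLink telemetry with pymavlink. The serial device, baud rate, and message choice are assumptions for illustration, not a description of our actual wiring.

```python
# A minimal sketch of reading MAVLink telemetry with pymavlink.
# The serial device and baud rate are assumptions; adjust for your setup.
from pymavlink import mavutil

# Connect to the flight controller (e.g. a telemetry radio on /dev/ttyUSB0).
master = mavutil.mavlink_connection("/dev/ttyUSB0", baud=57600)

# Wait until we receive a heartbeat so we know the link is alive.
master.wait_heartbeat()
print(f"Heartbeat from system {master.target_system}, component {master.target_component}")

# Read a few position updates (lat/lon arrive as degrees * 1e7, altitude in mm).
for _ in range(10):
    msg = master.recv_match(type="GLOBAL_POSITION_INT", blocking=True)
    print(f"lat={msg.lat / 1e7:.7f} lon={msg.lon / 1e7:.7f} "
          f"alt={msg.relative_alt / 1000:.1f} m AGL")
```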
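
Similarly, here is a hedged sketch of the companion computer side: streaming position setpoints to the flight controller with MAVSDK-Python over a serial link. This is one common way to command PX4; our 2024 code used a MicroXRCE agent and ROS 2 topics instead (described below). The serial address, baud rate, and setpoint values are placeholders.

```python
# A minimal sketch of sending position setpoints from a companion computer
# with MAVSDK-Python. The serial address and setpoint values are assumptions.
import asyncio
from mavsdk import System
from mavsdk.offboard import OffboardError, PositionNedYaw

async def run():
    drone = System()
    # TELEM2 is typically wired to the companion computer's serial port.
    await drone.connect(system_address="serial:///dev/ttyTHS1:921600")

    # Wait for the autopilot to report a live connection.
    async for state in drone.core.connection_state():
        if state.is_connected:
            break

    await drone.action.arm()

    # PX4 requires setpoints to be streamed before offboard mode can start.
    await drone.offboard.set_position_ned(PositionNedYaw(0.0, 0.0, -5.0, 0.0))
    try:
        await drone.offboard.start()
    except OffboardError as error:
        print(f"Offboard start failed: {error._result.result}")
        await drone.action.disarm()
        return

    # Hold 5 m above the arming point (NED: down is positive, so -5 m).
    await asyncio.sleep(10)
    await drone.offboard.stop()
    await drone.action.land()

asyncio.run(run())
```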

Drone 2024's software structure

Start of the year

In 2024, the drone team wanted to make a VTOL UAV, a drone capable of both fixed-wing and quadcopter flight. This was scrapped much later in the year in favor of a quadcopter-only UAV, so there may be discrepancies between the following concept model and the code you will find in the drone 2024 repo. We also had to abandon obstacle detection, as lidar is not the best sensor for that task on a UAV.

[Diagram: concept model]

The fancy diagram above showcases the various parts of the program needed to satisfy our specific use case, shown as rectangles; the way they connect/communicate with each other, shown as arrows/edges; and the external actors/facilitators, shown as the little stick people.

End of the year

In the end, however, it looked more like the following diagram:

[Diagram: ROS 2 nodes]

This diagram represents the various ROS 2 nodes at play and their topics/services. All the nodes run on the companion computer; the little stick people represent code/entities that are either not software or do not run on that computer. If you have never used ROS before, you may have trouble understanding the diagram; see the tutorial section on ROS 2 before trying to make sense of it. The circles represent nodes, and the boxes around them are the packages that contain them in the drone 2024 repository. The parallelograms are topics/services: an incoming arrow means publishing to the topic, and an outgoing arrow means subscribing to it. Non-ROS nodes that may subscribe/publish to a topic, and other hardware, are represented as little stick people. The type used by each topic can be found in px4_msgs if it relates to PX4's MicroXRCE topics, in custom_msgs if it is use case specific, or among the message types that come by default with ROS. Here is a brief description of the nodes:

  1. offboard_node implements an action queue, where an action can be takeoff, land, or waypoint. It executes actions in FIFO order (since it is a queue), and actions can be enqueued/dequeued through 2 services. It also collects relevant information about the drone and packages it into one custom_msgs type that it publishes on a topic. This effectively separates the autopilot from the rest of the codebase: the offboard_node's topics form an interface that the rest of our nodes can use without worrying about the actual implementation. The offboard_node interacts with the flight controller's autopilot thanks to a MicroXRCE agent, an application that exposes PX4's uORBs (the internal "topics" that PX4 uses) as ROS-compatible topics. Monitoring the drone and giving it commands is then as simple as subscribing/publishing under the right PX4 topic. Sadly, this only works for a quadcopter UAV, as VTOL UAVs require higher-level functionality, such as transitioning from quadcopter to fixed-wing flight, that is only exposed through MAVLink. (A minimal action-queue sketch follows this list.)
  2. local_planner is a VFH+ planner that takes in a goal waypoint through a topic. It then tries to reach the goal without crossing a boundary set through the boundary_setter topic. Originally, this node was also meant to avoid obstacles, but that had to be abandoned due to a lack of hardware and time. (A toy illustration of the VFH idea follows this list.)
  3. mission is a finite-state-machine-style node that needs to know the various mission details required to perform the mission. Those details are set through the various topics it subscribes to, which a user needs to publish to. It keeps track of the current phase of the mission, such as performing an airdrop, returning home, or flying to a waypoint, and publishes to the relevant topics for each state. It waits for the user to publish on the /start topic to leave the idle state and begin the mission. (A condensed sketch follows this list.)
  4. dummy_cv is the computer vision node, connected to a camera pointed at the ground. It waits for the mission node to publish the current target, which only happens in the survey zone state of the mission. Once it receives a target, it opens the camera and searches each captured frame for the target, looking for the right shape, shape color, letter, and letter color, or for the mannequin. The shape, letter, and mannequin are found with 2 YOLOv8 models trained on that task; the color detection is done with various OpenCV utilities in the HSV color space. Once the target is found in an image, the node calculates the target's position on the ground and publishes it back to the mission node, then clears its current target value, effectively going back to idle. (A small HSV color-check sketch follows this list.)
  5. The airdrop controller, while not shown in the diagram, is a micro-ROS node (not a regular ROS node) running on an Arduino Due. It subscribes to one topic that tells it to arm all the hubs of the airdrop mechanism and another that commands it to drop the bottle on a specific hub.
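
Here is a minimal rclpy sketch in the spirit of offboard_node's action queue. All topic/service names and message types are illustrative assumptions (the real node uses 2 services and custom_msgs types); a plain topic stands in for the enqueue service to keep the sketch short.

```python
# A minimal rclpy sketch of an action-queue node. Names and types are
# assumptions for illustration, not the actual drone 2024 interfaces.
from collections import deque

import rclpy
from rclpy.node import Node
from std_msgs.msg import String
from std_srvs.srv import Trigger


class OffboardQueue(Node):
    def __init__(self):
        super().__init__("offboard_queue")
        self.queue = deque()
        # Enqueue an action ("takeoff", "land", or "waypoint x y z").
        self.create_subscription(String, "enqueue_action", self.enqueue, 10)
        # Drop the action at the front of the queue.
        self.create_service(Trigger, "dequeue_action", self.dequeue)
        # Periodically work on the head of the queue (FIFO order).
        self.create_timer(0.1, self.step)

    def enqueue(self, msg):
        self.queue.append(msg.data)

    def dequeue(self, request, response):
        response.success = bool(self.queue)
        if self.queue:
            self.queue.popleft()
        return response

    def step(self):
        if not self.queue:
            return
        # The real node would publish the matching PX4 setpoint here and
        # pop the action once the autopilot reports it is complete.
        self.get_logger().info(f"executing: {self.queue[0]}")


def main():
    rclpy.init()
    rclpy.spin(OffboardQueue())


if __name__ == "__main__":
    main()
```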
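
local_planner's approach can be illustrated with a toy polar-histogram example. This shows only the basic VFH idea (a real VFH+ planner also accounts for vehicle width and masks unreachable directions), and every parameter below is made up for the demo.

```python
# A toy illustration of the VFH idea: build a polar histogram of obstacle
# density, mask blocked sectors, and steer toward the free sector closest
# to the goal bearing. All thresholds are illustrative.
import numpy as np

N_SECTORS = 72   # 5-degree sectors
THRESHOLD = 1.0  # obstacle density above this blocks a sector

def pick_heading(obstacles, goal_bearing):
    """obstacles: array of (bearing_rad, distance_m); goal_bearing in rad."""
    hist = np.zeros(N_SECTORS)
    for bearing, dist in obstacles:
        sector = int((bearing % (2 * np.pi)) / (2 * np.pi) * N_SECTORS)
        hist[sector] += 1.0 / max(dist, 0.1)  # closer obstacles weigh more

    free = np.flatnonzero(hist < THRESHOLD)
    if free.size == 0:
        return None  # fully blocked; a real planner would stop or replan

    centers = (free + 0.5) * 2 * np.pi / N_SECTORS
    # Pick the free sector whose center is angularly closest to the goal.
    diff = np.angle(np.exp(1j * (centers - goal_bearing)))
    return centers[int(np.argmin(np.abs(diff)))]

# Example: goal dead ahead, small obstacle cluster slightly to the left.
obs = np.array([(0.10, 2.0), (0.12, 1.5), (0.08, 2.5)])
print(pick_heading(obs, goal_bearing=0.0))
```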
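
A condensed sketch of the mission node's finite-state-machine structure follows. The phase names and the /start topic type are illustrative; the real node tracks more phases and mission data.

```python
# A condensed finite-state-machine mission node. States and topic types
# are assumptions for illustration.
from enum import Enum, auto

import rclpy
from rclpy.node import Node
from std_msgs.msg import Empty


class Phase(Enum):
    IDLE = auto()
    LAP = auto()
    SURVEY = auto()
    AIRDROP = auto()
    RETURN_HOME = auto()


class Mission(Node):
    def __init__(self):
        super().__init__("mission")
        self.phase = Phase.IDLE
        # Stay in IDLE until a user publishes to /start.
        self.create_subscription(Empty, "/start", self.on_start, 10)
        self.create_timer(0.5, self.step)

    def on_start(self, _msg):
        if self.phase is Phase.IDLE:
            self.phase = Phase.LAP

    def step(self):
        # Each phase would publish goals/targets on its own topics and
        # advance when feedback (e.g. "lap complete") arrives.
        self.get_logger().info(f"phase: {self.phase.name}")


def main():
    rclpy.init()
    rclpy.spin(Mission())


if __name__ == "__main__":
    main()
```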
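
Finally, a small OpenCV sketch of the kind of HSV color check dummy_cv performs on a candidate region: threshold the region in HSV space and test whether it is mostly the expected color. The HSV bounds, box, and threshold are illustrative values, not our tuned parameters.

```python
# Threshold a region of a frame in HSV space and check whether enough of
# its pixels fall inside the given color range. Values are illustrative.
import cv2
import numpy as np

def region_is_color(frame_bgr, box, lower_hsv, upper_hsv, min_fraction=0.3):
    """box = (x, y, w, h), e.g. from a YOLOv8 detection."""
    x, y, w, h = box
    roi = frame_bgr[y:y + h, x:x + w]
    hsv = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower_hsv, upper_hsv)
    return cv2.countNonZero(mask) / mask.size >= min_fraction

frame = cv2.imread("frame.jpg")  # stand-in for a captured camera frame
red_lo = np.array([0, 120, 70])  # rough lower "red" bound (hue is 0-179)
red_hi = np.array([10, 255, 255])
print(region_is_color(frame, (100, 100, 50, 50), red_lo, red_hi))
```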

This year's software structure

To come :D