The Tank


[Logo]

ECE/MAE148 Final Project

Team 3 Fall 2024


Table of Contents
  1. Team Members
  2. Final Project
  3. Robot Design
  4. Acknowledgments
  5. Authors
  6. Contact

Team Members

Agasthya, Harshit, Minjun, Purab

Team Member Major and Class

  • Agasthya - Mechanical Engineering, Ctrls & Robotics (MC34) - Class of 2025
  • Harshit - Mechanical Engineering, Ctrls & Robotics (MC34) - Class of 2025
  • Minjun - Electrical Engineering (EC27) - Class of 2026
  • Purab - Computer Engineering (EC26) - Class of 2026

Final Project

Our project goal was to develop a car that follows a set path based on GPS coordinates while searching for certain objects and shooting them with a laser. We aimed to create a project that could be useful in the defense industry, where autonomous tanks and other vehicles could save the lives of military personnel. Our project is developed as ROS2 packages that work with the UCSD Robocar framework to control the servos in charge of aiming and firing the laser, and to drive the car along its set path using GNSS.

Original Goals

  • Automated Path Setup
    • Develop a package dedicated to extracting the car's path and converting it to .csv datasets for route mapping and navigation (see the first sketch after this list).
      • Upon launch, the car will follow a path based on data obtained from the GNSS, adhering to its route.
      • Adjust the PID variables to correct overcorrection and undercorrection when the car deviates from its path.
      • Implement real-time monitoring and adjustments to ensure the car stays on course.
  • Color Recognition
    • Utilize the camera to obtain a real-time view of the environment and track colors (see the second sketch after this list).
      • Determine the coordinates from the center when a color is detected.
      • Once color detection is functional, work on detecting specific objects.
      • Integrate object detection algorithms to enhance the accuracy and reliability of the system.
  • CAD Design
    • Design a servo mount capable of rotating along the X and Y axes.
      • Create a mount attached to the servos to enable pointing at different locations.
      • Ensure the design is robust and can withstand the operational stresses of the system.
      • Optimize the mount for ease of assembly and maintenance.
  • Servo Motor Control
    • Employ the adafruit_servokit library to move the servos and aim the laser along the X and Y axes (the third sketch after this list covers this together with laser control).
      • Use coordinates received from the color or object detection model to move the servo and point at the specific object.
      • Implement precise control algorithms to ensure accurate targeting.
      • Test and calibrate the servo movements to achieve optimal performance.
  • Laser Control
    • Activate the laser when the object is detected and the laser is aimed at the object.
      • Ensure safety protocols are in place to prevent accidental activation.
      • Integrate feedback mechanisms to confirm successful targeting and firing.
      • Conduct thorough testing to validate the system's reliability and effectiveness.
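
Below is a minimal sketch of the path-setup idea: load recorded GNSS fixes from a .csv file and run a textbook PID loop on the steering error. It assumes a plain two-column latitude/longitude file; the names are illustrative, not our actual package code.

```python
import csv


def load_waypoints(path):
    """Read (lat, lon) pairs from a two-column CSV of recorded GNSS fixes."""
    with open(path) as f:
        return [(float(lat), float(lon)) for lat, lon in csv.reader(f)]


class SteeringPID:
    """Classic PID on a steering error signal; the gains must be tuned on
    the car to damp the over/undercorrection described above."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```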
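
For the color-recognition goal, a sketch of the OpenCV approach: threshold the frame in HSV, take the largest blob, and report its pixel offset from the image center. The HSV bounds are placeholders and would need calibrating to the real target.

```python
import cv2
import numpy as np

# Placeholder HSV bounds for a red-ish target -- calibrate for the real object.
LOWER = np.array([0, 120, 120])
UPPER = np.array([10, 255, 255])


def detect_target(frame):
    """Return the (dx, dy) offset of the largest matching blob from the
    image center, or None if nothing matches."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    img_h, img_w = frame.shape[:2]
    return (x + w // 2) - img_w // 2, (y + h // 2) - img_h // 2
```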
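
Finally, a sketch covering the servo and laser goals together: drive pan/tilt servos with adafruit_servokit and gate the laser through a GPIO pin. The PWM channels, laser pin, and pixels-to-degrees scale below are assumptions for illustration, not our wiring.

```python
from adafruit_servokit import ServoKit
import Jetson.GPIO as GPIO

PAN_CH, TILT_CH = 0, 1   # PCA9685 channels -- assumed; match your wiring
LASER_PIN = 12           # board pin switching the laser -- assumed
DEG_PER_PX = 0.05        # pixel-to-degree scale, found by calibration

kit = ServoKit(channels=16)          # 16-channel PWM servo board
kit.servo[PAN_CH].angle = 90         # start centered
kit.servo[TILT_CH].angle = 90
GPIO.setmode(GPIO.BOARD)
GPIO.setup(LASER_PIN, GPIO.OUT, initial=GPIO.LOW)


def aim_and_fire(dx, dy, tolerance_px=10):
    """Nudge pan/tilt toward the detection offset; fire only when centered."""
    pan = kit.servo[PAN_CH].angle - dx * DEG_PER_PX
    tilt = kit.servo[TILT_CH].angle - dy * DEG_PER_PX
    kit.servo[PAN_CH].angle = min(180, max(0, pan))
    kit.servo[TILT_CH].angle = min(180, max(0, tilt))
    on_target = abs(dx) < tolerance_px and abs(dy) < tolerance_px
    GPIO.output(LASER_PIN, GPIO.HIGH if on_target else GPIO.LOW)
```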

Goals We Met

See the README in our src directory for a breakdown of how our packages run together.

Our Hopes and Dreams

Stretch Goal 1

  • Complete package integration with ROS
    • We would like ROS integration because it provides a modular and scalable framework for developing robotic applications. ROS allows seamless communication between different components, enabling more robust control of the car. Integrating ROS would also make it easier to implement additional features, debug issues, and stay compatible with other robotics projects in the future. A sketch of what one such node could look like follows.
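
As a taste of what that integration could look like, here is a minimal rclpy node that would publish the detection offset as a geometry_msgs/Point; the topic name and message type are our assumptions, not part of the existing packages.

```python
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Point


class TargetPublisher(Node):
    """Publishes (dx, dy) detection offsets for a servo-control node to consume."""

    def __init__(self):
        super().__init__('target_publisher')
        self.pub = self.create_publisher(Point, '/target_offset', 10)

    def publish_offset(self, dx, dy):
        msg = Point()
        msg.x, msg.y = float(dx), float(dy)
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(TargetPublisher())


if __name__ == '__main__':
    main()
```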

Stretch Goal 2

  • LiDAR
    • Since our car is only driving using GPS, there are no object avoidance capabilities. LiDAR will allow us to detect obstacles on the path and navigate around them, returning to the GPS-defined route. Incorporating LiDAR enhances the system's situational awareness, making it more adaptable to dynamic environments. This development is inspired by existing projects like autonomous vehicles, which combine GNSS with LiDAR to achieve precision and safety.

Final Project Documentation

Robot Design

CAD Parts

Custom Designed Parts

  • Camera Mount Top
  • Servo Mount
  • Acrylic Base
  • Wire Enclosure Top
  • Wire Enclosure Bottom

Electronic Hardware

Below is a circuit diagram of the electronic hardware setup for the car.

[Circuit diagram]

Software

Embedded Systems

To program the Jetson Nano, we connected over SSH to the embedded Linux system onboard and ran a Docker container with all the dependencies needed by our packages. This eliminated incompatibility issues and maximized resource efficiency on the Jetson. Even so, our dependencies made it difficult to import all of our libraries together, such as cv2 and the Adafruit servo library; we resolved this by creating a separate Python environment and rolling library versions back until everything worked together.

ROS2

The base image we pulled from Docker Hub for our project development included the UCSD Robocar module, which runs on a Linux OS (Ubuntu 20.04). This module, comprising several submodules utilizing ROS/ROS2, was initially developed by Dominic Nightingale, a graduate student at UC San Diego. His framework was designed to work with a wide array of sensors and actuation methods on scale autonomous vehicles, allowing for easy control of a car-like robot while enabling it to perform autonomous tasks simultaneously.

DonkeyCar AI

For our early quarter course deliverables, we used DonkeyCar to train a car to drive autonomous GPS laps around a track in a simulated environment. We utilized GNSS to record the coordinates on the track and then trained the car with this data to race on a remote server. This process was instrumental in enabling our car to run autonomously on set paths. By simulating the environment and using the recorded data, we were able to refine our car's navigation capabilities and ensure it could follow predetermined routes accurately.

For path following, we used the DonkeyCar AI framework and tuned our own PID values. With the GPS connected, DonkeyCar performed PID-based following of the recorded waypoints, which gave us precise control and kept the car on its designated path.
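
The error those PID values act on is the car's signed cross-track distance from the current waypoint segment. A sketch of that computation under a flat-earth approximation (fine over a track-sized area; the helper is ours, not DonkeyCar's):

```python
import math


def cross_track_error(pos, wp_a, wp_b):
    """Signed perpendicular distance in meters from pos to the segment
    wp_a -> wp_b. All points are (lat, lon) tuples."""
    m_per_deg_lat = 111_320.0                            # approximate
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(pos[0]))
    ax = (wp_b[1] - wp_a[1]) * m_per_deg_lon             # segment vector (m)
    ay = (wp_b[0] - wp_a[0]) * m_per_deg_lat
    px = (pos[1] - wp_a[1]) * m_per_deg_lon              # car relative to wp_a
    py = (pos[0] - wp_a[0]) * m_per_deg_lat
    # z-component of the 2D cross product over the segment length gives the
    # signed distance: positive on one side of the path, negative on the other.
    return (ax * py - ay * px) / math.hypot(ax, ay)
```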

License: MIT

Acknowledgments

Thank you to my teammates, Professor Jack Silberman, and our incredible TAs Winston and Alexander for an amazing Fall 2024 class! Thank you Kiersten and Alexander for the amazing README template.

Authors

Contact
