Sprint #1 -

Our first step in this project was to collaboratively ideate based on all of our learning goals. We came up with many promising and ambitious ideas, including an autonomous hot-air balloon and an automatic drumming robot. Eventually, we settled on an autonomous waiter robot that could navigate a human space to deliver food. We wanted to use the ROS2 architecture to control it, and we wanted its end-effector to be modularly swappable.

After deciding to build a waiter robot, we defined our high-level goals for the project. We outlined our mechanical systems, including the drivetrain and interchangeable module system, as well as our software systems, including the ROS2 topics, nodes, and modules that sense the world through sensors and interact with the world through motors. Then we got to work!

Electrical/Firmware Progress:

On the electrical side, we did hand calculations to estimate our drivetrain motor requirements and, using these estimates, selected suitable motors. From the motors' electrical specifications, we selected a battery that could power them, with enough margin for an additional module motor. With these critical components selected, we ordered motor drivers, as well as a buck converter to power our Raspberry Pi and Arduinos from the battery. Once the components arrived, we assembled a barebones power distribution network for testing.

Software Progress:

While waiting for these components to arrive, we examined the possibility of using microROS with Raspberry Pi Picos, but determined that the learning curve would be too steep for our ambitious project goals, so we pivoted to Arduino Unos as our microcontrollers. We also performed all of the computational setup for our Raspberry Pi, including downloading ROS2 packages, setting up a workspace, and assigning it a static IP address on the OLIN-ROBOTICS network. This included extensive setup for getting the Raspberry Pi to communicate with our Picos over microROS, although that ultimately wasn't part of our final system.

We established our codebase on GitHub and created a node for teleoperating the robot, which would later help us test the integrity of the robot's drivetrain hardware. We created a motion execution node that performed basic proportional correction in the linear and angular dimensions to move from the robot's current location to an arbitrary pose elsewhere. Finally, we began testing an open-source AprilTag recognition package developed for ROS2 systems like ours.
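
As a sketch of what that proportional correction computes (the gain symbols $k_{\rho}$ and $k_{\alpha}$ are our notation for illustration, not names from our code):

$$
\rho = \sqrt{\Delta x^{2} + \Delta y^{2}}, \qquad
\alpha = \operatorname{atan2}(\Delta y,\, \Delta x) - \theta
$$
$$
v = k_{\rho}\,\rho, \qquad \omega = k_{\alpha}\,\alpha
$$

where $(\Delta x, \Delta y)$ is the offset from the robot to the goal, $\theta$ is the robot's current heading, and $v$ and $\omega$ are the commanded linear and angular velocities.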

Mechanical Progress:

We created the preliminary CAD for our robot frame, chassis, and drivetrain. We settled on two driven wheels mounted along the center axis of the robot, so that it could turn in place with only two driven wheels. We also sourced caster wheels to keep the robot balanced. After finalizing our CAD, we began sourcing the other materials we would need.

Demo and Debrief:

At the end of Sprint 1, we presented our progress to the rest of the class. We demonstrated our complete chassis, our power distribution system, and our AprilTag recognition through ROS. After our sprint review, we met as a team to reflect on our progress and team structure, and set goals for the upcoming sprint.

Our goals for Sprint 2 were to finish constructing the drivetrain, design the tray module and modular mechanical output system, start writing libraries for various robot sensors like the buttons and encoders, and develop and implement the architecture for our robot's localization and navigation system.

Sprint #2 -
(11/06/24 - 11/19/24)

During Sprint 2, we continued to develop our robot's mechanical, electrical, and software systems.

Electrical/Firmware Progress:

To test our drivetrain and power distribution system, we wrote motor firmware that receives motor commands from the Raspberry Pi and sends PWM signals to the motor drivers. To let users interact with Walter, we added buttons and wrote firmware to support them. We also started writing firmware to calculate linear and angular velocity from the two motor encoders, providing real-time feedback from the motors to the software system. For both the buttons and the encoders, we wrote custom Arduino libraries optimized for use on Walter.

Software Progress:

On the software side, we started to model our system in terms of "state estimation" and "path planning". We settled on a path-planning system that would localize the robot on a reference map and then navigate to predefined locations in the world. We developed a map maker node that used the ROS2 TransformManager tool to define static relationships between key locations in the world, such as known AprilTag landmarks and table destinations. We also developed a serial adapter node that parsed serial communication between the microcontrollers and the ROS2 network, in place of microROS. Finally, we began testing navigation software in simulation using the ROS2 command-line tools.

Progress on state estimation included extensive testing of the AprilTag package alongside a new tool, the v4l2 camera package, to publish images from the Raspberry Pi camera. While visual pose estimation was in development, we built a dead-reckoning-based pose estimation system that allowed us to test our navigation software. This development, alongside our path-planning progress, allowed us to start testing the drivetrain hardware integrated with autonomous navigation right at the end of Sprint 2. This prepared the software team for a Sprint 3 that was almost entirely real-world testing and debugging.

Mechanical Progress:

On the mechanical side, we completed our robot's drivetrain and frame and began work on the tray module. Using ½" plywood and some basic woodshop tools, we cut out four panels (three walls and a top) and mounted them to our 1010 aluminum chassis using T-nuts.

Once we had a top surface to attach something to, we created a draft of the tray module. We designed a four-bar linkage consisting of laser-cut acrylic links mounted with 3D-printed brackets. The links connected to an aluminum shaft held in place by bearings in 3D-printed mounts, allowing it to rotate freely. We mounted the tray, a laser-cut ⅛" plywood sheet, to the top of the four-bar.

Demo and Debrief:

At the end of Sprint 2, we presented our sprint review to the rest of the class, complete with a live demo of the robot being teleoperated to a sitting person and the tray module being manually extended to deliver a bowl of fruit. Following our sprint review, we again met as a team to reflect on our team structure and organization and to set goals and deadlines for the upcoming final sprint.

Sprint #3 -
(11/20/24 - 12/10/24)

In our final sprint, the robot took shape as we completed and integrated all of our various systems.

Electrical/Firmware Progress:

On the electrical side, we selected a motor and motor driver for the module. Using the geometry of the four-bar mechanism, we did hand calculations to estimate the torque required of the motor, and based on the results we chose a stepper motor for its precise positional control and holding torque. We also added an inertial measurement unit (IMU) and motor encoders to the sensor Arduino to allow the software to make a more accurate pose estimate.
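
For a sense of the sizing math: in the worst case, the crank statically holds the fully loaded tray at maximum reach, so the required torque is roughly the supported weight times its moment arm. The mass and arm length below are placeholder values for illustration, not our measured numbers:

$$
\tau \approx m\,g\,r = (1\,\mathrm{kg})\,(9.81\,\mathrm{m/s^2})\,(0.1\,\mathrm{m}) \approx 1\,\mathrm{N\,m}
$$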

With the tray module assembled, we wanted to detect the presence of a plate on the tray, so we integrated strain gauges into the tray design. Each strain gauge formed one arm of a Wheatstone bridge, and to amplify the bridge's output to readable levels, we built a subtracting operational amplifier (op-amp) circuit for each strain gauge.

Software Progress:

On the software side, we split our "goal-driver" node, which had managed error correction, driving, and path planning, into two nodes so that path planning was fully independent. The new path-planning node included a priority queueing system for new destinations and multiple configurations for how it would navigate between them. We also developed a new module node that actuated the four-bar and integrated it into our routing routine: the robot would now actuate the four-bar upon arriving at a destination and only start navigating toward a new goal once the four-bar had been retracted.

We confirmed the functionality of our software pipeline in simulation, and also tested it on an alternate platform (a modified vacuum bot) to show the system embodied on real hardware. This allowed us to eliminate software errors while the physical robot was not yet ready for integrated testing. We also developed several launch files that let us run tests of systems in isolation, rather than bringing up the full software network every time we tested something.

We continued developing our own node to convert pose estimates relative to an AprilTag into the world frame; however, we had a lot of trouble working out the math required for this transformation. As a result, all footage of robot navigation before the end of Sprint 3 used dead reckoning rather than landmark pose estimates.

Mechanical Progress:

On the mechanical side, we designed the module interfacing system and adapted the existing four-bar tray to it. We first designed a dovetail-joint-inspired mechanism with two parts: the module housing (which is attached to the robot frame) and the module receiver (which is part of a given module and slides into the module housing). When the module receiver slides into the module housing channel, two gears mesh and transfer rotational power from a stepper motor. This allows a given module to be swapped without any fasteners. The module housing and receiver were both 3D printed using dissolvable support material due to the internal geometry required to transfer power in this way.

The four-bar tray is driven by the stepper motor through a vertical shaft from the module receiver and a set of bevel gears that actuate the linkage.

We also designed and CAD-modeled a potential second module to serve drinks, which would rotate a carousel designed to hold standard 12 oz beverage cans. Due to time constraints, we did not end up building this module.

Post Sprint 3:

Following our final sprint review, we still had some final things to work out with our robot. One major issue was a delay in communication between the Arduinos and the Raspberry Pi. This delay made our software routines less accurate and slowed down any testing.

Project Overview:
Walter the WaiterBot is a modular waiter robot designed for use in restaurants. With the press of a button, he can autonomously navigate to tables, deliver food, and return to the kitchen. He uses a ROS2 node network to manage tasks like localization and actuation, which also communicates with Arduinos to send motor commands and receive sensor data. He has a modular mechanical output that seamlessly powers different modules, all of which are capable of being swapped out by hand – no rewiring or wrenches required!

Why a Modular System?

A human waiter performs many different roles in a restaurant, from delivering dishes to taking orders. While a human can switch between tasks seamlessly, a robot has to be intentionally designed with each desired function in mind. Rather than try to create an all-purpose design that could handle any desired task with a single mechanical system, we opted to create an interchangeable module system, where modules for different tasks can easily be swapped by a user. If a new function is desired, a new module can be created and swapped in. We decided to develop a tray module for this project because a tray is one of the most basic requirements of a waiter, and one of the most versatile.

Why ROS2?

We chose ROS2 for two reasons: real-time capability and modularity. ROS2's node and message-passing system allowed us to isolate computationally heavyweight tasks from tasks that require real-time (or near-real-time) performance. Additionally, the modularity of the node network allowed us to refactor a particular function of the robot without modifying any other piece of the system.

Why Localization?

We chose to implement a localization technique, rather than just having Walter drive toward goals, because our software team wanted experience solving real-world robotics problems. Localization is a classic challenge that any system in the real world needs to solve elegantly.

Why Arduino Microcontrollers?

We chose Arduino UNOs as our microcontrollers because they provided a low barrier to entry for writing firmware for both motors and sensors, while also giving us the opportunity to write complex Arduino libraries in C++. By choosing Arduinos, we developed high-level firmware skills and got an introduction to firmware development in C++.

Why Did We Name Him Walter?

We wanted him to have a professional and formal appearance. From the name to the bow tie, Walter looks the part of a high-status waiter at a five-star restaurant – and he takes his job very seriously.

Meet the Team:
I’m a sophomore at Olin College of Engineering, majoring in Mechanical Engineering. My main learning goal for this project was to gain experience designing modular mechanical systems for an autonomous robot. I primarily developed the robot’s tray module, which uses a four-bar linkage connected to the robot’s module interface system.

Bill Le

I'm a sophomore at Olin College of Engineering majoring in Engineering with Robotics. I love League of Legends. My learning goal for this project was to write a complete and elegant ROS2 system. The two main things I worked on were the four-bar module control and everything AprilTag-related, up to getting pose estimates.

Felix Halaska

I'm a sophomore at Olin College of Engineering majoring in Mechanical Engineering with Robotics. I used this project to broaden my skillset by learning how to spec motors and write firmware in C/C++. I designed the electrical system and wrote all the firmware.

Quinn Verrill

I am a sophomore at Olin College of Engineering studying Mechanical Engineering. My learning goals for this project were to create a complex mechanical system and refine my design and tolerancing skills. For this project, I worked primarily on Walter's chassis/frame and the module interfacing system.

Ivy Mahncke

I'm a sophomore at Olin College studying Robotics Engineering! I used PIE as an opportunity to build a complex autonomous system entirely in ROS2, with a focus on elegant system design. I led the development of Walter's navigation system.

Electrical System:

The electrical system consists of the power distribution system, as well as the sensor wiring on both the chassis and the tray module.

Power Distribution:

The robot is powered by a 12 V 3000 mAh battery, which powers two 12 V DC drivetrain motors, a stepper motor for the module, and, through a buck converter, a Raspberry Pi. The Raspberry Pi in turn powers all three Arduino UNOs over USB. Both drivetrain motors are controlled using PWM signals from an Arduino and a 13 A motor driver. The stepper motor is controlled by an Arduino through a stepper motor driver.
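
As a back-of-the-envelope check on battery life: runtime is roughly capacity divided by average current draw. The 3 A average below is an assumed figure for illustration, not a measured load:

$$
t \approx \frac{3000\,\mathrm{mAh}}{3000\,\mathrm{mA}} = 1\,\mathrm{h}
$$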

Chassis Sensors:

On the chassis, we have three buttons, two motor encoders, and an inertial measurement unit (IMU). All of these sensors are powered by an Arduino Uno. The buttons and their integrated LEDs are wired to digital pins so they can be read and controlled. The encoders are wired to interrupt pins so the firmware can detect changes in voltage. The IMU is wired to the SCL and SDA pins on the Arduino and is mounted at the center of rotation of the chassis.

Tray Module Sensors:

On the tray module, we have four strain gauges, which are powered by a separate Arduino Uno. Each is the variable resistor in a Wheatstone bridge, whose output is amplified by an operational amplifier (op-amp) with a gain of 1000 before being wired to an analog input on the Arduino.
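
As a sketch of that signal chain: a subtracting (difference) amplifier takes the two legs of the bridge and outputs their scaled difference, so with the resistor ratio setting the gain,

$$
V_{out} = \frac{R_f}{R_{in}}\left(V^{+} - V^{-}\right) = 1000\,\left(V^{+} - V^{-}\right)
$$

where $V^{+}$ and $V^{-}$ are the two bridge outputs and $R_f / R_{in} = 1000$ is the gain we used.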

Firmware:

Our firmware lives on three Arduino Uno microcontrollers that interface between the hardware (motors and sensors) and the Raspberry Pi. We initially wanted to use microROS on Raspberry Pi Picos to communicate with the software system using ROS protocols, but given the scope of the project, we pivoted to Arduino Unos communicating over serial. We used separate Arduinos for receiving motor commands and for passing along sensor data, to simplify serial communication. For sensors without existing open-source libraries, such as the buttons, encoders, and strain gauges, we wrote our own custom libraries.

Motor Firmware:

The Arduino receives serial messages from the Raspberry Pi and parses them to determine whether they are meant for the drivetrain motors or the module motor. Once it has done so, it processes the rest of the message: either a twist (linear and angular velocity) for the drivetrain, or an angle for the module motor. For the drivetrain, we calculate the left and right wheel velocities and, using an open-source library, send PWM signals to the motor drivers. For a module motor command, we calculate the number of steps the stepper motor needs to move and use an open-source library to drive the stepper motor to the specified angle.
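
A minimal sketch of the twist-to-wheel-speed math on the Arduino. The wheel base, speed scaling, and the `setMotorPwm` helper are illustrative assumptions, not our actual values or API:

```cpp
// Convert a twist (v, w) into left/right wheel speeds for a
// differential drive, then map speeds to PWM duty cycles.
const float WHEEL_BASE_M = 0.5f;   // distance between wheels (assumed)
const float MAX_SPEED_MS = 1.0f;   // wheel speed mapped to full PWM (assumed)

void setMotorPwm(int left, int right) {
  // Hypothetical helper: forwards signed duty cycles to the motor
  // driver library we used (details omitted).
}

void driveFromTwist(float linear, float angular) {
  // Standard differential-drive kinematics.
  float vLeft  = linear - angular * WHEEL_BASE_M / 2.0f;
  float vRight = linear + angular * WHEEL_BASE_M / 2.0f;

  // Scale to a signed 8-bit duty cycle.
  int pwmLeft  = (int)(255.0f * constrain(vLeft  / MAX_SPEED_MS, -1.0f, 1.0f));
  int pwmRight = (int)(255.0f * constrain(vRight / MAX_SPEED_MS, -1.0f, 1.0f));

  setMotorPwm(pwmLeft, pwmRight);
}

void setup() {}
void loop() { driveFromTwist(0.2f, 0.0f); }  // example: drive straight slowly
```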

Chassis Sensor Firmware:

The Arduino on the body of the robot processes and relays data from the buttons, encoders, and IMU to the Raspberry Pi. With each message, it identifies which sensor the data originated from using a two-character code. The button firmware is a library containing a button class with two methods, which initialize the button and check whether it has changed state. The encoder firmware reads two channels, which are identical square waves offset by 90 degrees: the pulse rate gives the speed, and which channel leads the other gives the direction of rotation. The encoder library has a robot class with three methods, which initialize the robot with both encoders, return linear velocity, and return angular velocity, respectively. The IMU has an existing open-source library, which we used to get linear and angular acceleration readings.
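
A sketch of the quadrature-decoding idea on interrupt pins. The pin numbers and counts-per-revolution are assumptions for illustration, and the direction convention depends on the wiring:

```cpp
// Quadrature decoding: count edges on channel A in an interrupt and use
// channel B's level to infer direction.
const int PIN_A = 2;                  // interrupt-capable pins on the Uno
const int PIN_B = 3;
const float COUNTS_PER_REV = 360.0;   // encoder resolution (assumed)
volatile long count = 0;

void onEdgeA() {
  // If channel B is high on A's rising edge, one channel leads the other,
  // telling us the direction of rotation.
  count += (digitalRead(PIN_B) == HIGH) ? 1 : -1;
}

// Convert counts accumulated over dtSeconds into wheel speed (rad/s).
float wheelSpeed(float dtSeconds) {
  noInterrupts();
  long c = count;
  count = 0;
  interrupts();
  return (c / COUNTS_PER_REV) * TWO_PI / dtSeconds;
}

void setup() {
  Serial.begin(9600);
  pinMode(PIN_A, INPUT_PULLUP);
  pinMode(PIN_B, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(PIN_A), onEdgeA, RISING);
}

void loop() {
  Serial.println(wheelSpeed(0.1));  // speed over the last 100 ms
  delay(100);
}
```

With both wheels instrumented this way, the robot class can report linear velocity as $r(\omega_R + \omega_L)/2$ and angular velocity as $r(\omega_R - \omega_L)/d$, where $r$ is the wheel radius and $d$ the track width.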

Tray Module Firmware:

The tray module firmware manages four strain gauges to determine whether a plate has been removed or placed while the tray is extended. We wrote a custom library for the strain gauges as well, containing a tray class with two methods: the first initializes the tray with all four strain gauges, and the second checks whether a plate has been added or removed by watching for the tray's state to change in the correct order as the tray extends.
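
A simplified sketch of the detection idea (our real library also checks that changes happen in the expected order as the tray extends; the threshold and pins here are assumed):

```cpp
// Detect plate placement/removal from the summed strain-gauge readings.
// A plate shows up as a sustained jump in the total reading.
const int GAUGE_PINS[4] = {A0, A1, A2, A3};
const int PLATE_THRESHOLD = 200;   // ADC counts; assumed, tuned empirically

bool plateWasPresent = false;

bool plateIsPresent() {
  long total = 0;
  for (int i = 0; i < 4; i++) {
    total += analogRead(GAUGE_PINS[i]);
  }
  return total > PLATE_THRESHOLD;
}

// Returns +1 when a plate is added, -1 when removed, 0 otherwise.
int plateChange() {
  bool present = plateIsPresent();
  int change = 0;
  if (present && !plateWasPresent) change = +1;
  if (!present && plateWasPresent) change = -1;
  plateWasPresent = present;
  return change;
}

void setup() { Serial.begin(9600); }

void loop() {
  int c = plateChange();
  if (c != 0) Serial.println(c > 0 ? "plate added" : "plate removed");
  delay(50);
}
```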

Mechanical System:

There are three distinct mechanical systems on the robot: the drivetrain, the module interfacing system, and the tray module. This diagram explains the flow of energy in our mechanical system:

Drivetrain:

Our robot's drivetrain consists of two centered drive wheels mounted to a 2 ft x 2 ft chassis made of 1010 aluminum extrusion. Two caster wheels are mounted on either side to stabilize the chassis while still allowing the robot to be driven with only two powered wheels. Each powered wheel is connected to a DC motor through a belt system with a 2:1 gear ratio.

Module Interfacing System:

Our robot uses a modular interfacing system to allow seamless switching between different modules, including our serving tray module and other potential features. A module housing unit is mounted to the top frame of the robot. The module housing is 3D-printed in ABS and uses a stepper motor connected to a series of gears, the last of which protrudes into the interior of the housing.

Each module contains a module receiver, another 3D-printed piece designed to interlock with the module housing. To insert a module, the receiver slides easily into the housing, which lets the gear on the receiver mesh with the gear on the housing, allowing the transfer of rotational energy. A key on the receiver can then be turned to lock the module into place. When the stepper motor is actuated, the gears transfer rotational energy from the housing to the receiver and into a vertical shaft in the receiver, which the module can then use.

This is a cross-sectional view of the module receiver secured in the module housing:

Tray Module:

The tray module uses a four-bar linkage to extend the tray outward so customers can pick dishes up off of it. When the stepper motor rotates, the vertical shaft from the module receiver drives a horizontal shaft via a pair of bevel gears. This horizontal shaft powers the four-bar mechanism by rotating a crank connected to the linkages. The linkages are made from laser-cut acrylic and attached to the bottom board and the tray through 3D-printed brackets that act as pin joints, allowing the links to rotate while otherwise locking them in place.

We also installed strain gauges between the linkages and the bottom of the tray and wired them to a central Arduino, so we could detect when a dish had been removed from the tray.

Software System:

Walter's software system runs entirely on ROS2, a robotics middleware tool that lets us abstract the robot's software into a network of independent tasks called "nodes". We created the following system diagram to explain how Walter's ROS2 nodes pass information, perform calculations, and then act upon the world autonomously:

The robot's software has three main components: state estimation and localization, path planning and execution, and module control. Critical information, such as the robot's current position in the world or its current destination, is passed between the nodes and used to make decisions. This is a completely autonomous system in which the only human input comes from pressing buttons to request a "delivery" to a certain destination.

State Estimation and Localization:

For this project, we want Walter to be able to localize himself in the world frame regardless of where he is. The approach we chose was to use computer vision to detect AprilTags and, through frame transformations, return a pose estimate of where Walter is within the world. Our first step is to calibrate the camera. To do this, we need three things: a camera ROS node that publishes images, a camera calibration ROS node, and a 7x9 calibration pattern printed on white paper.

Once we are done, we save the calibration file in ~/.ros/camera_info/[camera_name].yaml so that it can be accessed by the camera node in the future. Once this is complete, we set up the AprilTag ROS node developed by Christian Rauch. Fortunately for us, the AprilTag ROS node publishes transformations to the /tf topic, which tell us where the AprilTag is with respect to the camera frame. We simply take the inverse of the provided transformation $TR$, where $R$ is the rotation matrix and $T$ is the translation matrix; this gives the matrix $R^{-1}T^{-1}$. Finally, we apply the transformation between the world frame and the AprilTag to give us our final result, which we publish as a Pose object.
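
Spelled out as a frame chain (this is our interpretation of the composition above; the notation $T_{a \leftarrow b}$ for "pose of frame $b$ expressed in frame $a$" is ours, and we treat the camera frame as standing in for the robot frame):

$$
T_{world \leftarrow cam} = T_{world \leftarrow tag} \cdot \bigl(T_{cam \leftarrow tag}\bigr)^{-1},
\qquad
\begin{bmatrix} R & t \\ 0 & 1 \end{bmatrix}^{-1}
=
\begin{bmatrix} R^{\top} & -R^{\top}t \\ 0 & 1 \end{bmatrix}
$$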

Path Planning and Execution:

Walter's path-planning system includes a map maker, a path planner, and a motion execution node, which work together to ensure Walter efficiently gets where he needs to go. The map maker manages his reference map of the world, including AprilTag landmarks and table coordinates, using the ROS2 TransformManager tool. When a person presses a button to indicate where Walter should go next, the motion execution node uses this reference map, along with a recent estimate of Walter's location, to figure out the velocities necessary to get him there quickly. This diagram showcases how Walter prioritizes correcting angular error over linear error to get to his destination:
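
A minimal sketch of that angular-first behavior, mirroring the proportional law described in Sprint 1 (the gains and tolerance are illustrative assumptions, not values from our code):

```cpp
#include <cmath>

// Proportional pose controller that corrects heading before driving:
// if the angular error is large, turn in place; otherwise drive and steer.
struct Twist { double linear; double angular; };

Twist correctToward(double dx, double dy, double heading) {
  const double K_LIN = 0.5, K_ANG = 1.5;  // gains (assumed)
  const double ANG_TOLERANCE = 0.2;       // rad; turn in place above this

  double distance = std::hypot(dx, dy);
  double angError = std::atan2(dy, dx) - heading;
  // Wrap the angular error into [-pi, pi].
  angError = std::atan2(std::sin(angError), std::cos(angError));

  Twist cmd{0.0, K_ANG * angError};
  if (std::fabs(angError) < ANG_TOLERANCE) {
    cmd.linear = K_LIN * distance;        // only drive once roughly aligned
  }
  return cmd;
}
```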

If Walter receives multiple destination requests, the path planner files them into a queue and only allows the most recent request to be visible to the rest of the system – this avoids routing interruptions and keeps Walter's path stable. The path planner can be configured either to send Walter from one destination to the next, or to require him to return to the kitchen each time he completes a delivery. The two modes are visualized below:
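
A sketch of that queueing behavior (the names and the kitchen-return flag are ours for illustration; the real node works over ROS2 topics rather than direct calls):

```cpp
#include <optional>
#include <queue>
#include <string>

// Destination queue with two routing modes: chain deliveries directly,
// or interleave a return to the kitchen after each one.
class PathPlanner {
 public:
  explicit PathPlanner(bool returnToKitchen) : returnToKitchen_(returnToKitchen) {}

  void request(const std::string& table) { pending_.push(table); }

  // Called when the module node reports the four-bar is retracted.
  std::optional<std::string> nextGoal() {
    if (returnToKitchen_ && !atKitchen_) {
      atKitchen_ = true;
      return "kitchen";
    }
    if (pending_.empty()) return std::nullopt;
    std::string goal = pending_.front();
    pending_.pop();
    atKitchen_ = false;
    return goal;
  }

 private:
  std::queue<std::string> pending_;
  bool returnToKitchen_;
  bool atKitchen_ = true;
};
```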

Module Control and Serial Data:

The module control node is in charge of deciding when to extend and retract the four-bar module. When Walter reaches a new destination, the module control node actuates the four-bar to offer food. When someone removes weight from the four-bar, the strain gauges underneath report this to the module control node, which then commands the four-bar to retract. When the four-bar is completely retracted, the node indicates to the path planner that Walter is ready to pilot to his next goal. An important aspect of this control is its ability to track its status. We split the system into the four following statuses:

0. Walter hasn't reached his destination
1. Extend the arm
2. Retract the arm
3. Waiting for the next goal location

This lets us specifically tell Walter that he shouldn't move while the module status is 1 or 2. Furthermore, the status is published to all the other nodes, so they know to set the next location when the status is 3.
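
These statuses map naturally onto an enum that the node can publish and other nodes can gate on (a sketch; the names are ours):

```cpp
#include <cstdint>

// Module status values published by the module control node.
enum class ModuleStatus : uint8_t {
  Travelling = 0,  // Walter hasn't reached his destination
  Extending  = 1,  // extend the arm
  Retracting = 2,  // retract the arm
  Idle       = 3,  // waiting for the next goal location
};

// Motion is gated on the status: never drive mid-extension or mid-retraction.
bool allowedToDrive(ModuleStatus s) {
  return s == ModuleStatus::Travelling || s == ModuleStatus::Idle;
}
```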

To communicate with our Arduinos, we created a serial ROS node that parses incoming data from the read port and the strain gauge port, and sends outgoing data to the write port. The table below shows the codes we used to make sure our firmware and software are properly integrated.
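
We can't reproduce the full code table here, but the parsing pattern looks like the sketch below. The "EN" code and message shape are placeholders for illustration, not necessarily the actual entries from our table:

```cpp
#include <string>

// Parse one line of sensor serial data of the form "<code>:<values...>",
// e.g. "EN:0.42,0.10". The two-character codes here are placeholders.
struct SensorReading {
  std::string code;     // two-character sensor code
  std::string payload;  // comma-separated values for that sensor
};

bool parseLine(const std::string& line, SensorReading& out) {
  auto sep = line.find(':');
  if (sep != 2) return false;  // expect exactly two code characters
  out.code = line.substr(0, 2);
  out.payload = line.substr(sep + 1);
  return true;
}
```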

Project Summary:

Walter the WaiterBot is an autonomous food delivery robot that can navigate a restaurant-like environment and deliver plates to customers.

He has a four-bar-actuated tray module that extends upon reaching a destination to serve plates of food to the customer.

The four-bar system is powered by the robot's mechanical output and can be swapped out for another module without any rewiring or bolts.

To navigate his environment, Walter uses AprilTags and a reference map to estimate his location in the world. All of Walter's software is structured as a ROS2 node network with state estimation and navigation components.

Arduino UNOs running custom firmware serve as the bridge between the motors and sensors and the software system, relaying data from one to the other.

This project was made for our Principles of Integrated Engineering class, taken in the fall semester of our sophomore year at Olin College of Engineering. We had a total of 6 weeks to complete this project, which we divided into three sprints.

Team Learning Goals:

- Mechanical: To design and build complex, interesting mechanical systems.
- Electrical: To design a robust power distribution system.
- Firmware: To write custom firmware and firmware libraries in C++.
- Software: To design a software architecture entirely in ROS2.

To see more details, please look around this website!