This readme is obviously a work in progress!
In order to run this locally, you will need k3d installed, as well as kubectl. Your cluster will need to be able to see the internet, specifically this repository on GitHub, because Argo CD pulls its config from here to manage the installation of applications.
Note: this example spins up roughly 20 pods in the cluster even before you run a workflow, so you'll need to allocate a decent amount of CPU and memory to Docker to run this example.
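Before kicking off the setup, it's worth confirming the tooling is in place. A quick sanity check, assuming k3d, kubectl, and Docker are already on your PATH:

```bash
# Confirm the prerequisites are installed and Docker is running
k3d version
kubectl version --client
docker info > /dev/null && echo "Docker is running"
```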
The summary is "probably don't". If you really want to run this on a remote cluster, use an empty cluster that you don't care about. The script blindly installs and configures a number of tools and has no regard for what you already have installed. It also makes assumptions about things like the container registry URL and ingress URLs that you will need to change manually in order to make it work.
You will also need to remove the k3d cluster creation step from the script.
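For reference, the step to strip out of setup.sh will look roughly like the line below. This is illustrative only; the exact flags and port mappings in your copy of the script may differ.

```bash
# Illustrative only: remove the cluster-creation step when targeting a remote cluster
k3d cluster create large-workflows
```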
Creating the cluster and bootstrapping it is done with one simple script:
```bash
chmod +x setup.sh
./setup.sh
```
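Once the script finishes, Argo CD takes over and syncs the applications into the cluster. One way to watch progress, assuming Argo CD is installed into its default `argocd` namespace:

```bash
# Watch the Argo CD Application resources converge to Synced/Healthy
kubectl get applications -n argocd -w

# Or just watch pods come up across the cluster
kubectl get pods -A -w
```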
When you're done, tear the whole cluster down with:

```bash
k3d cluster delete large-workflows
```
Name | URL | Username | Password
---|---|---|---
Argo Workflows | https://localhost:30000 | - | -
Minio UI | http://localhost:30002 | pipekit | sup3rs3cr3tp4ssw0rd1
Name | URL | Username | Password
---|---|---|---
Postgres | localhost:30003 | postgres | sup3rs3cr3tp4ssw0rd1
Minio has been set up with two buckets, `pipekit` and `workflows`. Argo Workflows is configured to use the `workflows` bucket to store artifacts (including logs). The `pipekit` bucket is connected to the Argo Events Minio EventSource, which means that any file uploaded to the `pipekit` bucket will trigger an Argo Event.
You can log into the Minio UI (see details above) to upload something via your browser, or use our pre-canned script to upload a file:
todo
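Until that script lands, here is a minimal manual alternative using the MinIO client (`mc`). This is a sketch: the endpoint below assumes the S3 API is reachable at the same address as the UI, which may not match how the NodePorts are actually mapped in this setup.

```bash
# Sketch: upload a file to the pipekit bucket to trigger the Argo Events Minio EventSource.
# The endpoint is an assumption; substitute the S3 API address exposed by your cluster.
mc alias set local http://localhost:30002 pipekit sup3rs3cr3tp4ssw0rd1
echo "hello from $(date)" > /tmp/event-trigger.txt
mc cp /tmp/event-trigger.txt local/pipekit/event-trigger.txt
```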
Postgres contains two databases: `workflow`, for log archiving of Argo Workflows, and `pipekit`, for our application.
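To poke around directly, you can connect with `psql` using the credentials from the table above. A sketch, assuming psql is installed locally and the NodePort is reachable:

```bash
# Connect to the database used for Argo Workflows log archiving
psql "postgresql://postgres:sup3rs3cr3tp4ssw0rd1@localhost:30003/workflow"

# Connect to the application database
psql "postgresql://postgres:sup3rs3cr3tp4ssw0rd1@localhost:30003/pipekit"
```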