This repository shows how to orchestrate a machine learning workflow with Kubeflow and Hydrosphere Serving.
## Prerequisites

- Admin access to a Kubernetes cluster
- Kubeflow
- Hydrosphere Serving

Note: by default, all Kubeflow components are installed into the `kubeflow` namespace.
## Compile the pipeline

- Build and publish all stages of the workflow (01-07).
- Adjust `pipeline.py` to point to your published images.
- Compile the pipeline:

  ```sh
  $ python pipeline.py pipeline.tar.gz && tar -xvf pipeline.tar.gz
  ```

  This creates two files: `pipeline.yaml` and `pipeline.tar.gz`. You can use either file to start a pipeline execution.
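The compile step above produces the archive and then unpacks it with `tar -xvf`. The same extraction can also be done from Python using only the standard library, which is convenient when the compile step runs inside another Python tool rather than a shell (the helper name and defaults below are illustrative):

```python
import tarfile

def extract_pipeline(archive="pipeline.tar.gz", dest="."):
    """Unpack a compiled pipeline archive, equivalent to `tar -xvf archive`.

    Returns the member names found in the archive
    (typically this includes pipeline.yaml).
    """
    with tarfile.open(archive, "r:gz") as tar:
        names = tar.getnames()
        tar.extractall(path=dest)
    return names
```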
## Run the pipeline

- (Recommended) Kubeflow Pipelines
  - UI
    - Open the Kubeflow UI and upload `pipeline.yaml` using the **Upload Workflow** button.
    - Create an experiment and start a run with this pipeline.
  - Shell
    - Adjust `client.py` as needed: 1) specify the compiled pipeline definition; 2) specify the `ml-pipeline` endpoint; 3) specify the experiment name.
    - Execute `python client.py`.
- Argo Workflows
  - Install the `argo` CLI.
  - Submit the workflow:

    ```sh
    $ argo submit pipeline.yaml --watch
    ```