gh-actions: Build and upload all artifacts #222

Open · wants to merge 3 commits into base: main
100 changes: 100 additions & 0 deletions .github/workflows/generate-morphs.yaml
@@ -0,0 +1,100 @@
# This workflow builds Data Morph on Mac, Linux, and Windows for
# multiple versions of Python to confirm it can be properly installed.
#
# For more information see https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python
Comment on lines +1 to +4
Owner:
This should be updated 😊

#
# Author: Stefanie Molin, Daniel Schaefer

name: Generate Morphs

on:
push:
pull_request:
paths-ignore:
- 'docs/**'
- 'src/tests/**'
Owner:
Suggested change
- 'src/tests/**'
- 'tests/**'

Comment on lines +13 to +15
Owner:
What about running only if we have changes in src/**?



concurrency:
group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
cancel-in-progress: true

jobs:
build:
name: Build with Python ${{ matrix.python-version }} on ${{ matrix.os }}
Owner:
Suggested change
name: Build with Python ${{ matrix.python-version }} on ${{ matrix.os }}
name: Run Data Morph on new/altered datasets/shapes


  # Just generate on one operating system; the results should be the same on all of them
runs-on: ubuntu-latest

defaults:
run:
shell: bash -e {0}

strategy:
fail-fast: false

steps:
- uses: actions/checkout@v4

- name: Set up Python
uses: actions/setup-python@v5

- name: Install Data Morph
run: |
python -m pip install --upgrade pip
python -m pip install setuptools --upgrade
python -m pip install .

- name: Get all dataset files that have changed
Owner:
Suggested change
- name: Get all dataset files that have changed
- name: Get all dataset and shape files that have changed

id: changed-files-yaml
uses: tj-actions/changed-files@v45
Owner:
Let's add a comment with the documentation link for this.

with:
files_yaml: |
dataset:
- src/data_morph/data/starter_shapes/*
shape:
- src/data_morph/shapes/**

# If datasets are added or changed
- name: Generate morphs from new or changed datasets
if: steps.changed-files-yaml.outputs.dataset_any_changed == 'true'
env:
DATASET_ALL_CHANGED_FILES: ${{ steps.changed-files-yaml.outputs.dataset_all_changed_files }}
run: |
echo "One or more dataset file(s) has changed."
echo "List all the files that have changed: $DATASET_ALL_CHANGED_FILES"
Comment on lines +64 to +65
Owner:
Suggested change
echo "One or more dataset file(s) has changed."
echo "List all the files that have changed: $DATASET_ALL_CHANGED_FILES"
echo "Detected changes to dataset(s): $DATASET_ALL_CHANGED_FILES"

DATASET_ARGS=$(python bin/ci.py $DATASET_ALL_CHANGED_FILES)
echo "Generating morphs for the following datasets: $DATASET_ARGS"
parallel -j0 data-morph \
--start-shape $DATASET_ARGS \
--target-shape {} \
::: bullseye heart rectangle star slant_up

# If shapes are added or modified
- name: Generate morphs from new or changed shapes
if: steps.changed-files-yaml.outputs.shape_any_changed == 'true'
env:
SHAPE_ALL_CHANGED_FILES: ${{ steps.changed-files-yaml.outputs.shape_all_changed_files }}
run: |
echo "One or more shape file(s) has changed."
echo "List all the files that have changed: $SHAPE_ALL_CHANGED_FILES"
Comment on lines +79 to +80
Owner:
Suggested change
echo "One or more shape file(s) has changed."
echo "List all the files that have changed: $SHAPE_ALL_CHANGED_FILES"
echo "Detected changes to shape(s): $SHAPE_ALL_CHANGED_FILES"

SHAPE_ARGS=$(python bin/ci.py $SHAPE_ALL_CHANGED_FILES)
echo "Generating morphs for the following shapes: $SHAPE_ARGS"
data-morph \
--start-shape music \
--target-shape $SHAPE_ARGS \
Comment on lines +83 to +85
Owner:
Should we use parallel here like the other two?


# For core code changes, we want to do a couple morphs to see if they still look ok
# Only need to run if neither of the previous two morphs ran
- name: Morph shapes
Owner:
Suggested change
- name: Morph shapes
- name: Morph shapes with core code changes

if: steps.changed-files-yaml.outputs.dataset_any_changed != 'true' && steps.changed-files-yaml.outputs.shape_any_changed != 'true'
run: |
parallel -j0 data-morph \
--start-shape music \
--target-shape {} \
::: bullseye heart rectangle star slant_up

- uses: actions/upload-artifact@v4
with:
name: morphed_data
path: morphed_data
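The workflow's `parallel -j0 data-morph ... {} ::: bullseye heart ...` steps fan a single start shape out across several target shapes concurrently. A rough shell equivalent, with `echo` standing in for `data-morph` so it runs without installing anything, looks like:

```shell
# echo stands in for the data-morph CLI here; GNU parallel substitutes each
# word after ::: for {} and runs the resulting commands concurrently, much
# like backgrounding one job per target shape and waiting for all of them.
for target in bullseye heart rectangle star slant_up; do
  echo "data-morph --start-shape music --target-shape $target" &
done
wait
```

`parallel -j0` additionally runs as many jobs at once as it can, whereas this loop simply backgrounds one process per target.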
46 changes: 46 additions & 0 deletions bin/ci.py
@@ -0,0 +1,46 @@
#!/usr/bin/env python
"""
Pass in the filenames that changed and it'll tell you the arguments of datasets and shapes.
See examples below
Comment on lines +3 to +4
Owner:
Suggested change
Pass in the filenames that changed and it'll tell you the arguments of datasets and shapes.
See examples below
Call this script with the names of files that have changed to get the
datasets and shapes to test with the CLI.
Examples
--------


$ python bin/ci.py src/data_morph/shapes/circles.py
bullseye circle rings

$ python bin/ci.py src/data_morph/shapes/bases/line_collection.py
high_lines h_lines slant_down slant_up v_lines wide_lines x diamond rectangle star

python bin/ci.py src/data_morph/data/starter_shapes/superdatascience.csv
Owner:
Suggested change
python bin/ci.py src/data_morph/data/starter_shapes/superdatascience.csv
$ python bin/ci.py src/data_morph/data/starter_shapes/superdatascience.csv

SDS
"""
from data_morph.shapes.factory import ShapeFactory
from data_morph.data.loader import DataLoader
import sys
from os.path import basename
Comment on lines +15 to +18
Owner:
I figured out why the linting action didn't run on this file. This PR should also add the bin directory to the check-pr.yml workflow:

on:
pull_request:
paths:
- 'docs/**'
- 'src/**'
- 'tests/**'

Once you add that, there will be things to fix.


new_paths = sys.argv[1:]

args = []

# Figure out argument of datasets based on .csv filename
for dataset, filename in DataLoader._DATASETS.items():
for new_file in new_paths:
if filename in new_file:
args.append(dataset)

# Figure out argument of shapes based on .py filename
new_files = [basename(x).split('/')[-1] for x in new_paths]
for shape, c in ShapeFactory._SHAPE_MAPPING.items():
for new_file in new_files:
# Find the class and all parent classes and get their module name
# We get the module name because it ends in the python file without .py extension
# To make it easy to compare, we just add the extension onto the end
parents = [x.__module__ for x in c.__mro__]
all_modules = parents + [c.__module__]
all_modules = [f'{x}.py' for x in all_modules]

for module in all_modules:
if module.endswith(new_file):
args.append(shape)
break

print(' '.join(args))
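The MRO-based matching in the shape loop above can be sketched in isolation. Everything below is hypothetical stand-in code — toy classes and a toy mapping, not Data Morph's real `ShapeFactory._SHAPE_MAPPING`; overriding `__module__` simulates classes defined in the real package modules:

```python
from os.path import basename


# Toy stand-ins for Data Morph's shape classes; the real ones live in
# modules such as data_morph.shapes.bases.line_collection.
class LineCollection:
    pass


class Star(LineCollection):
    pass


LineCollection.__module__ = 'data_morph.shapes.bases.line_collection'
Star.__module__ = 'data_morph.shapes.points'


def matching_shapes(shape_mapping, changed_paths):
    """Return shape names whose class (or any ancestor) is defined in a changed module."""
    changed_files = [basename(path) for path in changed_paths]
    matches = []
    for shape_name, cls in shape_mapping.items():
        # Each class in the MRO reports the module that defines it; tacking
        # on '.py' makes it comparable against a changed file's basename.
        modules = [f'{klass.__module__}.py' for klass in cls.__mro__]
        if any(module.endswith(changed)
               for module in modules for changed in changed_files):
            matches.append(shape_name)
    return matches


# A change to the shared line-collection base module matches both shapes:
print(matching_shapes(
    {'star': Star, 'line_base': LineCollection},
    ['src/data_morph/shapes/bases/line_collection.py'],
))  # ['star', 'line_base']
```

Walking `cls.__mro__` is what lets a change to a base-class file (like `line_collection.py` in the docstring example) select every shape that inherits from it, not just classes defined in that file.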