
v1.11.0


1.11.0 (2020-05-21)

Bug Fixes

  • baked recipe: fix inherited validators to work with recipe and baked recipe (f0676cb)
  • cli: clean code and fix docs (d8169e7)
  • cli: replace recipe dependency with simple install (ec1eafd)
  • conf.py: create _static/schemas folder if it doesn't exist (b209fa6)
  • dependency: fix package_url (5b6b0c0)
  • function: fix command validation (95001eb)
  • functions: reference value check (7671336)
  • lists: sort schema lists deterministically (84143e3)
  • operator: use snake_case for schema keys (1f1ce5a), closes #50
  • packaging: lenient packaging path location (85264ab)
  • recipe: better validation and validation messages (7af3c2d)
  • recipe: check template names (26783d7)
  • recipe: don't lock dependencies when writing folder (00f26d9)
  • recipe: fix dependency lock and Windows separator (d509da7)
  • recipe: fix entrypoint check to work with classes inheriting from Recipe (4b7d8e8)
  • recipe: read/write from/to folders (c65ef7f)
  • repository: generate from folder, enable merge (4425365)
  • serialization: remove "Enum" classes from dict and pass exclude_unset option (3cf2e22)
  • workflow: nest recipe inside of workflow instead of inheriting (b6e0542)

Features

  • cli: add repository serve and init commands (7f8705f)
  • cli: commands to generate/manage recipes, operators and repositories (6448627), closes #47
  • functions: update function and operator schema (b596a5b)
  • schema: mega refactor! (1171778), closes #39 #41 #42
  • schemas: create repository schema, read/write from folder and create archives (1cb53e5), closes #47 #42
  • workflow: generate workflow from a baked recipe (183561a)

Performance Improvements

  • folders: write yaml folders without empty values (9f3e9d7)

Reverts

  • artifact location: kill it with 🔥 (194862e)

Details

Separate Workflow into Operators, Recipes and Workflows

Two main issues led us to split the old Workflow object into three separate objects:

  • Confusion between the Workflow template and the Workflow object executed by Argo or Luigi (#39)
  • Reusability of Workflows

For these reasons we introduce the following three Queenbee objects:

  • Operator: A list of functions (templated commands) for one CLI tool (e.g., radiance-operator templates commands for radiance)
  • Recipe: A reusable DAG definition of Operator Functions or other Recipes (#42) (i.e., a recipe can use another recipe inside itself)
  • Workflow: A Recipe + input arguments run by an executor (i.e., Luigi or Argo). The schema is not used by Queenbee explicitly but serves as a common definition or interface for the different execution engines to report progress of an executed Recipe (see the sketch below).
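
To make the split concrete, here is a hypothetical sketch of what a Workflow could look like under this model: a reference to a Recipe plus the input arguments to run it with. The field names below are illustrative assumptions, not the exact Queenbee schema.

```yaml
# Illustrative sketch only: field names are assumptions, not the exact schema.
# A Workflow pairs a Recipe with the arguments an executor needs to run it.
name: daylight-factor-run
recipe:
  name: daylight-factor
  version: 0.1.0
arguments:
  parameters:
  - name: sensor-grid-count
    value: 200
  artifacts:
  - name: model
    source: ./model.hbjson
```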

Introduce Dependency Management

Despite all warnings we went ahead and built a package manager... This was a requirement for Recipes to be shareable and modular (i.e., a recipe can refer to a certain operator without having to copy all of the operator's data).

To do so we drew heavy inspiration from the Helm project (#47). This means that Queenbee packaging and sharing remains open source and freely available for those who wish to create their own registries of Recipes and Operators.
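
As a hypothetical illustration of the Helm-style approach, a Recipe might declare the Operators and Recipes it depends on in a small manifest that a registry can resolve, rather than vendoring their data. The keys and registry URL below are assumptions for illustration, not the exact Queenbee format.

```yaml
# Illustrative sketch only: keys and URL are assumptions, not the exact format.
# The package manager resolves each entry from a registry instead of copying
# the dependency's data into the recipe itself.
dependencies:
- type: operator
  name: honeybee-radiance
  version: 0.1.0
  source: https://example.com/sample-registry
- type: recipe
  name: post-process
  version: 0.2.0
  source: https://example.com/sample-registry
```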

Artifact Path Context (remove artifact_location)

Artifact Locations have been removed to make recipes more reusable. Hard-coding the location to persist a file inside a DAG led to reusability issues when nesting Recipes inside each other. It was therefore preferable to establish the concept of artifact path context:

[Diagram: Queenbee Workflow - Cloud Path Context]

[Diagram: Queenbee Workflow - Local Run Path Context]

Function Path Context

When an artifact path is declared within an Operator Function, this path is only relevant or valid in the context of that function. In practice this means that a Function doesn't care where an artifact is persisted when it is executed, because it knows that the executor (Argo or Luigi) will copy the artifact to the path where the command expects it.

```yaml
name: ray-tracing
description: Run ray tracing using some input data!
inputs:
  parameters:
  - name: radiance-parameters
    description: a string of radiance parameters
    default: -b 5
  artifacts:
  - name: grid
    path: grid.pts
  - name: scene-file
    path: scene.oct
command: rtrace -I -h {{inputs.parameters.radiance-parameters}} scene.oct < grid.pts > grid.res
outputs:
  artifacts:
  - name: result-file
    path: grid.res
```

Recipe Path Context

When an artifact path is declared in a Recipe DAG, it points to a folder or S3 bucket location from which the artifact is expected to be pulled (input) or to which it is pushed (output). This path does not overwrite the one in the Operator Function context. Instead, the executor (Argo or Luigi) copies the artifact from the Recipe context to the Function context for inputs, and vice versa for outputs.

```yaml
name: main
inputs:
  parameters:
  - name: sensor-grid-count
    description: The maximum number of grid points per parallel execution
    default: 100
  - name: radiance-parameters
    description: The radiance parameters for ray tracing
    default: -I -ab 2 -h

  artifacts:
  - name: model
    description: A Honeybee model with radiance properties
    required: true
  - name: input-grid
    description: A grid file
    required: true

tasks:
...
- name: daylight-factor-simulation
  template: honeybee-radiance/ray-tracing
  dependencies:
  - split-grid
  - create-octree
  loop:
    from:
      type: tasks
      name: split-grid
      variable: grid-list
    sub_folder:
    - item.name
  arguments:
    parameters:
    - name: radiance-parameters
      from:
        type: inputs
        variable: radiance-parameters
    artifacts:
    - name: grid
      from:
        type: tasks
        name: split-grid
        variable: output-grids-folder
      subpath: '{{item.name}}'
    - name: scene-file
      from:
        type: tasks
        name: create-octree
        variable: scene-file
  outputs:
    artifacts:
    # This file is persisted in a loop sub-folder so the name won't clash
    - name: result-file
      path: grid.res
...
outputs:
  artifacts:
  - name: data
    from:
      type: tasks
      name: post-process
      variable: post-process-folder
  parameters:
  - name: average
    from:
      type: tasks
      name: post-process
      variable: daylight-factor-average
```

Overhaul Documentation

The previous documentation page consisted of a Redoc template pointing to the old Workflow schema. Since we have introduced new, separate schemas as well as development practices for Operator and Recipe creation/management, we needed written tutorials and more in-depth exploration pages for each schema. We therefore replaced the Redoc page with a full Sphinx site that matches the styling of other Ladybug Tools packages.