build-and-inspect-python-package
Never upload a faulty Python package to PyPI again.

build-and-inspect-python-package is a GitHub Action that provides the following functionality to Python package maintainers:

Builds your package¹. SOURCE_DATE_EPOCH is set to the timestamp of the last commit, giving you reproducible builds with meaningful file timestamps.

Uploads the built wheel and the source distribution (SDist) as GitHub Actions artifacts, so you can download and inspect them from the Summary view of a run, or upload them to PyPI automatically once the verification succeeds.

Lints the wheel contents using check-wheel-contents.

Lints the PyPI README using Twine and uploads it as a GitHub Actions artifact for further manual inspection. To level up your PyPI README game, check out hatch-fancy-pypi-readme!

Prints the tree of both the SDist and the wheel in the CI output, so you don’t have to download the packages just to check their contents.

Prints and uploads the packaging metadata as a GitHub Actions artifact.

Popular Use Cases

Build Once – Use Across Jobs

To increase the fidelity of your tests to what your users will experience, you can build and store your package in a first job, make the remaining jobs depend on it, and – instead of checking out the source tree – retrieve the built packages and run your tests against them. For example, you can unpack the tests and config from the SDist and use tox run --installpkg dist/*.whl ... to run the tests against the built wheel without access to the package source code.

You can see this technique in action in structlog’s CI.
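
For a rough idea of what such a follow-up job can look like, here is a minimal sketch. It assumes the build-and-inspect-package job from the Usage section below (so the artifact is named Packages), a test suite and tox configuration that ship inside the SDist, and a tox environment named py312 – all of which may differ for your project:

jobs:
  # ... the build-and-inspect-package job from the Usage section below ...

  test-built-package:
    name: Test the built wheel
    needs: build-and-inspect-package
    runs-on: ubuntu-latest

    steps:
      - name: Download built packages to dist/
        uses: actions/download-artifact@v4
        with:
          name: Packages
          path: dist

      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"

      - name: Unpack tests & config from the SDist and test the wheel
        run: |
          # Extract the SDist contents into the working directory.
          tar xf dist/*.tar.gz --strip-components=1
          python -Im pip install tox
          # Run the tests against the built wheel, not the source tree.
          python -Im tox run --installpkg dist/*.whl -e py312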

Automatic Uploading

You can use a workflow that builds your package and – depending on the CI event (push to main, new tag, new release, ...) – uses PyPI’s trusted publisher feature to upload it to Test PyPI², PyPI, or both. This way you can continuously check how the package will look on PyPI.

structlog uses this technique too: It uploads every commit on main to Test PyPI and whenever a GitHub Release is created, also to the real PyPI.
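
A sketch of what the event-based conditions can look like is below. It assumes the workflow is triggered on both push and release events, that your default branch is main, and that trusted publishing is configured for both indexes; the download and publish steps are the same as in the Usage section below:

jobs:
  # ... the build-and-inspect-package job from the Usage section below ...

  upload-to-test-pypi:
    name: Upload package to Test PyPI
    # Every push to main goes to Test PyPI.
    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
    needs: build-and-inspect-package
    runs-on: ubuntu-latest
    permissions:
      id-token: write

    steps:
      - uses: actions/download-artifact@v4
        with:
          name: Packages
          path: dist
      - uses: pypa/gh-action-pypi-publish@release/v1
        with:
          repository-url: https://test.pypi.org/legacy/

  upload-to-pypi:
    name: Upload package to PyPI
    # Only published GitHub Releases go to the real PyPI.
    if: github.event_name == 'release' && github.event.action == 'published'
    needs: build-and-inspect-package
    runs-on: ubuntu-latest
    permissions:
      id-token: write

    steps:
      - uses: actions/download-artifact@v4
        with:
          name: Packages
          path: dist
      - uses: pypa/gh-action-pypi-publish@release/v1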

Define Python Version Matrix Based On Package Metadata

build-and-inspect-python-package extracts the Python versions your package supports from the trove classifiers in your package’s metadata and offers them as an action output.

That means that you can define your CI matrix based on the Python versions your package supports without duplicating the information between your package configuration and your CI configuration.
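
A minimal sketch of that wiring is below; the job names and the python-versions output name are placeholders, while the supported_python_classifiers_json_array output and GitHub's fromJSON function are the fixed pieces (see the Outputs section below for details):

jobs:
  build-and-inspect-package:
    runs-on: ubuntu-latest
    outputs:
      # Re-export the action's output so other jobs can use it.
      python-versions: ${{ steps.baipp.outputs.supported_python_classifiers_json_array }}

    steps:
      - uses: actions/checkout@v4
      - uses: hynek/build-and-inspect-python-package@v2
        id: baipp

  tests:
    needs: build-and-inspect-package
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # Parse the JSON array into a real matrix dimension.
        python-version: ${{ fromJSON(needs.build-and-inspect-package.outputs.python-versions) }}

    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
      # ... run your tests with ${{ matrix.python-version }} here ...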

Applications

If you package an application as a Python package, this action is useful to double-check you’re shipping everything you need, including all templates, translation files, et cetera.

Usage

build-and-inspect-python-package only works on Linux runners:

jobs:
  build-and-inspect-package:
    name: Build & inspect package.
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4
      - uses: hynek/build-and-inspect-python-package@v2

To also upload to PyPI:

jobs:
  build-and-inspect-package:
    name: Build & inspect package.
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4
      - uses: hynek/build-and-inspect-python-package@v2


  upload-to-pypi:
    name: Upload package to PyPI
    needs: build-and-inspect-package
    runs-on: ubuntu-latest
    permissions:
      # IMPORTANT: this permission is mandatory for trusted publishing, but
      # should NOT be granted anywhere else!
      id-token: write

    steps:
      - name: Download built artifact to dist/
        uses: actions/download-artifact@v4
        with:
          name: Packages
          path: dist
      - uses: pypa/gh-action-pypi-publish@release/v1

Important

For security reasons, keep the job that has the id-token: write permission as short as possible.


If you’re using a VCS tag-based version extractor like setuptools-scm and need the built package to have the correct version, you must use actions/checkout with fetch-depth: 0 – unless the latest commit is the version tag.
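
Concretely, the checkout step from the Usage example becomes:

      - uses: actions/checkout@v4
        with:
          # Fetch the full history so the version tags are available to the build.
          fetch-depth: 0
      - uses: hynek/build-and-inspect-python-package@v2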

Caution

build-and-inspect-python-package uses actions/upload-artifact for storing the built artifacts that you can download with actions/download-artifact.

Unfortunately, v4 of both is incompatible with previous versions, so you have to make sure that your download-artifact version matches the version that build-and-inspect-python-package uses for uploading.

  • If you’re using download-artifact@v3, you have to use build-and-inspect-python-package@v1.
  • If you’re using download-artifact@v4, you have to use build-and-inspect-python-package@v2.

While build-and-inspect-python-package will build a wheel for you by default, we recommend using cibuildwheel if your package contains compiled extensions.

Inputs

  • path: The location of the Python package to build (optional, default: .).

  • skip-wheel: Whether to skip building the wheel in addition to the source distribution. The only meaningful value is 'true' (note the quotes – GitHub Actions only allows string inputs); everything else is treated as falsy.

    This is useful if you build your wheels using advanced tools like cibuildwheel anyway. (optional, default: 'false').

  • upload-name-suffix: A suffix to append to the artifact names to make them unique for upload-artifact@v4.

    Use this if you want to build multiple packages in one workflow. (optional, default: '').

  • attest-build-provenance-github: Whether to generate signed build provenance attestations for workflow artifacts using actions/attest-build-provenance. Requires attestations: write and id-token: write permissions. The only meaningful value is 'true' (note the quotes – GitHub Actions only allows string inputs); everything else is treated as falsy. (optional, default: 'false').

Important

GitHub's artifact attestations are different from PyPI's Sigstore attestations that you can generate while uploading using pypa/gh-action-pypi-publish.
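
For illustration, a step that combines several of the inputs above could look like this; the path and suffix values are purely made up:

      - uses: hynek/build-and-inspect-python-package@v2
        with:
          # Build the package that lives in a subdirectory instead of the repository root.
          path: packages/example-package
          # Skip the wheel because it is built elsewhere, e.g. by cibuildwheel.
          skip-wheel: 'true'
          # Keep artifact names unique when building multiple packages in one workflow.
          upload-name-suffix: -example-package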

Outputs

  • artifact-name: The name of the uploaded artifact.

  • dist: The path to the directory that contains the built packages.

    See, for example, how argon2-cffi-bindings uses this feature to check that the built wheels don’t break a package that depends on them.

  • supported_python_classifiers_json_array: A JSON array of Python versions that are supported by the package as defined by the trove classifiers in the package metadata (for example, Programming Language :: Python :: 3.12).

    You can assign this to a matrix strategy key in your CI job (for example, strategy.matrix.python-version) to test against multiple Python versions without duplicating the information. Since GitHub Actions only allows for strings as variables, you have to parse it with fromJSON in your workflow.

    If all this sounds confusing: Check out our supported Pythons CI workflow for a realistic example.

  • supported_python_classifiers_json_job_matrix_value: Same as supported_python_classifiers_json_array, but it’s a mapping with the JSON array bound to the python-version key.

    This is useful if you only want to define a matrix based on Python versions, because then you can just assign this to strategy.matrix (see the sketch right after this list).

  • package_version: The version of the package as extracted from the package metadata.

    This is useful, for example, for displaying the PyPI URL on the GitHub UI for the publishing job:

    jobs:
      ...
      release:
        runs-on: ubuntu-latest
        needs: baipp
        environment:
          name: pypi
          url: https://pypi.org/project/structlog/${{ needs.baipp.outputs.package_version }}
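
For completeness, here is the same matrix wiring as in the sketch under Popular Use Cases, but using the mapping output so that the whole strategy.matrix is assigned at once; the job names are again placeholders:

jobs:
  build-and-inspect-package:
    runs-on: ubuntu-latest
    outputs:
      matrix: ${{ steps.baipp.outputs.supported_python_classifiers_json_job_matrix_value }}

    steps:
      - uses: actions/checkout@v4
      - uses: hynek/build-and-inspect-python-package@v2
        id: baipp

  tests:
    needs: build-and-inspect-package
    runs-on: ubuntu-latest
    strategy:
      # The output is already a mapping with a python-version key,
      # so it can be assigned to strategy.matrix as a whole.
      matrix: ${{ fromJSON(needs.build-and-inspect-package.outputs.matrix) }}

    steps:
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
      # ... run your tests here ...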

Artifacts

After a successful run, you’ll find the following artifacts in the run’s Summary view:

  • Packages: The built packages. Perfect for automated PyPI upload workflows!
  • Package Metadata: the extracted packaging metadata (hint: it’s formatted as an email).
  • PyPI README: the extracted PyPI README, exactly how it would be used by PyPI as your project’s landing page. PEP 621 calls it readme; in classic setuptools it’s long_description.

Job Summaries

To save you from downloading the artifacts just to check their contents, build-and-inspect-python-package creates the following job summaries:

  • SDist contents: A tree of the source distribution.
  • Wheel contents: A tree of the built wheel – if one was built. This output has no timestamps, because wheel unpack does not preserve them from the built wheel and the misleading values would only cause confusion.
  • Metadata: A plain-text dump of package metadata (includes the PyPI README).

Examples

Our own CI uses all inputs and outputs, so check it out if you want to see them in action.

Our supported Pythons CI workflow demonstrates how to use supported_python_classifiers_json_array to set up a matrix of Python versions for your CI jobs without duplicating the information between your CI configuration and your packaging metadata.

License

The scripts and documentation in this project are released under the MIT License.

Footnotes

  1. Works with any PEP 517-compatible build backend. This includes Hatchling, Flit, Setuptools, PDM, and Poetry.

  2. Note, though, that a prerequisite for the Test PyPI workflow is that each of your commits builds with a unique version number. This is easily achievable using tools like setuptools-scm or hatch-vcs, but beyond the scope of this humble README.