From 1f76141644ba3f57dc134c01f22badd0c16b3082 Mon Sep 17 00:00:00 2001 From: "github-actions[bot]" Date: Mon, 3 Jun 2024 15:59:12 +0000 Subject: [PATCH] Deployed 6765e6e to dev with MkDocs 1.5.3 and mike 2.0.0 --- dev/search/search_index.json | 2 +- dev/sitemap.xml.gz | Bin 418 -> 418 bytes dev/tutorials/cpp/index.html | 23 ++++++++++++++++------- 3 files changed, 17 insertions(+), 8 deletions(-) diff --git a/dev/search/search_index.json b/dev/search/search_index.json index 4b7782baa..2bb528738 100644 --- a/dev/search/search_index.json +++ b/dev/search/search_index.json @@ -1 +1 @@ -{"config":{"lang":["en"],"separator":"[\\s\\-]+","pipeline":["stopWordFilter"]},"docs":[{"location":"","title":"Home","text":""},{"location":"#rattler-build-a-fast-conda-package-builder","title":"rattler-build: A Fast Conda Package Builder","text":"

The rattler-build tool and library create cross-platform, relocatable binaries and packages from a simple recipe format. The recipe format is heavily inspired by conda-build and boa, and the output of a regular rattler-build run is a package that can be installed using mamba, rattler, or conda.

rattler-build does not have any dependencies on conda-build or Python and works as a standalone binary.

"},{"location":"#installation","title":"Installation","text":"

You can grab a prerelease version of rattler-build from the GitHub Releases.

It is (of course) also available from conda-forge:

pixi global install rattler-build\n# or with micromamba\nmicromamba install rattler-build -c conda-forge\n

Alternatively, you can install rattler-build via Homebrew:

brew install rattler-build\n

rattler-build is also available on Arch Linux in the extra repository:

pacman -S rattler-build\n
"},{"location":"#dependencies","title":"Dependencies","text":"

Currently, rattler-build needs some dependencies on the host system, which are executed as subprocesses. We plan to reduce the number of external dependencies over time by writing what we need in Rust, making rattler-build fully self-contained.

On Windows, to obtain these dependencies from conda-forge, one can install m2-patch, m2-bzip2, m2-gzip, and m2-tar.
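
For example, one way to install these from conda-forge (a sketch; assumes micromamba is available):

micromamba install -c conda-forge m2-patch m2-bzip2 m2-gzip m2-tar\n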

"},{"location":"#documentation","title":"Documentation","text":"

We have extensive documentation for rattler-build. You can find the book here.

"},{"location":"#github-action","title":"GitHub Action","text":"

There is a GitHub Action for rattler-build. It can be used to install rattler-build in CI/CD workflows and run a build command. Please check out the GitHub Action documentation for more information.

"},{"location":"#usage","title":"Usage","text":"

rattler-build comes with two commands: build and test.

The build command takes a --recipe recipe.yaml as input and produces a package as output. The test subcommand can be used to test existing packages (tests are shipped with the package).
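
For example:

# build a package from a recipe\nrattler-build build --recipe recipe.yaml\n# test an existing package file\nrattler-build test --package-file ./mypkg-0.1.0-h60d57d3_0.tar.bz2\n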

"},{"location":"#the-recipe-format","title":"The recipe format","text":"

Note You can find all examples below in the examples folder in the codebase and run them with rattler-build.

A simple example recipe for the xtensor header-only C++ library:

# yaml-language-server: $schema=https://raw.githubusercontent.com/prefix-dev/recipe-format/main/schema.json\n\ncontext:\n  name: xtensor\n  version: 0.24.6\n\npackage:\n  name: ${{ name|lower }}\n  version: ${{ version }}\n\nsource:\n  url: https://github.com/xtensor-stack/xtensor/archive/${{ version }}.tar.gz\n  sha256: f87259b51aabafdd1183947747edfff4cff75d55375334f2e81cee6dc68ef655\n\nbuild:\n  number: 0\n  script:\n    - if: win\n      then: |\n        cmake -G \"NMake Makefiles\" -D BUILD_TESTS=OFF -D CMAKE_INSTALL_PREFIX=%LIBRARY_PREFIX% %SRC_DIR%\n        nmake\n        nmake install\n      else: |\n        cmake ${CMAKE_ARGS} -DBUILD_TESTS=OFF -DCMAKE_INSTALL_PREFIX=$PREFIX $SRC_DIR -DCMAKE_INSTALL_LIBDIR=lib\n        make install\n\nrequirements:\n  build:\n    - ${{ compiler('cxx') }}\n    - cmake\n    - if: unix\n      then: make\n  host:\n    - xtl >=0.7,<0.8\n  run:\n    - xtl >=0.7,<0.8\n  run_constraints:\n    - xsimd >=8.0.3,<10\n\ntests:\n  - script:\n    - if: unix or emscripten\n      then:\n        - test -d ${PREFIX}/include/xtensor\n        - test -f ${PREFIX}/include/xtensor/xarray.hpp\n        - test -f ${PREFIX}/share/cmake/xtensor/xtensorConfig.cmake\n        - test -f ${PREFIX}/share/cmake/xtensor/xtensorConfigVersion.cmake\n    - if: win\n      then:\n        - if not exist %LIBRARY_PREFIX%\\include\\xtensor\\xarray.hpp (exit 1)\n        - if not exist %LIBRARY_PREFIX%\\share\\cmake\\xtensor\\xtensorConfig.cmake (exit 1)\n        - if not exist %LIBRARY_PREFIX%\\share\\cmake\\xtensor\\xtensorConfigVersion.cmake (exit 1)\n\nabout:\n  homepage: https://github.com/xtensor-stack/xtensor\n  license: BSD-3-Clause\n  license_file: LICENSE\n  summary: The C++ tensor algebra library\n  description: Multi dimensional arrays with broadcasting and lazy computing\n  documentation: https://xtensor.readthedocs.io\n  repository: https://github.com/xtensor-stack/xtensor\n\nextra:\n  recipe-maintainers:\n    - some-maintainer\n

A recipe for the rich Python package (using noarch):

context:\n  version: \"13.4.2\"\n\npackage:\n  name: \"rich\"\n  version: ${{ version }}\n\nsource:\n  - url: https://pypi.io/packages/source/r/rich/rich-${{ version }}.tar.gz\n    sha256: d653d6bccede5844304c605d5aac802c7cf9621efd700b46c7ec2b51ea914898\n\nbuild:\n  # Thanks to `noarch: python` this package works on all platforms\n  noarch: python\n  script:\n    - python -m pip install . -vv --no-deps --no-build-isolation\n\nrequirements:\n  host:\n    - pip\n    - poetry-core >=1.0.0\n    - python 3.10\n  run:\n    # sync with normalized deps from poetry-generated setup.py\n    - markdown-it-py >=2.2.0\n    - pygments >=2.13.0,<3.0.0\n    - python 3.10\n    - typing_extensions >=4.0.0,<5.0.0\n\ntests:\n  - python:\n      imports:\n        - rich\n      pip_check: true\n\nabout:\n  homepage: https://github.com/Textualize/rich\n  license: MIT\n  license_file: LICENSE\n  summary: Render rich text, tables, progress bars, syntax highlighting, markdown and more to the terminal\n  description: |\n    Rich is a Python library for rich text and beautiful formatting in the terminal.\n\n    The Rich API makes it easy to add color and style to terminal output. Rich\n    can also render pretty tables, progress bars, markdown, syntax highlighted\n    source code, tracebacks, and more \u2014 out of the box.\n  documentation: https://rich.readthedocs.io\n  repository: https://github.com/Textualize/rich\n

A recipe for the curl library:

context:\n  version: \"8.0.1\"\n\npackage:\n  name: curl\n  version: ${{ version }}\n\nsource:\n  url: http://curl.haxx.se/download/curl-${{ version }}.tar.bz2\n  sha256: 9b6b1e96b748d04b968786b6bdf407aa5c75ab53a3d37c1c8c81cdb736555ccf\n\nbuild:\n  number: 0\n\nrequirements:\n  build:\n    - ${{ compiler('c') }}\n    - if: win\n      then:\n        - cmake\n        - ninja\n    - if: unix\n      then:\n        - make\n        - perl\n        - pkg-config\n        - libtool\n  host:\n    - if: linux\n      then:\n        - openssl\n\nabout:\n  homepage: http://curl.haxx.se/\n  license: MIT/X derivate (http://curl.haxx.se/docs/copyright.html)\n  license_file: COPYING\n  summary: tool and library for transferring data with URL syntax\n  description: |\n    Curl is an open source command line tool and library for transferring data\n    with URL syntax. It is used in command lines or scripts to transfer data.\n  documentation: https://curl.haxx.se/docs/\n  repository: https://github.com/curl/curl\n

For the curl library recipe, two additional script files (build.sh and build.bat) are needed.

build.sh

#!/bin/bash\n\n# Get an updated config.sub and config.guess\ncp $BUILD_PREFIX/share/libtool/build-aux/config.* .\n\nif [[ $target_platform =~ linux.* ]]; then\n    USESSL=\"--with-openssl=${PREFIX}\"\nelse\n    USESSL=\"--with-secure-transport\"\nfi;\n\n./configure \\\n    --prefix=${PREFIX} \\\n    --host=${HOST} \\\n    ${USESSL} \\\n    --with-ca-bundle=${PREFIX}/ssl/cacert.pem \\\n    --disable-static --enable-shared\n\nmake -j${CPU_COUNT} ${VERBOSE_AT}\nmake install\n\n# Includes man pages and other miscellaneous.\nrm -rf \"${PREFIX}/share\"\n

build.bat

mkdir build\n\ncmake -GNinja ^\n      -DCMAKE_BUILD_TYPE=Release ^\n      -DBUILD_SHARED_LIBS=ON ^\n      -DCMAKE_INSTALL_PREFIX=%LIBRARY_PREFIX% ^\n      -DCMAKE_PREFIX_PATH=%LIBRARY_PREFIX% ^\n      -DCURL_USE_SCHANNEL=ON ^\n      -DCURL_USE_LIBSSH2=OFF ^\n      -DUSE_ZLIB=ON ^\n      -DENABLE_UNICODE=ON ^\n      %SRC_DIR%\n\nIF %ERRORLEVEL% NEQ 0 exit 1\n\nninja install --verbose\n
"},{"location":"authentication_and_upload/","title":"Server authentication","text":""},{"location":"authentication_and_upload/#authenticating-with-a-server","title":"Authenticating with a server","text":"

You may want to use private channels, for which you need to be authenticated. To do this ephemerally, you can use the RATTLER_AUTH_FILE environment variable to point to a JSON file with the following structure:

{\n    \"*.prefix.dev\": {\n        \"BearerToken\": \"your_token\"\n    },\n    \"otherhost.com\": {\n        \"BasicHttp\": {\n            \"username\": \"your_username\",\n            \"password\": \"your_password\"\n        }\n    },\n    \"anaconda.org\": {\n        \"CondaToken\": \"your_token\"\n    }\n}\n

The keys are the host names. You can use wildcard specifiers here (e.g. *.prefix.dev to match all subdomains of prefix.dev, such as repo.prefix.dev). This will allow you to also obtain packages from any private channels that you have access to.
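
For example, a sketch of pointing rattler-build at such a file (the file path and channel URL are illustrative):

export RATTLER_AUTH_FILE=$HOME/.rattler-auth.json\nrattler-build build -r recipe.yaml -c https://repo.prefix.dev/my-private-channel\n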

The following known authentication methods are supported: BearerToken (e.g. for prefix.dev), BasicHttp (username and password), and CondaToken (e.g. for anaconda.org).

"},{"location":"authentication_and_upload/#uploading-packages","title":"Uploading packages","text":"

If you want to upload packages, then rattler-build comes with a built-in upload command. There are four options: prefix.dev, Quetz, Artifactory, and anaconda.org.

The command is:

rattler-build upload <server> <package_files>\n

Note: you can also use the RATTLER_AUTH_FILE environment variable to authenticate with the server.

"},{"location":"authentication_and_upload/#prefixdev","title":"prefix.dev","text":"

To upload to prefix.dev, you need to have an account and a token. You can create a token in the settings of your account. The token is used to authenticate the upload.

export PREFIX_API_KEY=<your_token>\nrattler-build upload prefix -c <channel> <package_files>\n

You can also use the --api-key=$PREFIX_API_KEY option to pass the token directly to the command. Note that you need to have created the channel on the prefix.dev website before you can upload to it.

"},{"location":"authentication_and_upload/#quetz","title":"Quetz","text":"

You need to pass an API key to upload to a channel on your own Quetz server. The API key is used to authenticate the upload.

export QUETZ_API_KEY=<your_token>\nrattler-build upload quetz -u <url> -c <channel> <package_files>\n
"},{"location":"authentication_and_upload/#artifactory","title":"Artifactory","text":"

To upload to an Artifactory server, you need to pass a username and password. The username and password are used to authenticate the upload.

export ARTIFACTORY_USERNAME=<your_username>\nexport ARTIFACTORY_PASSWORD=<your_password>\nrattler-build upload artifactory -u <url> -c <channel> <package_files>\n
"},{"location":"authentication_and_upload/#anacondaorg","title":"anaconda.org","text":"

To upload to anaconda.org, you need to specify the owner and API key. The API key is used to authenticate the upload.

The owner is the owner of the distribution, for example, your user name or organization.

One can also specify a label such as dev for release candidates using the -c flag. The default value is main.

You can also add the --force argument to forcibly upload a new package (and overwrite any existing ones).

export ANACONDA_API_KEY=<your_token>\nrattler-build upload anaconda -o <your_username> -c <label> <package_files>\n
"},{"location":"automatic_linting/","title":"Enabling Automatic Linting in VSCode","text":"

Our new recipe format adheres to a strict JSON schema, which you can access here.

This schema is implemented using pydantic and can be rendered into a JSON schema file. The YAML language server extension in VSCode is capable of recognizing this schema, providing useful hints during the editing process.

To enable automatic linting with the YAML language server, you need to add the following line at the beginning of your recipe file:

# yaml-language-server: $schema=https://raw.githubusercontent.com/prefix-dev/recipe-format/main/schema.json\n

Alternatively, if you prefer not to add this line to your file, you can install the JSON Schema Store Catalog extension. This extension will also enable automatic linting for your recipe files.

"},{"location":"build_options/","title":"Advanced build options","text":"

There are some specialized build options to control various features: always including or copying files, merging the build and host environments, prefix detection and replacement, variant configuration, and dynamic linking.

These are all found under the build key in the recipe.yaml.

"},{"location":"build_options/#always-include-and-always-copy-files","title":"Always include and always copy files","text":"

There are some options that control the inclusion of files in the final package.

The always_include_files option can be used to include files even if they are already in the environment as part of some other host dependency. This normally \"clobbers\" existing files and should be used with caution (since packages should not have any overlapping files).

The always_copy_files option can be used to copy files instead of linking them. This is useful for files that might be modified inside the environment (e.g. configuration files). Normally, files are linked from a central cache into the environment to save space \u2013 that means that files modified in one environment will be modified in all environments. This is not always desirable, and in that case you can use the always_copy_files option.

How always_copy_files works

The always_copy_files option works by setting the no_link option in the info/paths.json to true for the files in question. This means that the files are copied instead of linked when the package is installed.

recipe.yaml
build:\n  # include files even if they are already in the environment\n  # as part of some other host dependency\n  always_include_files: list of globs\n\n  # do not soft- or hard-link these files, but always copy them\n  # (this option was formerly called no_link)\n  always_copy_files: list of globs\n

Glob patterns

Glob patterns are used throughout the build options to specify files. The patterns are matched against the relative path of the file in the build directory. Patterns can contain * to match any number of characters, ? to match a single character, and ** to match any number of directories.

For example, patterns like the following could be used (the file paths are illustrative):
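
build:\n  always_copy_files:\n    # all .txt files directly in the etc/ directory\n    - etc/*.txt\n    # single-character wildcard, e.g. etc/config1.yaml\n    - etc/config?.yaml\n    # all .so files in any directory\n    - \"**/*.so\"\n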

"},{"location":"build_options/#merge-build-and-host-environments","title":"Merge build and host environments","text":"

In very rare cases you might want to merge the build and host environments to obtain the \"legacy\" behavior of conda-build.

recipe.yaml
build:\n  # merge the build and host environments (used in many R packages on Windows)\n  merge_build_and_host_envs: bool (defaults to false)\n
"},{"location":"build_options/#prefix-detection-replacement-options","title":"Prefix detection / replacement options","text":"

At installation time, the install prefix is injected into text and binary files. Sometimes this is not desired, and sometimes the user might want closer control over the automatic text/binary detection.

The main difference between prefix replacement for text and binary files is that for binary files, the prefix string is padded with null bytes to match the length of the original prefix. The original prefix is the very long placeholder string that you might have seen in the build process.

On Windows, binary prefix replacement is never performed.

recipe.yaml
package:\n  name: mypackage\n  version: 1.0\n\nbuild:\n  # settings concerning the prefix detection in files\n  prefix_detection:\n    # force the file type of the given files to be TEXT or BINARY\n    # for prefix replacement\n    force_file_type:\n      # force TEXT file type (list of globs)\n      text: list of globs\n      # force binary file type (list of globs)\n      binary: list of globs\n\n    # ignore all or specific files for prefix replacement\n    ignore: bool | [path] (defaults to false)\n\n    # whether to detect binary files with prefix or not\n    # defaults to true on Unix and (always) false on Windows\n    ignore_binary_files: bool\n
"},{"location":"build_options/#variant-configuration","title":"Variant configuration","text":"

To control the variant precisely you can use the \"variant configuration\" options.

A variant package has the same version number but a different \"hash\", and potentially different dependencies or build options. Variant keys are extracted from the variant_config.yaml file; usually, any Jinja variables that are used, as well as dependencies without a version specifier, become variant keys.

Variant keys can also be forcibly set or ignored with the use_keys and ignore_keys options.

In order to decide which of the variant packages to prefer and install by default, the down_prioritize_variant option can be used. The higher the value, the less preferred the variant is.

More about variants can be found in the variant documentation.

The following options are available in the build section to control the variant configuration:

recipe.yaml
build:\n  # settings for the variant\n  variant:\n    # Keys to forcibly use for the variant computation\n    # even if they are not in the dependencies\n    use_keys: list of strings\n\n    # Keys to forcibly ignore for the variant computation\n    # even if they are in the dependencies\n    ignore_keys: list of strings\n\n    # used to prefer this variant less\n    down_prioritize_variant: integer (defaults to 0, higher is less preferred)\n
"},{"location":"build_options/#dynamic-linking-configuration","title":"Dynamic linking configuration","text":"

After the package is built, rattler-build performs some \"post-processing\" on the binaries and libraries.

This entails making the shared libraries relocatable and checking that all linked libraries are present in the run requirements. The following settings control this behavior.

With the rpath option you can forcibly set the rpath of the shared libraries. The path is relative to the install prefix. Any rpath setting is ignored on Windows.

The rpath_allowlist option can be used to allow the rpath to point to locations outside of the environment. This is useful if you want to link against libraries that are not part of the conda environment (e.g. proprietary software).

If you want to stop rattler-build from relocating the binaries, you can set binary_relocation to false. If you want to only relocate some binaries, you can select the relevant ones with a glob pattern.

To read more about rpaths and how rattler-build creates relocatable binary packages, see the internals docs.

If you link against some libraries (possibly even outside of the prefix, in a system location), then you can use the missing_dso_allowlist to allow linking against these and suppress any warnings. This list is pre-populated with a list of known system libraries on the different operating systems.

As part of the post-processing, rattler-build checks for overlinking and overdepending. \"Overlinking\" is when a binary links against a library that is not specified in the run requirements. This is usually a mistake because the library would not be present in the environment when the package is installed.

Conversely, \"overdepending\" is when a library is part of the run requirements, but is not actually used by any of the binaries/libraries in the package.

recipe.yaml
build:\n  # settings for shared libraries and executables\n  dynamic_linking:\n    # linux only, list of rpaths relative to the installation prefix\n    rpaths: list of paths (defaults to ['lib/'])\n\n    # Allow runpath / rpath to point to these locations\n    # outside of the environment\n    rpath_allowlist: list of globs\n\n    # whether to relocate binaries or not. If this is a list of paths, then\n    # only the listed paths are relocated\n    binary_relocation: bool (defaults to true) | list of globs\n\n    # Allow linking against libraries that are not in the run requirements\n    missing_dso_allowlist: list of globs\n\n    # what to do when detecting overdepending\n    overdepending_behavior: \"ignore\" or \"error\" # (defaults to \"error\")\n\n    # what to do when detecting overlinking\n    overlinking_behavior: \"ignore\" or \"error\" # (defaults to \"error\")\n
"},{"location":"build_script/","title":"Build scripts","text":"

The build.sh file is the build script for Linux and macOS and build.bat is the build script for Windows. These scripts contain the logic that carries out your build steps. Anything that your build script copies into the $PREFIX or %PREFIX% folder will be included in your output package.

For example, this build.sh installs a script from the recipe directory into $PREFIX/bin:

mkdir -p $PREFIX/bin\ncp $RECIPE_DIR/my_script_with_recipe.sh $PREFIX/bin/super-cool-script.sh\n

There are many environment variables defined for you to use in build.sh and build.bat. Please see environment variables for more information.

build.sh and build.bat are optional. You can instead use the build/script key in your recipe.yaml, with each value being either a string command or a list of string commands. Any commands you put there must be able to run on every platform for which you build. For example, you can't use the cp command because cmd.exe won't understand it on Windows.
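
For instance, a minimal inline-script sketch (the pip command is the same one used in the rich recipe above):

build:\n  script:\n    - python -m pip install . -vv --no-deps --no-build-isolation\n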

build.sh is run with bash and build.bat is run with cmd.exe.

"},{"location":"build_script/#environment-variables","title":"Environment variables","text":""},{"location":"build_script/#environment-variables-set-during-the-build-process","title":"Environment variables set during the build process","text":"

During the build process, the following environment variables are set, on Windows with build.bat, and on macOS and Linux with build.sh. By default, these are the only variables available to your build script. Unless otherwise noted, no variables are inherited from the shell environment in which you invoke rattler-build.

ARCH: Either 32 or 64, to specify whether the build is 32-bit or 64-bit. The value depends on the ARCH environment variable and defaults to the architecture the interpreter running conda was compiled with.
CMAKE_GENERATOR: The CMake generator string for the current build environment. On Linux systems, this is always Unix Makefiles. On Windows, it is generated according to the Visual Studio version activated at build time, for example, Visual Studio 9 2008 Win64.
CONDA_BUILD=1: Always set to indicate that the conda-build process is running.
CPU_COUNT: Represents the number of CPUs on the system.
SHLIB_EXT: Denotes the shared library extension specific to the operating system (e.g. .so for Linux, .dylib for macOS, and .dll for Windows).
HTTP_PROXY: Inherited from the user's shell environment, specifying the HTTP proxy settings.
HTTPS_PROXY: Similar to HTTP_PROXY, this is inherited from the user's shell environment and specifies the HTTPS proxy settings.
LANG: Inherited from the user's shell environment, defining the system language and locale settings.
MAKEFLAGS: Inherited from the user's shell environment. This can be used to set additional arguments for the make command, such as -j2 to utilize 2 CPU cores for building the recipe.
PY_VER: Specifies the Python version against which the build is occurring. This can be modified with a variant_config.yaml file.
PATH: Inherited from the user's shell environment and augmented with the activated host and build prefixes.
PREFIX: The build prefix to which the build script should install the software.
PKG_BUILDNUM: Indicates the build number of the package currently being built.
PKG_NAME: The name of the package that is being built.
PKG_VERSION: The version of the package currently being built.
PKG_BUILD_STRING: The complete build string of the package being built, including the hash (e.g. py311h21422ab_0).
PKG_HASH: Represents the hash of the package being built, excluding the leading 'h' (e.g. 21422ab). This is applicable from conda-build 3.0 onwards.
PYTHON: The path to the Python executable in the host prefix. Python is installed in the host prefix only when it is listed as a host requirement.
R: The path to the R executable in the build prefix. R is installed in the build prefix only when it is listed as a build requirement.
RECIPE_DIR: The directory where the recipe is located.
SP_DIR: The location of Python's site-packages, where Python libraries are installed.
SRC_DIR: The path to where the source code is unpacked or cloned. If the source file is not a recognized archive format, this directory contains a copy of the source file.
STDLIB_DIR: The location of Python's standard library.
build_platform: Represents the native subdirectory of the conda executable, indicating the platform on which the build is occurring.

Removed from conda-build are:
- NPY_VER
- PY3K

"},{"location":"build_script/#windows","title":"Windows","text":"

Unix-style packages on Windows are built in a special Library directory under the build prefix. The environment variables listed below are defined only on Windows.

LIBRARY_BIN: <build prefix>\Library\bin
LIBRARY_INC: <build prefix>\Library\include
LIBRARY_LIB: <build prefix>\Library\lib
LIBRARY_PREFIX: <build prefix>\Library
SCRIPTS: <build prefix>\Scripts

Not yet supported in rattler-build:

Additionally, the following variables are forwarded from the environment:

"},{"location":"build_script/#unix","title":"Unix","text":"

The environment variables listed below are defined only on macOS and Linux.

HOME: Standard $HOME environment variable.
PKG_CONFIG_PATH: Path to the pkgconfig directory; defaults to $PREFIX/lib/pkgconfig.
SSL_CERT_FILE: Path to the SSL_CERT_FILE file.
CFLAGS: Empty; can be forwarded from the environment to set additional arguments for the C compiler.
CXXFLAGS: Same as CFLAGS, for the C++ compiler.
LDFLAGS: Empty; additional flags to be passed to the linker when linking object files into an executable or shared object.

"},{"location":"build_script/#macos","title":"macOS","text":"

The environment variables listed below are defined only on macOS.

MACOSX_DEPLOYMENT_TARGET: Same as the Anaconda Python macOS deployment target. Currently 10.9 for Intel 32- and 64-bit macOS, and 11.0 for arm64.
OSX_ARCH: i386, x86_64, or arm64, depending on the target platform.

"},{"location":"build_script/#linux","title":"Linux","text":"

The environment variables listed below are defined only on Linux.

LD_RUN_PATH: Defaults to <build prefix>/lib.
QEMU_LD_PREFIX: The prefix used by QEMU's user-mode emulation for library paths.
QEMU_UNAME: Set qemu uname release string to 'uname'.
DEJAGNU: The path to the dejagnu testing framework used by the GCC test suite.
DISPLAY: The X11 display to use for graphical applications.
BUILD: Target triple ({build_arch}-conda_{build_distro}-linux-gnu), where build_distro is one of cos6 or cos7, for CentOS 6 or 7.

"},{"location":"cli_usage/","title":"CLI usage","text":""},{"location":"cli_usage/#shell-completions","title":"Shell Completions","text":"

We support shell completions through clap_complete. You can generate them for your shell using the completion command.

You can add the completions to your shell by adding the following to your shell's configuration file:

# For bash (add this to ~/.bashrc)\neval \"$(rattler-build completion --shell=bash)\"\n# For zsh (add this to ~/.zshrc)\neval \"$(rattler-build completion --shell=zsh)\"\n# For fish (add this to ~/.config/fish/config.fish)\nrattler-build completion --shell=fish | source\n

Ensure that the directory containing rattler-build is on your PATH, after which point you can use TAB or any configured completion key of choice.

$ rattler-build <TAB>\nbuild    -- Build a package\nhelp     -- Print this message or the help of the given subcommand(s)\nrebuild  -- Rebuild a package\ntest     -- Test a package\n

"},{"location":"cli_usage/#package-format","title":"Package format","text":"

You can specify the package format (either .tar.bz2 or .conda) by using the --package-format flag. You can also set the compression level with :<level> after the package format; <level> can be max, min, default, or a number corresponding to the compression level. .tar.bz2 supports compression levels between 1 and 9, while .conda supports compression levels between -7 and 22. For .conda, you can also set the --compression-threads flag to specify the number of threads to use for compression.

# default\nrattler-build build --package-format tarbz2 -r recipe/recipe.yaml\n# maximum compression with 10 threads\nrattler-build build --package-format conda:max --compression-threads 10 -r recipe/recipe.yaml\n
"},{"location":"cli_usage/#logs","title":"Logs","text":"

rattler-build knows three different log styles: fancy, plain, and json. You can configure them with the --log-style=<style> flag:

# default\nrattler-build build --log-style fancy -r recipe/recipe.yaml\n
"},{"location":"cli_usage/#github-integration","title":"GitHub integration","text":"

rattler-build also has a GitHub integration. With this integration, warnings are automatically emitted in the GitHub Actions log and a summary is generated and posted to the GitHub Actions summary page.

To make use of this integration, we recommend using our custom GitHub action: rattler-build-action. To manually enable it, you can set the environment variable RATTLER_BUILD_ENABLE_GITHUB_INTEGRATION=true.
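
A sketch of enabling the integration manually in a GitHub Actions workflow step (the step layout is illustrative):

- name: Build package\n  run: rattler-build build -r recipe/recipe.yaml\n  env:\n    RATTLER_BUILD_ENABLE_GITHUB_INTEGRATION: \"true\"\n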

"},{"location":"compilers/","title":"Compilers and cross-compilation","text":"

To use a compiler in your project, it's best to use the ${{ compiler('lang') }} template function. The compiler function works by taking a language, determining the configured compiler for that language, and adding some information about the target platform to the selected compiler. To configure a compiler for a specific language, the variant_config.yaml file can be used.

For example, in a recipe that uses a C-compiler, you can use the following code:

requirements:\n  build:\n    - ${{ compiler('c') }}\n

To set the compiler that you want to use, create a variant config that looks like the following:

c_compiler:\n  - gcc\n\n# optionally you can specify a version\nc_compiler_version:\n  - 9.3.0\n

When the template function is evaluated, it will look something like: gcc_linux-64 9.3.0. You can also define your own compilers: for example, for Rust you can use ${{ compiler('rust') }} and set rust_compiler and rust_compiler_version in your variant config, as shown below.
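
A variant config sketch for such a Rust toolchain (the package name and version are illustrative):

rust_compiler:\n  - rust\n\nrust_compiler_version:\n  - \"1.77\"\n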

"},{"location":"compilers/#cross-compilation","title":"Cross-compilation","text":"

Cross-compilation is supported by rattler-build, and the compiler template function is part of what makes it possible. When you want to cross-compile from linux-64 to linux-aarch64 (i.e. Intel to ARM), you can pass --target-platform linux-aarch64 to the rattler-build command, as shown below. This will cause the compiler template function to select a compiler that is configured for linux-aarch64. The above example would resolve to gcc_linux-aarch64 9.3.0. Provided that the package is available for linux-64 (your build platform), the compilation should succeed.
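
For example:

# build for linux-aarch64 while running on linux-64\nrattler-build build --recipe recipe.yaml --target-platform linux-aarch64\n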

The distinction between the build and host sections begins to make sense when thinking about cross-compilation. The build environment is resolved to packages that need to run at compilation time. For example, cmake, gcc, and autotools are all tools that need to be executed. Therefore, the build environment resolves to packages for the linux-64 architecture (in our example). On the other hand, the host packages resolve to linux-aarch64 - those are packages that we want to link against.

# packages that need to run at build time (cmake, gcc, autotools, etc.)\n# in the platform that rattler-build is executed on (the build_platform)\nbuild:\n  - cmake\n  - ${{ compiler('c') }}\n# packages that we want to link against in the architecture we are\n# cross-compiling to the target_platform\nhost:\n  - libcurl\n  - openssl\n
"},{"location":"experimental_features/","title":"Experimental features","text":"

Warning

These are experimental features of rattler-build and may change or go away completely.

Currently only the build and rebuild commands support the following experimental features.

To enable them, use the --experimental flag with the command, or set the environment variable RATTLER_BUILD_EXPERIMENTAL=1.
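
For example:

# enable experimental features with the CLI flag\nrattler-build build --experimental --recipe recipe.yaml\n# or with the environment variable\nRATTLER_BUILD_EXPERIMENTAL=1 rattler-build build --recipe recipe.yaml\n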

"},{"location":"experimental_features/#jinja-functions","title":"Jinja functions","text":""},{"location":"experimental_features/#load_from_filefile_path","title":"load_from_file(<file_path>)","text":"

The Jinja function load_from_file allows loading data from files. Specifically, TOML, JSON, and YAML files are parsed into objects, so that values can be fetched directly from the file; all other file types are loaded as strings.

"},{"location":"experimental_features/#usage","title":"Usage","text":"

load_from_file is useful when there is a project description in a well-defined project file, such as Cargo.toml, package.json, pyproject.toml, package.yaml, or stack.yaml. It lets the recipe stay as simple as possible and avoids having to keep values in sync manually; example use cases are CI/CD infrastructure or projects with a well-defined output format.

Below is an example loading a Cargo.toml inside of the rattler-build GitHub repository:

recipe.yaml
context:\n  name: ${{ load_from_file(\"Cargo.toml\").package.name }}\n  version: ${{ load_from_file(\"Cargo.toml\").package.version }}\n  source_url: ${{ load_from_file(\"Cargo.toml\").package.homepage }}\n  rust_toolchain: ${{ load_from_file(\"rust-toolchains\") }}\n\npackage:\n  name: ${{ name }}\n  version: ${{ version }}\n\nsource:\n  git: ${{ source_url }}\n  tag: ${{ source_tag }}\n\nrequirements:\n  build:\n    - rust ==${{ rust_toolchain }}\n\nbuild:\n  script: cargo build --release -p ${{ name }}\n\ntest:\n  - script: cargo test -p ${{ name }}\n  - script: cargo test -p rust-test -- --test-threads=1\n\nabout:\n  home: ${{ source_url }}\n  repository: ${{ source_url }}\n  documentation: ${{ load_from_file(\"Cargo.toml\").package.documentation }}\n  summary: ${{ load_from_file(\"Cargo.toml\").package.description }}\n  license: ${{ load_from_file(\"Cargo.toml\").package.license }}\n
"},{"location":"experimental_features/#git-functions","title":"git functions","text":"

git functions are useful for getting the latest tag and commit hash. These can be used in the context section of the recipe, to fetch version information from a repository.

Examples
# latest tag in the repo\ngit.latest_tag(<git_repo_url>)\n\n# latest tag revision (i.e. the hash of the tag commit) in the repo\ngit.latest_tag_rev(<git_repo_url>)\n\n# latest commit revision (i.e. the hash of the head commit) in the repo\ngit.head_rev(<git_repo_url>)\n
"},{"location":"experimental_features/#usage_1","title":"Usage","text":"

These can be useful for automating minor things inside the recipe itself, such as checking whether the current version is the latest version, or whether the current hash is the latest hash.

recipe.yaml
context:\n  git_repo_url: \"https://github.com/prefix-dev/rattler-build\"\n  latest_tag: ${{ git.latest_tag( git_repo_url ) }}\n\npackage:\n  name: \"rattler-build\"\n  version: ${{ latest_tag }}\n\nsource:\n  git: ${{ git_repo_url }}\n  tag: ${{ latest_tag }}\n

There is currently no guarantee of caching for repo fetches when using git functions. This may lead to some performance issues.

"},{"location":"highlevel/","title":"What is rattler-build?","text":"

rattler-build is a tool to build and package software so that it can be installed on any operating system \u2013 with any compatible package manager such as mamba, conda, or rattler. We are also intending for rattler-build to be used as a library to drive builds of packages from any other recipe format in the future.

"},{"location":"highlevel/#how-does-rattler-build-work","title":"How does rattler-build work?","text":"

Building a package consists of several steps. It all begins with a recipe.yaml file that specifies how the package is to be built and what the dependencies are. From the recipe file, rattler-build executes the following steps:

  1. Rendering:

Parse the recipe file and evaluate conditionals, Jinja expressions, variables, and variants.

  2. Fetch source:

Retrieve the specified source files, such as .tar.gz files, git repositories, or local paths. Additionally, this step applies any patches that are specified alongside the source.

  3. Install build environments:

Download and install dependencies into temporary \"host\" and \"build\" workspaces. Any dependencies that are needed at build time are installed in this step.

  4. Build source:

Execute the build script to build/compile the source code and install it into the host environment.

  5. Prepare package files:

Collect all files that are new in the \"host\" environment and apply some transformations if necessary; specifically, we edit the rpath on Linux and macOS to make binaries relocatable.

  6. Package:

Bundle all the files in a package and write out any additional metadata into the info/index.json, info/about.json, and info/paths.json files. This also creates the test files that are bundled with the package.

  7. Test:

Run any tests specified in the recipe. The package is considered done if it passes all the tests; otherwise, it is moved to broken/ in the output directory.

After this process, a package is created. This package can then be uploaded to a private or public channel, for example on prefix.dev.

"},{"location":"highlevel/#how-to-run-rattler-build","title":"How to run rattler-build","text":"

Running rattler-build is straightforward. It can be done on the command line:

rattler-build build --recipe myrecipe/recipe.yaml\n

A custom channel that is not conda-forge (the default) can be specified like so:

rattler-build build -c robostack --recipe myrecipe/recipe.yaml\n

You can also use the --recipe-dir argument if you want to build all the packages in a directory:

rattler-build build --recipe-dir myrecipes/\n
"},{"location":"highlevel/#overview-of-a-recipeyaml","title":"Overview of a recipe.yaml","text":"

A recipe.yaml file is separated into multiple sections and can conditionally include or exclude sections. Recipe files also support a limited amount of string interpolation with Jinja (specifically minijinja in our case).

A simple example of a recipe file for the zlib package would look as follows:

recipe.yaml
# variables from the context section can be used in the rest of the recipe\n# in jinja expressions\ncontext:\n  version: 1.2.13\n\npackage:\n  name: zlib\n  version: ${{ version }}\n\nsource:\n  url: http://zlib.net/zlib-${{ version }}.tar.gz\n  sha256: b3a24de97a8fdbc835b9833169501030b8977031bcb54b3b3ac13740f846ab30\n\nbuild:\n  # build numbers can be set arbitrarily\n  number: 0\n  script:\n    # build script to install the package into the $PREFIX (host prefix)\n    - if: unix\n      then:\n      - ./configure --prefix=$PREFIX\n      - make -j$CPU_COUNT\n    - if: win\n      then:\n      - cmake -G \"Ninja\" -DCMAKE_BUILD_TYPE=Release -DCMAKE_PREFIX_PATH=%LIBRARY_PREFIX%\n      - ninja install\n\nrequirements:\n  build:\n    # compiler is a special function.\n    - ${{ compiler(\"c\") }}\n    # The following two dependencies are only needed on Windows,\n    # and thus conditionally selected\n    - if: win\n      then:\n        - cmake\n        - ninja\n    - if: unix\n      then:\n        - make\n

The sections of a recipe are:

context: Defines variables that can be used in the Jinja context later in the recipe (e.g. name and version are commonly interpolated in strings).
package: Defines the name and version of the package you are currently building; this will be the name of the final output.
source: Defines where the source code is going to be downloaded from, along with checksums.
build: Settings for the build and the build script.
requirements: Allows the definition of build, host, run, and run-constrained dependencies.

"},{"location":"internals/","title":"Internals of rattler-build","text":""},{"location":"internals/#making-packages-relocatable-with-rattler-build","title":"Making Packages Relocatable with rattler-build","text":"

Often, the most challenging aspect of building a package using rattler-build is making it relocatable. A relocatable package can be installed into any prefix, allowing it to be used outside the environment in which it was built. This is in contrast to a non-relocatable package, which can only be utilized within its original build environment.

rattler-build automatically performs the following actions to make packages relocatable:

  1. Binary object file conversion: Binary object files are converted to use relative paths using install_name_tool on macOS and patchelf on Linux. This uses $ORIGIN for elf files on Linux and @loader_path for Mach-O files on macOS to make the rpath relative to the executable / shared library.
  2. Text file prefix registration: Any text file without NULL bytes that contains the placeholder prefix has the registered prefix replaced with the install prefix.
  3. Binary file prefix detection and registration: Binary files containing the build prefix can be automatically registered. The registered files will have their build prefix replaced with the install prefix at install time. This works by padding the install prefix with null terminators, such that the length of the binary file remains the same. The build prefix must be long enough to accommodate any reasonable installation prefix. On macOS and Linux, rattler-build pads the build prefix to 255 characters by appending _placehold to the end of the build directory name.
"},{"location":"internals/#what-goes-into-a-package","title":"What goes into a package?","text":"

Generally speaking, any new files that are copied into the $PREFIX directory at build time are part of the new package. However, there is some filtering going on to exclude unwanted files, and noarch: python packages have special handling as well. The rules are as follows:

"},{"location":"internals/#filtering","title":"Filtering","text":""},{"location":"internals/#general-file-filtering","title":"General File Filtering","text":"

Certain files are filtered out to prevent them from being included in the package. These include:

"},{"location":"internals/#special-handling-for-noarch-python-packages","title":"Special Handling for noarch: python Packages","text":"

For packages marked as noarch: python, special transformations are applied to ensure compatibility across different platforms:

"},{"location":"internals/#symlink-handling","title":"Symlink Handling","text":"

Symlinks are carefully managed to ensure they are relative rather than absolute, which aids in making the package relocatable:

"},{"location":"package_spec/","title":"Package specification","text":"

rattler-build produces \"conda\" packages. These packages work with the mamba and conda package managers, and they work cross-platform on Windows, Linux and macOS.

By default, a conda package is a tar.bz2 archive, which contains the package files to be installed into the prefix as well as metadata under the info/ directory.

The format is identical across platforms and operating systems. During the install process, all files are extracted into the install prefix, except the ones in info/. Installing a conda package into an environment is similar to executing the following commands:

cd <environment prefix>\ntar xjf mypkg-1.0.0-h2134.tar.bz2\n

Only files, including symbolic links, are part of a conda package. Directories are not included. Directories are created and removed as needed, but you cannot create an empty directory from the tar archive directly.

There is also a newer archive type, suffixed with .conda. This archive type consists of an outer \"zip\" archive that is not compressed, and two inner archives that are compressed with zstd, which is very fast for decompression.

The inner archives are split into info and pkg files, which makes it possible to extract only the info part of the archive (only the metadata), which is often smaller in size.
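
A sketch of inspecting only the metadata of a .conda package by hand (the inner archive naming is an assumption based on the info/pkg split described above):

# the outer archive is an uncompressed zip\nunzip mypkg-1.0.0-h2134.conda\n# the inner archives are zstd-compressed tarballs\ntar --zstd -xf info-mypkg-1.0.0-h2134.tar.zst\n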

"},{"location":"package_spec/#package-filename","title":"Package filename","text":"

A conda package conforms to the following filename scheme:

<name>-<version>-<hash>.tar.bz2 OR <name>-<version>-<hash>.conda\n
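
For example, the xtensor package built earlier would be named:

xtensor-0.24.6-h60d57d3_0.tar.bz2\n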
"},{"location":"package_spec/#special-files-in-packages","title":"Special files in packages","text":"

There are some special files in a package, such as activation scripts and post-link/pre-unlink scripts.

You can read more about these files in the activation scripts and other special files section.

"},{"location":"package_spec/#package-metadata","title":"Package metadata","text":"

The info/ directory contains all metadata about a package. Files in this location are not installed under the install prefix. Although you are free to add any file to this directory, conda only inspects the content of the files discussed below:

"},{"location":"package_spec/#infoindexjson","title":"info/index.json","text":"

This file contains basic information about the package, such as name, version, build string, and dependencies. The content of this file is stored in repodata.json, which is the repository index file, hence the name index.json. The JSON object is a dictionary containing the keys shown below.

name: string

The lowercase name of the package. May contain lowercase characters, underscores, and dashes.

version: string

The package version. May not contain \"-\". Acknowledges PEP 440.

build: string

The build string. May not contain \"-\". Differentiates builds of packages with otherwise identical names and versions, such as py311h21422ab_0.

build_number: integer

A non-negative integer representing the build number of the package. Unlike the build string, the build_number is inspected by conda. Conda uses it to sort packages that have otherwise identical names and versions to determine the latest one. This is important because new builds that contain bug fixes for the way a package is built may be added to a repository.

depends: list of match specs

A list of dependency specifications, where each element is a string. These come from the run section of the recipe or any run exports of dependencies.

constrains: list of match specs

A list of optional dependency constraints. The packages listed under constrains are not installed by default, but if they are installed they have to respect the constraints.

subdir: string

The subdir (like linux-64) of this package.

arch: string

Optional. The architecture the package is built for, e.g. x86_64. This key is generally not used (it duplicates information from subdir).

platform: string

Optional. The OS that the package is built for, e.g. osx. This key is generally not used (it duplicates information from subdir).
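
Putting these keys together, a minimal index.json sketch (the values are borrowed from the xtensor example and are illustrative only):

{\n  \"name\": \"xtensor\",\n  \"version\": \"0.24.6\",\n  \"build\": \"h60d57d3_0\",\n  \"build_number\": 0,\n  \"depends\": [\"xtl >=0.7,<0.8\"],\n  \"constrains\": [\"xsimd >=8.0.3,<10\"],\n  \"subdir\": \"linux-64\"\n}\n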

"},{"location":"package_spec/#infopathsjson","title":"info/paths.json","text":"

The paths.json file lists all files that are installed into the environment.

It consists of a list of path entries, each with the following keys:

_path: string

The relative path of the file

path_type: optional, string

The type of linking, can be hardlink, softlink, or directory. Default is hardlink.

file_mode: optional, string

The file mode can be binary or text. This is only relevant for prefix replacement.

prefix_placeholder: optional, string

The prefix placeholder string that is encoded in the text or binary file, which is replaced at installation time. Note that this prefix placeholder uses / even on Windows.

no_link: bool, optional

Determines whether this file should be linked or not when installing the package (linking the file from the cache into the environment). Defaults to false.

sha256: string

The SHA256 hash of the file. For symbolic links it contains the SHA256 hash of the file pointed to.

size_in_bytes: number

The size, in bytes, of the file. For symbolic links, it contains the file size of the file pointed to.

Due to the way the binary replacement works, the placeholder prefix must be longer than the install prefix.
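
A sketch of a single path entry in paths.json (the values are illustrative):

{\n  \"_path\": \"bin/mytool\",\n  \"path_type\": \"hardlink\",\n  \"sha256\": \"c5b6d3...\",\n  \"size_in_bytes\": 12345\n}\n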

"},{"location":"package_spec/#infolicense","title":"info/license/<...>","text":"

All licenses mentioned in the recipe are copied to this folder.

"},{"location":"package_spec/#infoaboutjson","title":"info/about.json","text":"

Optional file. Contains the entries of the \"about\" section of the recipe.yaml file. The following keys are added to info/about.json if present in the build recipe:

Renamed fields

The new recipe spec renamed a few fields (from conda-build's original implementation). This means that some fields in the about.json file still have the old names (for backwards compatibility), while you would generally use different names in the recipe.

home: url (from about.homepage)

The URL of the homepage of the package.

dev_url: url (from about.repository)

The URL of the development repository of the package.

doc_url: url (from about.documentation)

The URL of the documentation of the package.

license_url: url

The URL of the license of the package.

license: string (from about.license)

The SPDX license identifier of the package.

summary: string

A short summary of the package.

description: string

A longer description of the package.

license_family: string

(this field is not used anymore as we rely on SPDX license identifiers)

"},{"location":"package_spec/#inforecipe","title":"info/recipe/<...>","text":"

A directory containing the full contents of the build recipe. This folder also contains a rendered version of the recipe (rendered_recipe.yaml). This rendered version is used for the rebuild command. However, note that currently this format is still in flux and can change at any time.

You can also use --no-include-recipe to disable the inclusion of the recipe in the package.

"},{"location":"rebuild/","title":"Rebuilding a package","text":"

The rebuild command allows you to rebuild a package from an existing package. The main use case is to examine if a package can be rebuilt in a reproducible manner. You can read more about reproducible builds here.

"},{"location":"rebuild/#usage","title":"Usage","text":"
rattler-build rebuild ./mypkg-0.1.0-h60d57d3_0.tar.bz2\n
"},{"location":"rebuild/#how-it-works","title":"How it works","text":"

The recipe is \"rendered\" and stored into the package. The way the recipe is rendered is subject to change. For the moment, the rendered recipe is stored as info/recipe/rendered_recipe.yaml. It includes the exact package versions that were used at build time. When rebuilding, we use the package resolutions from the rendered recipe, and execute the same build script as the original package.

We also take great care to sort files in a deterministic manner as well as erasing any time stamps. The SOURCE_DATE_EPOCH environment variable is set to the same timestamp as the original build for additional determinism (some build tools use this variable to set timestamps).

"},{"location":"rebuild/#how-to-check-the-reproducibility-of-a-package","title":"How to check the reproducibility of a package","text":"

There is an excellent tool called diffoscope that allows you to compare two packages and see the differences. You can install it with pixi:

pixi global install diffoscope\n

To compare two packages, you can use the following command:

rattler-build rebuild ./build0.tar.bz2\ndiffoscope ./build0.tar.bz2 ./mypkg-0.1.0-h60d57d3_0.tar.bz2\n
"},{"location":"selectors/","title":"Selectors in recipes","text":"

Recipe and variant configuration files can utilize selectors to conditionally add, remove, or modify dependencies, configuration options, or even skip recipe execution based on specific conditions.

Selectors are implemented using a simple if / then / else map, which is a valid YAML dictionary. The condition is evaluated using minijinja and follows the same syntax as a Python expression.

During rendering, several variables are set based on the platform and variant being built. For example, the unix variable is true for macOS and Linux, while win is true for Windows. Consider the following recipe executed on Linux:

requirements:\n  host:\n    - if: unix\n      then: unix-tool\n    - if: win\n      then: win-tool\n

This will be evaluated as:

requirements:\n  host:\n    - unix-tool\n

The line containing the Windows-specific configuration is removed. Multiple items can also be selected, such as:

host:\n  - if: linux\n    then:\n    - linux-tool-1\n    - linux-tool-2\n    - linux-tool-3\n

For Linux, this will result in:

host:\n  - linux-tool-1\n  - linux-tool-2\n  - linux-tool-3\n

Other examples often found in the wild:

if: build_platform != target_platform ... # true if cross-platform build\nif: osx and arm64 ... # true for Apple Silicon (osx-arm64)\nif: linux and (aarch64 or ppc64le) ... # true for linux-aarch64 or linux-ppc64le\n
"},{"location":"selectors/#available-variables","title":"Available variables","text":"

The following variables are available during the initial rendering and afterward:

target_platform: the configured target_platform for the build
build_platform: the build platform
linux: \"true\" if target_platform is Linux
osx: \"true\" if target_platform is OSX / macOS
win: \"true\" if target_platform is Windows
unix: \"true\" if target_platform is a Unix (macOS or Linux)
x86_64, x86, arm64, ...: the architecture (\"x86_64\" for 64-bit, \"x86\" for 32-bit; otherwise arm64, aarch64, ppc64le, etc.)

After the initial phase, when the variant configuration is selected, the variant values are also available in selectors. For example, if the build uses python: 3.8 as a variant, we can use if: python == \"3.8\" to enable a dependency only when the Python version is 3.8, as shown in the sketch below.
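
For example (the dependency name is hypothetical):

requirements:\n  run:\n    - if: python == \"3.8\"\n      then:\n        - some-py38-only-dep\n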

"},{"location":"selectors/#the-cmp-function","title":"The cmp function","text":"

Inside selectors, one can use a special cmp function to test whether the selected variant version matches a given version spec. For example, if we again have a python: 3.8 variant, we could use the following tests:

- if: cmp(python, \"3.8\")    # true\n  then: mydep\n- if: cmp(python, \">=3.8\")  # true\n  then: mydep\n- if: cmp(python, \"<3.8\")   # false\n  then: mydep\n

This function eliminates the need to implement any Python-specific conda-build selectors (such as py3k, py38, etc.) or the py and npy integers.

Please note that during the initial phase of rendering we do not know the variant, and thus the cmp condition always evaluates to true.

"},{"location":"special_files/","title":"Activation scripts and other special files","text":"

A conda package can contain \"special\" files in the prefix. These files are scripts that are executed during the activation, installation, or uninstallation process.

If possible, they should be avoided since they execute arbitrary code at installation time and slow down the installation and activation process.

"},{"location":"special_files/#activation-scripts","title":"Activation scripts","text":"

The activation scripts are executed when the environment containing the package is activated (e.g. when doing micromamba activate myenv or pixi run ...).

The scripts are located in special folders: etc/conda/activate.d for activation scripts and etc/conda/deactivate.d for deactivation scripts.

The scripts are executed in lexicographical order, so you can prefix them with numbers to control the order of execution.

To add a script to the package, just make sure that you install the file in this folder. For example, on Linux:

mkdir -p $PREFIX/etc/conda/activate.d\ncp activate-mypkg.sh $PREFIX/etc/conda/activate.d/10-activate-mypkg.sh\n\nmkdir -p $PREFIX/etc/conda/deactivate.d\ncp deactivate-mypkg.sh $PREFIX/etc/conda/deactivate.d/10-deactivate-mypkg.sh\n
"},{"location":"special_files/#post-link-and-pre-unlink-scripts","title":"Post-link and pre-unlink scripts","text":"

The post-link and pre-unlink scripts are executed when the package is installed or uninstalled. They are both heavily discouraged and currently not implemented in rattler, rattler-build, and pixi.

To create a post-link script for your package, you need to add <package_name>-post-link.{sh/bat} to the bin/ folder of your package. The same is applicable for pre-unlink scripts, just with the name <package_name>-pre-unlink.{sh/bat}.

For example, for mypkg, you would add mypkg-post-link.sh to the bin/ folder of your package.

"},{"location":"testing/","title":"Testing packages","text":"

When you are developing a package, you should write tests for it. The tests are automatically executed right after the package build has finished.

The tests from the tests section are actually packaged into your package and can also be executed straight from an existing package.

The idea behind adding the tests into the package is that you can execute the tests independently from building the package. That is also why we are shipping a test subcommand that takes as input an existing package and executes the tests:

rattler-build test --package-file ./xtensor-0.24.6-h60d57d3_0.tar.bz2\n

Running the above command will extract the package and create a clean environment where the package and dependencies are installed. Then the tests are executed in this newly-created environment.

If you inspect the package contents, you would find the test files under info/tests/*.

"},{"location":"testing/#how-tests-are-translated","title":"How tests are translated","text":"

The tests section allows you to specify the following things:

tests:\n  - script:\n      # commands to run to test the package. If any of the commands\n      # returns with an error code, the test is considered failed.\n      - echo \"Hello world\"\n      - pytest ./tests\n\n    # additional requirements at test time\n    requirements:\n      run:\n        - pytest\n\n    files:\n      # Extra files to be copied to the test directory from the \"work directory\"\n      source:\n        - tests/\n        - test.py\n        # quotation marks are required for patterns that start with a `*`\n        - \"*.sh\"\n      recipe:\n        - more_tests/*.py\n\n  # This test section tries to import the Python modules and errors if it can't\n  - python:\n      imports:\n        - mypkg\n        - mypkg.subpkg\n

When you are writing a test for your package, additional files are created and added to your package. These files are placed under the info/tests/{index}/ folder for each test.

For a script test:

For a Python import test:

For a downstream test:

"},{"location":"testing/#legacy-tests","title":"Legacy tests","text":"

Legacy tests (from conda-build) are still supported for execution. These tests are stored as files under the info/test/ folder.

The files are:

Additionally, the info/test/ folder contains all the files specified in the test section as source_files and files. The tests are executed with this directory as the current working directory.

"},{"location":"tui/","title":"Terminal User Interface","text":"

rattler-build offers a terminal user interface for building multiple packages and viewing the logs.

To launch the TUI, run the build command with the --tui flag as shown below:

$ rattler-build build -r recipe.yaml --tui\n

Note

rattler-build-tui is gated behind the tui feature flag to avoid extra dependencies. Build the project with the --features tui flag to enable the TUI functionality.
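
For example, assuming you build rattler-build from source with cargo:

cargo build --release --features tui\n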

"},{"location":"tui/#key-bindings","title":"Key Bindings","text":"Key Action \u23ce Build a Build all j/k Next/previous package up/down/left/right Scroll logs e Edit recipe (via $EDITOR) c, : Open command prompt (available commands: edit) q, ctrl-c, esc, Quit"},{"location":"variants/","title":"Variant configuration","text":"

rattler-build can automatically build multiple variants of a given package. For example, a Python package might need one variant per Python version (especially if it is a binary package such as numpy).

For this use case, one can specify variant configuration files. A variant configuration file has two special entries and a list of packages with variants. For example:

variants.yaml
# special entry #1, the zip keys\nzip_keys:\n- [python, numpy]\n\n# special entry #2, the pin_run_as_build key\npin_run_as_build:\n  numpy:\n    max_pin: 'x.x'\n\n# entries per package version that users are interested in\npython:\n# Note that versions are _strings_ (not numbers)\n- \"3.8\"\n- \"3.9\"\n- \"3.10\"\n\nnumpy:\n- \"1.12\"\n- \"1.12\"\n- \"1.20\"\n

We can pass a variant configuration file to rattler-build using a command like the following:

rattler-build build --variant-config ./variants.yaml --recipe myrecipe.yaml\n

If we have a recipe that has a build, host, or run dependency on python, we will build multiple variants of this package, one for each configured python version (\"3.8\", \"3.9\" and \"3.10\").

For example:

# ...\nrequirements:\n  host:\n  - python\n

... will be rendered as (for the first variant):

# ...\nrequirements:\n  host:\n  - python 3.8*\n

Note that variants are only applied if the requirement doesn't specify any constraints. If the requirement would be python >3.8,<3.10 then the variant entry would be ignored.
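
For example, with the variant configuration above, the following requirement keeps its explicit constraint and no python variants are applied to it:

requirements:\n  host:\n  # the explicit constraint means the python variant entries are ignored\n  - python >3.8,<3.10\n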

"},{"location":"variants/#package-hash-from-variant","title":"Package hash from variant","text":"

You might have wondered what the role of the build string is. The build string is (if not explicitly set) computed from the variant configuration. It serves as a mechanism to discern different build configurations that produce a package with the same name and version.

The hash is computed by dumping all of the variant configuration values that are used by a given recipe into a JSON file, and then hashing that JSON file.

For example, in our python example, the JSON file with the used variant values would look something like:

{\n    \"python\": \"3.8\"\n}\n

This JSON string is then hashed with the MD5 algorithm to produce the hash. For certain packages (such as Python packages) special rules exist, and the py<Major.Minor> version is prepended to the hash, so that the final hash looks something like py38h123123.
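
To build some intuition, the gist of this can be reproduced on the command line. Note that this is only a sketch: the exact JSON normalization and digest truncation are internal details of rattler-build.

echo -n '{\"python\": \"3.8\"}' | md5sum\n# a truncated prefix of such a digest ends up in build strings like py38h123123\n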

"},{"location":"variants/#zip-keys","title":"Zip keys","text":"

Zip keys modify how variants are combined. Usually, each variant key that has multiple entries is expanded to a build matrix. For example, if we have:

python: [\"3.8\", \"3.9\"]\nnumpy: [\"1.12\", \"1.14\"]\n

...then we obtain 4 variants for a recipe that uses both numpy and python:

- python 3.8, numpy 1.12\n- python 3.8, numpy 1.14\n- python 3.9, numpy 1.12\n- python 3.9, numpy 1.14\n

However, if we use the zip_keys and specify:

zip_keys: [\"python\", \"numpy\"]\npython: [\"3.8\", \"3.9\"]\nnumpy: [\"1.12\", \"1.14\"]\n

...then the versions are \"zipped up\" and we only get 2 variants. Note that both python and numpy need to specify the exact same number of versions to make this work.

The resulting variants with the zip applied are:

- python 3.8, numpy 1.12\n- python 3.9, numpy 1.14\n
"},{"location":"variants/#pin-run-as-build","title":"Pin run as build","text":"

The pin_run_as_build key allows the user to inject additional pins. Usually, the run_exports mechanism is used to specify constraints for runtime dependencies from build time dependencies, but pin_run_as_build offers a mechanism to override that if the package does not contain a run exports file.

For example:

pin_run_as_build:\n  libcurl:\n    min_pin: 'x'\n    max_pin: 'x'\n

If we now have a recipe that uses libcurl in the host and run dependencies like:

requirements:\n  host:\n  - libcurl\n  run:\n  - libcurl\n

During resolution, libcurl might be evaluated to libcurl 8.0.1 h13284. Our new runtime dependency then looks like:

requirements:\n  host:\n  - libcurl 8.0.1 h13284\n  run:\n  - libcurl >=8,<9\n
"},{"location":"variants/#prioritizing-variants","title":"Prioritizing variants","text":"

You might produce multiple variants for a package, but want to define a priority for a given variant. The variant with the highest priority will be the default package selected by the resolver.

There are two mechanisms to make this possible: mutex packages and the down_prioritize_variant option in the recipe.

"},{"location":"variants/#the-down_prioritize_variant-option","title":"The down_prioritize_variant option","text":"

Note

It is not always necessary to use the down_prioritize_variant option - only if the solver has no other way to prefer a given variant. For example, if you have a package that has multiple variants for different Python versions, the solver will automatically prefer the variant with the highest Python version.

The down_prioritize_variant option allows you to specify a variant that should be down-prioritized. For example:

recipe.yaml
build:\n  variant_config:\n    use_keys:\n      # use cuda from the variant config, e.g. to build multiple CUDA variants\n      - cuda\n    # this will down-prioritize the cuda variant versus other variants of the package\n    down_prioritize_variant: ${{ 1 if cuda else 0 }}\n
"},{"location":"variants/#mutex-packages","title":"Mutex packages","text":"

Another way to make sure the right variants are selected is \"mutex\" packages. A mutex package is a package whose variants are mutually exclusive: only one package of a given name can be installed at a time, so the solver has to choose.

A mutex package might be useful to make sure that all packages that depend on BLAS are compiled against the same BLAS implementation. The mutex package ensures that \"openblas\" and \"mkl\" can never be installed at the same time.

We could define a BLAS mutex package like this:

variant_config.yaml
blas_variant:\n  - \"openblas\"\n  - \"mkl\"\n

And then the recipe.yaml for the mutex package could look like this:

recipe.yaml
package:\n  name: blas_mutex\n  version: 1.0\n\nbuild:\n  string: ${{ blas_variant }}${{ hash }}_${{ build_number }}\n  variant_config:\n    # make sure that `openblas` is preferred over `mkl`\n    down_prioritize_variant: ${{ 1 if blas_variant == \"mkl\" else 0 }}\n

This will create two packages: blas_mutex-1.0-openblas and blas_mutex-1.0-mkl. Only one of these packages can be installed at a time because they share the same name, so the solver will select at most one of them.

The blas package in turn should have a run_export for the blas_mutex package, so that any package that links against blas also has a dependency on the correct blas_mutex package:

recipe.yaml
package:\n  name: openblas\n  version: 1.0\n\nrequirements:\n  # any package depending on openblas should also depend on the correct blas_mutex package\n  run_exports:\n    # Add a run export on _any_ version of the blas_mutex package whose build string starts with \"openblas\"\n    - blas_mutex * openblas*\n

Then the recipe of a package that wants to build two variants, one for openblas and one for mkl could look like this:

recipe.yaml
package:\n  name: fastnumerics\n  version: 1.0\n\nrequirements:\n  host:\n    # build against both openblas and mkl\n    - ${{ blas_variant }}\n  run:\n    # implicitly adds the correct blas_mutex package through run exports\n    # - blas_mutex * ${{ blas_variant }}*\n
"},{"location":"reference/cli/","title":"Command-Line Help for rattler-build","text":"

This document contains the help content for the rattler-build command-line program.

"},{"location":"reference/cli/#rattler-build","title":"rattler-build","text":"

Usage: rattler-build [OPTIONS] [COMMAND]

"},{"location":"reference/cli/#subcommands","title":"Subcommands:","text":""},{"location":"reference/cli/#options","title":"Options:","text":""},{"location":"reference/cli/#build","title":"build","text":"

Build a package from a recipe

Usage: rattler-build build [OPTIONS]

"},{"location":"reference/cli/#options_1","title":"Options:","text":""},{"location":"reference/cli/#modifying-result","title":"Modifying result","text":""},{"location":"reference/cli/#test","title":"test","text":"

Run a test for a single package

This creates a temporary directory, copies the package file into it, and then runs the indexing. It then creates a test environment that installs the package and any extra dependencies specified in the package test dependencies file.

With the activated test environment, the packaged test files are run:

These test files are written at \"package creation time\" and are part of the package.

Usage: rattler-build test [OPTIONS] --package-file <PACKAGE_FILE>

"},{"location":"reference/cli/#options_2","title":"Options:","text":""},{"location":"reference/cli/#modifying-result_1","title":"Modifying result","text":""},{"location":"reference/cli/#rebuild","title":"rebuild","text":"

Rebuild a package from a package file instead of a recipe

Usage: rattler-build rebuild [OPTIONS] --package-file <PACKAGE_FILE>

"},{"location":"reference/cli/#options_3","title":"Options:","text":""},{"location":"reference/cli/#modifying-result_2","title":"Modifying result","text":""},{"location":"reference/cli/#upload","title":"upload","text":"

Upload a package

Usage: rattler-build upload [OPTIONS] [PACKAGE_FILES]... <COMMAND>

"},{"location":"reference/cli/#subcommands_1","title":"Subcommands:","text":""},{"location":"reference/cli/#arguments","title":"Arguments:","text":""},{"location":"reference/cli/#options_4","title":"Options:","text":""},{"location":"reference/cli/#modifying-result_3","title":"Modifying result","text":""},{"location":"reference/cli/#quetz","title":"quetz","text":"

Upload to a Quetz server. Authentication is used from the keychain / auth-file

Usage: rattler-build upload quetz [OPTIONS] --url <URL> --channel <CHANNEL>

"},{"location":"reference/cli/#options_5","title":"Options:","text":""},{"location":"reference/cli/#artifactory","title":"artifactory","text":"

Options for uploading to an Artifactory channel. Authentication is used from the keychain / auth-file

Usage: rattler-build upload artifactory [OPTIONS] --url <URL> --channel <CHANNEL>

"},{"location":"reference/cli/#options_6","title":"Options:","text":""},{"location":"reference/cli/#prefix","title":"prefix","text":"

Options for uploading to a prefix.dev server. Authentication is used from the keychain / auth-file

Usage: rattler-build upload prefix [OPTIONS] --channel <CHANNEL>

"},{"location":"reference/cli/#options_7","title":"Options:","text":""},{"location":"reference/cli/#anaconda","title":"anaconda","text":"

Options for uploading to an Anaconda.org server

Usage: rattler-build upload anaconda [OPTIONS] --owner <OWNER>

"},{"location":"reference/cli/#options_8","title":"Options:","text":""},{"location":"reference/cli/#completion","title":"completion","text":"

Generate shell completion script

Usage: rattler-build completion [OPTIONS]

"},{"location":"reference/cli/#options_9","title":"Options:","text":""},{"location":"reference/cli/#generate-recipe","title":"generate-recipe","text":"

Generate a recipe from PyPI or CRAN

Usage: rattler-build generate-recipe <SOURCE> <PACKAGE>
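
For example, to generate a recipe for a package from PyPI (a sketch based on the usage line above; the exact <SOURCE> token may differ):

rattler-build generate-recipe pypi flask\n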

"},{"location":"reference/cli/#arguments_1","title":"Arguments:","text":""},{"location":"reference/cli/#auth","title":"auth","text":"

Handle authentication to external channels

Usage: rattler-build auth <COMMAND>

"},{"location":"reference/cli/#subcommands_2","title":"Subcommands:","text":""},{"location":"reference/cli/#login","title":"login","text":"

Store authentication information for a given host

Usage: rattler-build auth login [OPTIONS] <HOST>

"},{"location":"reference/cli/#arguments_2","title":"Arguments:","text":""},{"location":"reference/cli/#options_10","title":"Options:","text":""},{"location":"reference/cli/#logout","title":"logout","text":"

Remove authentication information for a given host

Usage: rattler-build auth logout <HOST>

"},{"location":"reference/cli/#arguments_3","title":"Arguments:","text":"

This document was generated automatically by clap-markdown.

"},{"location":"reference/jinja/","title":"Jinja functions that can be used in the recipe","text":"

rattler-build comes with a couple of useful helpers that can be used in the recipe.

"},{"location":"reference/jinja/#functions","title":"Functions","text":""},{"location":"reference/jinja/#the-compiler-function","title":"The compiler function","text":"

The compiler function can be used to put together a compiler that works for the current platform and the compilation \"target_platform\". The syntax looks like: ${{ compiler('c') }} where 'c' signifies the programming language that is used.

This function evaluates to <compiler>_<target_platform> <compiler_version>. For example, when compiling on linux and to linux-64, this function evaluates to gcc_linux-64.

The values can be influenced by the variant_configuration. The <lang>_compiler and <lang>_compiler_version variables are the keys that influence the result. See below for an example:

"},{"location":"reference/jinja/#usage-in-a-recipe","title":"Usage in a recipe","text":"recipe.yaml
requirements:\n  build:\n    - ${{ compiler('c') }}\n

With a corresponding variant_configuration:

variant_configuration.yaml
c_compiler:\n- clang\nc_compiler_version:\n- 9.0\n

The variables shown above would select the clang compiler in version 9.0. Note that the final output will still contain the target_platform, so that the full compiler will read clang_linux-64 9.0 when compiling with --target-platform linux-64.

rattler-build defines some default compilers for the following languages (inherited from conda-build):

"},{"location":"reference/jinja/#the-stdlib-function","title":"The stdlib function","text":"

The stdlib function closely mirrors the compiler function. It can be used to put together a standard library that works for the current platform and the compilation \"target_platform\".

Usage: ${{ stdlib('c') }}

This results in <stdlib>_<target_platform> <stdlib_version>, and uses the variant variables <lang>_stdlib and <lang>_stdlib_version to influence the output.

"},{"location":"reference/jinja/#usage-in-a-recipe_1","title":"Usage in a recipe:","text":"recipe.yaml
requirements:\n  build:\n    # these are usually paired!\n    - ${{ compiler('c') }}\n    - ${{ stdlib('c') }}\n

With a corresponding variant_configuration:

variant_configuration.yaml
# these are the values `conda-forge` uses in their pinning file\n# found at https://github.com/conda-forge/conda-forge-pinning-feedstock/blob/main/recipe/conda_build_config.yaml\nc_stdlib:\n- sysroot\nc_stdlib_version:\n- 2.17\n
"},{"location":"reference/jinja/#the-pin-functions","title":"The pin functions","text":"

A pin is created based on the version input (from a subpackage or a package resolution).

The pin functions take the following five arguments: min_pin, max_pin, lower_bound, upper_bound, and exact.

These \"pins\" are applied to the version input to create the lower and upper bounds. For example, if the version is 3.10.5 with min_pin=\"x.x\", max_pin=\"x.x.x\", the lower bound will be 3.10 and the upper bound will be 3.10.6.0a0. The max_pin will increment the last selected segment of the version by 1, and append .0a0 to the end to prevent any alpha versions from being selected.

If the last segment of the version contains a letter (e.g. 9e or 1.1.1j), then incrementing the version will set that letter to a, e.g. 9e becomes 10a, and 1.1.1j becomes 1.1.2a. In this case, no .0a0 is appended to the end.

Sometimes you want to strongly connect your outputs. This can be achieved with the exact=true argument, which pins the version and build string exactly.

To override the lower or upper bound with a hard-coded value, you can use the lower_bound and upper_bound arguments.

Both lower_bound and upper_bound expect a valid version string (e.g. 1.2.3).
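
As a sketch (the package name and bound values here are made up), overriding both bounds looks like:

requirements:\n  run:\n    # evaluates to something like: zlib >=1.2.3,<1.3\n    - ${{ pin_compatible('zlib', lower_bound='1.2.3', upper_bound='1.3') }}\n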

"},{"location":"reference/jinja/#the-pin_subpackage-function","title":"The pin_subpackage function","text":""},{"location":"reference/jinja/#the-pin_compatible-function","title":"The pin_compatible function","text":"

The pin_compatible function works exactly like the pin_subpackage function, but it pins the package in the run requirements based on the resolved package of the host or build section.

"},{"location":"reference/jinja/#the-cdt-function","title":"The cdt function","text":"

This function helps add Core Dependency Tree packages as dependencies by converting packages as required according to hard-coded logic. See below for an example of how this function can be used:

# on x86_64 system\ncdt('package-name') # outputs: package-name-cos6-x86_64\n# on aarch64 system\ncdt('package-name') # outputs: package-name-cos6-aarch64\n
"},{"location":"reference/jinja/#the-hash-variable","title":"The hash variable","text":""},{"location":"reference/jinja/#the-version_to_buildstring-function","title":"The version_to_buildstring function","text":""},{"location":"reference/jinja/#the-env-object","title":"The env object","text":"

You can use the env object to retrieve environment variables and forward them to your build script. There are two ways to do this:

You can also check for the existence of an environment variable:
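
A sketch of both uses; env.get appears in the recipe reference below, while the name env.exists for the existence check is an assumption and may differ:

build:\n  # forward an environment variable into the build string\n  string: ${{ env.get(\"GIT_BUILD_STRING\") }}_${{ PKG_HASH }}\n  # assumed existence check (function name is an assumption):\n  # ${{ \"debug\" if env.exists(\"DEBUG_BUILD\") else \"release\" }}\n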

"},{"location":"reference/jinja/#default-jinja-filters","title":"Default Jinja filters","text":"

The following Jinja filters are available: lower, upper, indexing into characters (e.g. https://myurl.com/{{ name[0] }}/{{ name | lower }}_${{ version }}.tar.gz).

Navigate to the Minijinja documentation for a list of all available built-in filters.

"},{"location":"reference/recipe_file/","title":"The recipe spec","text":"

rattler-build implements a new recipe spec, different from the traditional \"meta.yaml\" file used in conda-build. A recipe has to be stored as a recipe.yaml file.

"},{"location":"reference/recipe_file/#history","title":"History","text":"

A discussion was started on what a new recipe spec could or should look like. The fragments of this discussion can be found here.

The reasons for a new spec are:

"},{"location":"reference/recipe_file/#major-differences-from-conda-build","title":"Major differences from conda-build","text":""},{"location":"reference/recipe_file/#spec","title":"Spec","text":"

The recipe spec has the following parts:

"},{"location":"reference/recipe_file/#spec-reference","title":"Spec reference","text":"

The spec is also made available through a JSON Schema (which is used for validation). The schema (and pydantic source file) can be found in this repository: recipe-format

To use it with VSCode (with the YAML plugin) and other IDEs:

Either start the document with the following line:

# yaml-language-server: $schema=https://raw.githubusercontent.com/prefix-dev/recipe-format/main/schema.json\n
Or, using yaml.schemas,
yaml.schemas: {\n  \"https://raw.githubusercontent.com/prefix-dev/recipe-format/main/schema.json\": \"**/recipe.yaml\",\n}\n
Read more about this here.

See more in the automatic linting chapter.

"},{"location":"reference/recipe_file/#examples","title":"Examples","text":"recipe.yaml
# this sets up \"context variables\" (in this case name and version) that\n# can later be used in Jinja expressions\ncontext:\n  version: 1.1.0\n  name: imagesize\n\n# top level package information (name and version)\npackage:\n  name: ${{ name }}\n  version: ${{ version }}\n\n# location to get the source from\nsource:\n  url: https://pypi.io/packages/source/${{ name[0] }}/${{ name }}/${{ name }}-${{ version }}.tar.gz\n  sha256: f3832918bc3c66617f92e35f5d70729187676313caa60c187eb0f28b8fe5e3b5\n\n# build number (should be incremented if a new build is made, but version is not incrementing)\nbuild:\n  number: 1\n  script: python -m pip install --no-deps --ignore-installed .\n\n# the requirements at build and runtime\nrequirements:\n  host:\n    - python\n    - pip\n  run:\n    - python\n\n# tests to validate that the package works as expected\ntests:\n  - python:\n      imports:\n        - imagesize\n\n# information about the package\nabout:\n  homepage: https://github.com/shibukawa/imagesize_py\n  license: MIT\n  summary: 'Getting image size from png/jpeg/jpeg2000/gif file'\n  description: |\n    This module analyzes jpeg/jpeg2000/png/gif image header and\n    return image size.\n  repository: https://github.com/shibukawa/imagesize_py\n  documentation: https://pypi.python.org/pypi/imagesize\n\n# the below is conda-forge specific!\nextra:\n  recipe-maintainers:\n    - somemaintainer\n
"},{"location":"reference/recipe_file/#package-section","title":"Package section","text":"

Specifies package information.

package:\n  name: bsdiff4\n  version: \"2.1.4\"\n
"},{"location":"reference/recipe_file/#source-section","title":"Source section","text":"

Specifies where the source code of the package is coming from. The source may come from a tarball file, git, hg, or svn. It may be a local path and it may contain patches.

"},{"location":"reference/recipe_file/#source-from-tarball-or-zip-archive","title":"Source from tarball or zip archive","text":"
source:\n  url: https://pypi.python.org/packages/source/b/bsdiff4/bsdiff4-1.1.4.tar.gz\n  md5: 29f6089290505fc1a852e176bd276c43\n  sha1: f0a2c9a30073449cfb7d171c57552f3109d93894\n  sha256: 5a022ff4c1d1de87232b1c70bde50afbb98212fd246be4a867d8737173cf1f8f\n

If an extracted archive contains only one folder at its top level, its contents will be moved one level up, so that the extracted package contents sit in the root of the work folder.

"},{"location":"reference/recipe_file/#source-from-git","title":"Source from git","text":"
source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  # branch: master # note: defaults to fetching the repo's default branch\n

You can use rev to pin the commit version directly:

source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  rev: \"50a1f7ed6c168eb0815d424cba2df62790f168f0\"\n

Or you can use the tag:

source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  tag: \"1.1.4\"\n

git can also be a relative path to the recipe directory:

source:\n  git: ../../bsdiff4/.git\n  tag: \"1.1.4\"\n

Furthermore, if you want to fetch just the current \"HEAD\" (this may result in non-deterministic builds), you can use depth.

source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  depth: 1 # note: the behaviour defaults to -1\n

Note: a tag or rev may not be reachable within the configured commit depth, so combining rev or tag with depth is not allowed unless depth is set to -1.

source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  tag: \"1.1.4\"\n  depth: 1 # error: use of `depth` with `rev` is invalid, they are mutually exclusive\n

When you want to use git-lfs, you need to set lfs: true. This will also pull the lfs files from the repository.

source:\n  git: ../../bsdiff4/.git\n  tag: \"1.1.4\"\n  lfs: true # note: defaults to false\n
"},{"location":"reference/recipe_file/#source-from-a-local-path","title":"Source from a local path","text":"

If the path is relative, it is taken relative to the recipe directory. The source is copied to the work directory before building.

  source:\n    path: ../src\n    use_gitignore: false # note: defaults to true\n

By default, all files in the local path that are ignored by git are also ignored by rattler-build. You can disable this behavior by setting use_gitignore to false.

"},{"location":"reference/recipe_file/#patches","title":"Patches","text":"

Patches may optionally be applied to the source.

  source:\n    #[source information here]\n    patches:\n      - my.patch # the patch file is expected to be found in the recipe\n
"},{"location":"reference/recipe_file/#destination-path","title":"Destination path","text":"

Within rattler-build's work directory, you may specify a particular folder to place the source into. rattler-build will always drop you into the same folder ([build folder]/work), but it's up to you whether you want your source extracted into that folder, or nested deeper. This feature is particularly useful when dealing with multiple sources, but can apply to recipes with single sources as well.

source:\n  #[source information here]\n  target_directory: my-destination/folder\n
"},{"location":"reference/recipe_file/#source-from-multiple-sources","title":"Source from multiple sources","text":"

Some software is most easily built by aggregating several pieces.

The syntax is a list of source dictionaries. Each member of this list follows the same rules as the single source. All features for each member are supported.

Example:

source:\n  - url: https://package1.com/a.tar.bz2\n    target_directory: stuff\n  - url: https://package1.com/b.tar.bz2\n    target_directory: stuff\n  - git: https://github.com/mamba-org/boa\n    target_directory: boa\n

Here, the two URL tarballs will go into one folder, and the git repo is checked out into its own space. git will not clone into a non-empty folder.

"},{"location":"reference/recipe_file/#build-section","title":"Build section","text":"

Specifies build information.

Each field that expects a path can also handle a glob pattern. The matching is performed from the top of the build environment, so to match files inside your project you can use a pattern similar to the following one: \"**/myproject/**/*.txt\". This pattern will match any .txt file found in your project. Quotation marks (\"\") are required for patterns that start with a *.

Recursive globbing using ** is also supported.

"},{"location":"reference/recipe_file/#build-number-and-string","title":"Build number and string","text":"

The build number should be incremented for new builds of the same version. The number defaults to 0. The build string cannot contain \"-\". The string defaults to the default rattler-build build string plus the build number.

build:\n  number: 1\n  string: abc\n
"},{"location":"reference/recipe_file/#dynamic-linking","title":"Dynamic linking","text":"

This section contains settings for the shared libraries and executables.

build:\n  dynamic_linking:\n    rpath_allowlist: [\"/usr/lib/**\"]\n
"},{"location":"reference/recipe_file/#python-entry-points","title":"Python entry points","text":"

The following example creates a Python entry point named \"bsdiff4\" that calls bsdiff4.cli.main_bsdiff4().

build:\n  python:\n    entry_points:\n      - bsdiff4 = bsdiff4.cli:main_bsdiff4\n      - bspatch4 = bsdiff4.cli:main_bspatch4\n
"},{"location":"reference/recipe_file/#script","title":"Script","text":"

By default, rattler-build uses a build.sh file on Unix (macOS and Linux) and a build.bat file on Windows, if they exist in the same folder as the recipe.yaml file. With the script parameter you can either supply a different filename or write out short build scripts. You may need to use selectors to use different scripts for different platforms.

build:\n  # A very simple build script\n  script: pip install .\n\n  # The build script can also be a list\n  script:\n    - pip install .\n    - echo \"hello world\"\n    - if: unix\n      then:\n        - echo \"unix\"\n
"},{"location":"reference/recipe_file/#skipping-builds","title":"Skipping builds","text":"

Lists conditions under which rattler-build should skip the build of this recipe. Particularly useful for defining recipes that are platform-specific. By default, a build is never skipped.

build:\n  skip:\n    - win\n    ...\n
"},{"location":"reference/recipe_file/#architecture-independent-packages","title":"Architecture-independent packages","text":"

Allows you to specify \"no architecture\" when building a package, thus making it compatible with all platforms and architectures. Architecture-independent packages can be installed on any platform.

Assigning the noarch key as generic tells conda to not try any manipulation of the contents.

build:\n  noarch: generic\n

noarch: generic is most useful for packages such as static JavaScript assets and source archives. For pure Python packages that can run on any Python version, you can use the noarch: python value instead:

build:\n  noarch: python\n

Note

At the time of this writing, noarch packages should not make use of preprocess-selectors: noarch packages are built with the directives that evaluate to true on the platform they are built on, which will probably result in an incorrect or incomplete installation on other platforms.

"},{"location":"reference/recipe_file/#include-build-recipe","title":"Include build recipe","text":"

The recipe and rendered recipe.yaml file are included in the package_metadata by default. You can disable this by passing --no-include-recipe on the command line.
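
For example:

rattler-build build --recipe recipe.yaml --no-include-recipe\n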

Note

There are many more options in the build section. These additional options control how variants are computed, prefix replacements, and more. See the full build options for more information.

"},{"location":"reference/recipe_file/#requirements-section","title":"Requirements section","text":"

Specifies the build and runtime requirements. Dependencies of these requirements are included automatically.

Versions for requirements must follow the conda/mamba match specification. See build-version-spec.

"},{"location":"reference/recipe_file/#build","title":"Build","text":"

Tools required to build the package.

These packages are run on the build system and include things such as version control systems (git, svn), make tools (GNU make, Autotools, CMake), compilers (real cross, pseudo-cross, or native when not cross-compiling), and any source pre-processors.

Packages which provide \"sysroot\" files, like the CDT packages (see below), also belong in the build section.

requirements:\n  build:\n    - git\n    - cmake\n
"},{"location":"reference/recipe_file/#host","title":"Host","text":"

Represents packages that need to be specific to the target platform when the target platform is not necessarily the same as the native build platform. For example, in order for a recipe to be \"cross-capable\", shared libraries requirements must be listed in the host section, rather than the build section, so that the shared libraries that get linked are ones for the target platform, rather than the native build platform. You should also include the base interpreter for packages that need one. In other words, a Python package would list python here and an R package would list mro-base or r-base.

requirements:\n  build:\n    - ${{ compiler('c') }}\n    - if: linux\n      then:\n        - ${{ cdt('xorg-x11-proto-devel') }}\n  host:\n    - python\n

Note

When both \"build\" and \"host\" sections are defined, the build section can be thought of as \"build tools\" - things that run on the native platform, but output results for the target platform (e.g. a cross-compiler that runs on linux-64, but targets linux-armv7).

The PREFIX environment variable points to the host prefix. With respect to activation during builds, both the host and build environments are activated. The build prefix is activated before the host prefix so that the host prefix has priority over the build prefix. Executables that don't exist in the host prefix should be found in the build prefix.

The build and host prefixes are always separate when both are defined, or when ${{ compiler() }} Jinja2 functions are used. The only time that build and host are merged is when the host section is absent and no ${{ compiler() }} Jinja2 functions are used in recipe.yaml.

"},{"location":"reference/recipe_file/#run","title":"Run","text":"

Packages required to run the package.

These are the dependencies that are installed automatically whenever the package is installed. Package names should follow the package match specifications.

requirements:\n  run:\n    - python\n    - six >=1.8.0\n

To build a recipe against different versions of NumPy and ensure that each version is part of the package dependencies, list numpy as a requirement in recipe.yaml and use a variant configuration file with multiple NumPy versions.

"},{"location":"reference/recipe_file/#run-constraints","title":"Run constraints","text":"

Packages that are optional at runtime but must obey the supplied additional constraint if they are installed.

Package names should follow the package match specifications.

requirements:\n  run_constraints:\n    - optional-subpackage ==${{ version }}\n

For example, let's say we have an environment that has package \"a\" installed at version 1.0. If we install package \"b\" that has a run_constraints entry of \"a >1.0\", then mamba would need to upgrade \"a\" in the environment in order to install \"b\".
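
Expressed in the recipe of package \"b\", that constraint would look like:

package:\n  name: b\n  version: \"1.0\"\n\nrequirements:\n  run_constraints:\n    - a >1.0\n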

This is especially useful in the context of virtual packages, where the run_constraints dependency is not a package that mamba manages, but rather a virtual package that represents a system property that mamba can't change. For example, a package on Linux may impose a run_constraints dependency on __glibc >=2.12. This is the version bound consistent with CentOS 6. Software built against glibc 2.12 will be compatible with CentOS 6. This run_constraints dependency helps mamba, conda or pixi tell the user that a given package can't be installed if their system glibc version is too old.

"},{"location":"reference/recipe_file/#run-exports","title":"Run exports","text":"

Packages may have runtime requirements such as shared libraries (e.g. zlib), which are required for linking at build time, and for resolving the link at run time. Such packages use run_exports for defining the runtime requirements to let the dependent packages understand the runtime requirements of the package.

Example from zlib:

  requirements:\n    run_exports:\n      - ${{ pin_subpackage('libzlib', exact=True) }}\n

Run exports are weak by default. But you can also define strong run_exports.

  requirements:\n    run_exports:\n      strong:\n        - ${{ pin_subpackage('libzlib', exact=True) }}\n
"},{"location":"reference/recipe_file/#ignore-run-exports","title":"Ignore run exports","text":"

There may be cases where an upstream package has a problematic run_exports constraint. You can ignore it in your recipe by listing the upstream package name in the ignore_run_exports section in requirements.

You can ignore them by package name, or by naming the runtime dependency directly.

  requirements:\n    ignore_run_exports:\n      from_package:\n        - zlib\n

Using a runtime dependency name:

  requirements:\n    ignore_run_exports:\n      from_name:\n        - libzlib\n

Note

ignore_run_exports only applies to runtime dependencies coming from an upstream package.

"},{"location":"reference/recipe_file/#tests-section","title":"Tests section","text":"

rattler-build supports four different types of tests. The \"script test\" installs the package and runs a list of commands. The \"Python test\" attempts to import a list of Python modules and runs pip check. The \"downstream test\" runs the tests of a downstream package that reverse depends on the package being built. And lastly, the \"package content test\" checks if the built package contains the mentioned items.

The tests section is a list of these items:

tests:\n  - script:\n      - echo \"hello world\"\n    requirements:\n      run:\n        - pytest\n    files:\n      source:\n        - test-data.txt\n\n  - python:\n      imports:\n        - bsdiff4\n      pip_check: true  # this is the default\n  - downstream: numpy\n
"},{"location":"reference/recipe_file/#script-test","title":"Script test","text":"

The script test has 3 top-level keys: script, files and requirements. Only the script key is required.

"},{"location":"reference/recipe_file/#test-commands","title":"Test commands","text":"

Commands that are run as part of the test.

tests:\n  - script:\n      - echo \"hello world\"\n      - bsdiff4 -h\n      - bspatch4 -h\n
"},{"location":"reference/recipe_file/#extra-test-files","title":"Extra test files","text":"

Test files that are copied from the source work directory into the temporary test directory and are needed during testing (note that the source work directory is otherwise not available at all during testing).

You can also include files that come from the recipe folder. They are copied into the test directory as well.

At test execution time, the test directory is the current working directory.

tests:\n  - script:\n      - ls\n    files:\n      source:\n        - myfile.txt\n        - tests/\n        - some/directory/pattern*.sh\n      recipe:\n        - extra-file.txt\n
"},{"location":"reference/recipe_file/#test-requirements","title":"Test requirements","text":"

In addition to the runtime requirements, you can specify requirements needed during testing. The runtime requirements that you specified in the \"run\" section described above are automatically included during testing (because the built package is installed as it regularly would be).

In the build section you can specify additional requirements that are only needed on the build system for cross-compilation (e.g. emulators or compilers).

tests:\n  - script:\n      - echo \"hello world\"\n    requirements:\n      build:\n        - myemulator\n      run:\n        - nose\n
"},{"location":"reference/recipe_file/#python-tests","title":"Python tests","text":"

For this test type you can list a set of Python modules that need to be importable. The test will fail if any of the modules cannot be imported.

The test will also automatically run pip check to check for any broken dependencies. This can be disabled by setting pip_check: false in the YAML.

tests:\n  - python:\n      imports:\n        - bsdiff4\n        - bspatch4\n      pip_check: true  # can be left out because this is the default\n

Internally this will write a small Python script that imports the modules:

import bsdiff4\nimport bspatch4\n
"},{"location":"reference/recipe_file/#check-for-package-contents","title":"Check for package contents","text":"

Checks if the built package contains the mentioned items. These checks are executed directly at the end of the build process to make sure that all expected files are present in the package.

tests:\n  - package_contents:\n      # checks for the existence of files inside $PREFIX or %PREFIX%\n      # or, checks that there is at least one file matching the specified `glob`\n      # pattern inside the prefix\n      files:\n        - etc/libmamba/test.txt\n        - etc/libmamba\n        - etc/libmamba/*.mamba.txt\n\n      # checks for the existence of `mamba/api/__init__.py` inside of the\n      # Python site-packages directory (note: also see Python import checks)\n      site_packages:\n        - mamba.api\n\n\n      # looks in $PREFIX/bin/mamba for unix and %PREFIX%\\Library\\bin\\mamba.exe on Windows\n      # note: also check the `commands` and execute something like `mamba --help` to make\n      # sure things work fine\n      bin:\n        - mamba\n\n      # searches for `$PREFIX/lib/libmamba.so` or `$PREFIX/lib/libmamba.dylib` on Linux or macOS,\n      # on Windows for %PREFIX%\\Library\\lib\\mamba.dll & %PREFIX%\\Library\\bin\\mamba.bin\n      lib:\n        - mamba\n\n      # searches for `$PREFIX/include/libmamba/mamba.hpp` on unix, and\n      # on Windows for `%PREFIX%\\Library\\include\\libmamba\\mamba.hpp`\n      include:\n        - libmamba/mamba.hpp\n
"},{"location":"reference/recipe_file/#downstream-tests","title":"Downstream tests","text":"

Warning

Downstream tests are not yet implemented in rattler-build.

A downstream test can mention a single package that has a dependency on the package being built. The test will install the package and run the tests of the downstream package with our current package as a dependency.

Sometimes downstream packages do not resolve. In this case, the test is ignored.

tests:\n  - downstream: numpy\n
"},{"location":"reference/recipe_file/#outputs-section","title":"Outputs section","text":"

Explicitly specifies packaging steps. This section supports multiple outputs, as well as different package output types. The format is a list of mappings.

When using multiple outputs, certain top-level keys are \"forbidden\": package and requirements. Instead of package, a top-level recipe key can be defined. The recipe.name is ignored but the recipe.version key is used as default version for each output. Other \"top-level\" keys are merged into each output (e.g. the about section) to avoid repetition. Each output is a complete recipe, and can have its own build, requirements, and test sections.

recipe:\n  # the recipe name is ignored\n  name: some\n  version: 1.0\n\noutputs:\n  - package:\n      # version is taken from recipe.version (1.0)\n      name: some-subpackage\n\n  - package:\n      name: some-other-subpackage\n      version: 2.0\n

Each output acts like an independent recipe and can have its own script, build_number, and so on.

outputs:\n  - package:\n      name: subpackage-name\n    build:\n      script: install-subpackage.sh\n

Each output is built independently. You should take care not to package the same files twice.

"},{"location":"reference/recipe_file/#subpackage-requirements","title":"Subpackage requirements","text":"

Like a top-level recipe, a subpackage may have zero or more dependencies listed as build, host or run requirements.

The dependencies listed as subpackage build requirements are available only during the packaging phase of that subpackage.

outputs:\n  - package:\n      name: subpackage-name\n    requirements:\n      build:\n        - some-dep\n      run:\n        - some-dep\n

You can also use the pin_subpackage function to pin another output from the same recipe.

outputs:\n  - package:\n      name: libtest\n  - package:\n      name: test\n    requirements:\n      build:\n        - ${{ pin_subpackage('libtest', max_pin='x.x') }}\n

The outputs are topologically sorted by the dependency graph, which takes the pin_subpackage invocations into account. When using pin_subpackage(name, exact=True), a special behavior is used where the name package is injected as a \"variant\" and the variant matrix is expanded appropriately. For example, consider the following situation, with a variant_config.yaml file that contains openssl: [1, 3]:

outputs:\n  - package:\n      name: libtest\n    requirements:\n      host:\n        - openssl\n  - package:\n      name: test\n    requirements:\n      build:\n        - ${{ pin_subpackage('libtest', exact=True) }}\n

Due to the variant config file, this will build two versions of libtest. We will also build two versions of test, one that depends on libtest (openssl 1) and one that depends on libtest (openssl 3).

"},{"location":"reference/recipe_file/#about-section","title":"About section","text":"

Specifies identifying information about the package. The information is displayed on the package server.

about:\n  homepage: https://example.com/bsdiff4\n  license: BSD-3-Clause # (1)!\n  license_file: LICENSE\n  summary: binary diff and patch using the BSDIFF4-format\n  description: |\n    Long description of bsdiff4 ...\n  repository: https://github.com/ilanschnell/bsdiff4\n  documentation: https://docs.com\n
  1. Only SPDX specifiers are allowed; more info here: SPDX. If you want another license type, LicenseRef-<YOUR-LICENSE> can be used, e.g. license: LicenseRef-Proprietary
"},{"location":"reference/recipe_file/#license-file","title":"License file","text":"

Adds a file containing the software license to the package metadata. Many licenses require the license statement to be distributed with the package. The filename is relative to the source or recipe directory. The value can be a single filename or a YAML list for multiple license files. Values can also point to directories with license information. Directory entries must end with a / suffix (this is to lessen unintentional inclusion of non-license files; all the directory's contents will be unconditionally and recursively added).

about:\n  license_file:\n    - LICENSE\n    - vendor-licenses/\n
"},{"location":"reference/recipe_file/#extra-section","title":"Extra section","text":"

A schema-free area for storing non-conda-specific metadata in standard YAML form.

Example: To store recipe maintainers information
extra:\n  maintainers:\n   - name of maintainer\n
"},{"location":"reference/recipe_file/#templating-with-jinja","title":"Templating with Jinja","text":"

rattler-build supports limited Jinja templating in the recipe.yaml file.

You can set up Jinja variables in the context section:

context:\n  name: \"test\"\n  version: \"5.1.2\"\n  # later keys can reference previous keys\n  # and use jinja functions to compute new values\n  major_version: ${{ version.split('.')[0] }}\n

Later in your recipe.yaml you can use these values in string interpolation with Jinja:

source:\n  url: https://github.com/mamba-org/${{ name }}/v${{ version }}.tar.gz\n

Jinja has built-in support for some common string manipulations.

In rattler-build, complex Jinja is completely disallowed as we try to produce YAML that is valid at all times. So you should not use any {% if ... %} or similar Jinja constructs that produce invalid YAML. Furthermore, instead of plain double curly brackets Jinja statements need to be prefixed by $, e.g. ${{ ... }}:

package:\n  name: {{ name }}   # WRONG: invalid yaml\n  name: ${{ name }} # correct\n

For more information, see the Jinja template documentation and the list of available environment variables env-vars.

Jinja templates are evaluated during the build process.

"},{"location":"reference/recipe_file/#additional-jinja2-functionality-in-rattler-build","title":"Additional Jinja2 functionality in rattler-build","text":"

Besides the default Jinja2 functionality, additional Jinja functions are available during the rattler-build process: pin_compatible, pin_subpackage, and compiler.

The compiler function takes c, cxx, fortran and other values as argument and automatically selects the right (cross-)compiler for the target platform.

build:\n  - ${{ compiler('c') }}\n

The pin_subpackage function pins another package produced by the recipe with the supplied parameters.

Similarly, the pin_compatible function will pin a package according to the specified rules.

"},{"location":"reference/recipe_file/#pin-expressions","title":"Pin expressions","text":"

rattler-build knows pin expressions. A pin expression can have a min_pin, max_pin and exact value. A max_pin and min_pin are specified with a string containing only x and ., e.g. max_pin=\"x.x.x\" would signify to pin the given package to <1.2.3 (if the package version is 1.2.2, for example).

A pin with min_pin=\"x.x\",max_pin=\"x.x\" for a package of version 1.2.2 would evaluate to >=1.2.2,<1.2.3.

If exact=true, then the hash is included, and the package is pinned exactly, e.g. ==1.2.2 h1234. This is a unique package variant that cannot exist more than once, and thus is \"exactly\" pinned.

"},{"location":"reference/recipe_file/#pin-subpackage","title":"Pin subpackage","text":"

Pin subpackage refers to another package from the same recipe file. It is commonly used in the build/run_exports section to export a run export from the package, or with multiple outputs to refer to a previous build.

It looks something like:

package:\n  name: mypkg\n  version: \"1.2.3\"\n\nrequirements:\n  run_exports:\n    # this will evaluate to `mypkg <1.3`\n    - ${{ pin_subpackage(name, max_pin='x.x') }}\n
"},{"location":"reference/recipe_file/#pin-compatible","title":"Pin compatible","text":"

Pin compatible lets you pin a package based on the version retrieved from the variant file (if the pinning from the variant file needs customization).

For example, if the variant specifies a pin for numpy: 1.11, one can use pin_compatible to relax it:

requirements:\n  host:\n    # this will select numpy 1.11\n    - numpy\n  run:\n    # this will export `numpy >=1.11,<2`, instead of the stricter `1.11` pin\n    - ${{ pin_compatible('numpy', min_pin='x.x', max_pin='x') }}\n
"},{"location":"reference/recipe_file/#the-env-jinja-functions","title":"The env Jinja functions","text":"

You can access the current environment variables using the env object in Jinja.

There are three functions:

This can be used for some light templating, for example:

build:\n  string: ${{ env.get(\"GIT_BUILD_STRING\") }}_${{ PKG_HASH }}\n
"},{"location":"reference/recipe_file/#cmp-function","title":"cmp function","text":"

This function matches the first argument (the package's MatchSpec) against the second argument (the version spec) and returns the resulting boolean.

cmp(python, '>=3.4')\n

Example: cmp usage example

"},{"location":"reference/recipe_file/#cdt-function","title":"cdt function","text":"

This function helps add Core Dependency Tree packages as dependencies by converting packages as required according to hard-coded logic.

# on x86_64 system\ncdt('package-name') # outputs: package-name-cos6-x86_64\n# on aarch64 system\ncdt('package-name') # outputs: package-name-cos6-aarch64\n

Example: cdt usage example

"},{"location":"reference/recipe_file/#preprocessing-selectors","title":"Preprocessing selectors","text":"

You can add selectors to any item, and the selector is evaluated in a preprocessing stage. If a selector evaluates to true, the item is flattened into the parent element. If a selector evaluates to false, the item is removed.

Selectors can use if ... then ... else as follows:

source:\n  - if: not win\n    then:\n      - url: http://path/to/unix/source\n    else:\n      - url: http://path/to/windows/source\n\n# or the equivalent with two if conditions:\n\nsource:\n  - if: unix\n    then:\n      - url: http://path/to/unix/source\n  - if: win\n    then:\n      - url: http://path/to/windows/source\n

A selector is a valid Python statement that is executed. You can read more about them in the \"Selectors in recipes\" chapter.

The use of the Python version selectors py27, py34, etc. (known from conda-build) is discouraged in favor of the more general comparison operators. Additional selectors in this series will not be added to rattler-build.

Because the selector is any valid Python expression, complicated logic is possible:

- if: unix and not win\n  then: ...\n- if: (win or linux) and not py27\n  then: ...\n

Lists are automatically \"merged\" upwards, so it is possible to group multiple items under a single selector:

tests:\n  - script:\n    - if: unix\n      then:\n      - test -d ${PREFIX}/include/xtensor\n      - test -f ${PREFIX}/lib/cmake/xtensor/xtensorConfigVersion.cmake\n    - if: win\n      then:\n      - if not exist %LIBRARY_PREFIX%\\include\\xtensor\\xarray.hpp (exit 1)\n      - if not exist %LIBRARY_PREFIX%\\lib\\cmake\\xtensor\\xtensorConfigVersion.cmake (exit 1)\n\n# On unix this is rendered to:\ntests:\n  - script:\n    - test -d ${PREFIX}/include/xtensor\n    - test -f ${PREFIX}/lib/cmake/xtensor/xtensorConfigVersion.cmake\n
"},{"location":"reference/recipe_file/#experimental-features","title":"Experimental features","text":"

Warning

These are experimental features of rattler-build and may change or go away completely.

"},{"location":"reference/recipe_file/#jinja-functions","title":"Jinja functions","text":""},{"location":"tutorials/cpp/","title":"Packaging a C++ package","text":"

This tutorial will guide you through making a C++ package with rattler-build.

"},{"location":"tutorials/cpp/#building-a-header-only-library","title":"Building a Header-only Library","text":"

To build a package for the header-only library xtensor, you need to manage dependencies and ensure proper installation paths.

"},{"location":"tutorials/cpp/#key-steps","title":"Key Steps","text":"
  1. Dependencies: Ensure cmake, ninja, and a compiler are available as dependencies.

  2. CMake Installation Prefix: Use the CMAKE_INSTALL_PREFIX setting to instruct CMake to install the headers in the correct location.

  3. Unix Systems: Follow the standard Unix prefix:

    $PREFIX/include\n$PREFIX/lib\n

  4. Windows Systems: Use a Unix-like prefix but nested in a Library directory:

    $PREFIX/Library/include\n$PREFIX/Library/lib\n
    Utilize the handy variables %LIBRARY_PREFIX% and %LIBRARY_BIN% to guide CMake to install the headers and libraries correctly.

This approach ensures that the headers and libraries are installed in the correct directories on both Unix and Windows systems.

"},{"location":"tutorials/cpp/#recipe","title":"Recipe","text":"recipe.yaml
context:\n  version: \"0.24.6\"\n\npackage:\n  name: xtensor\n  version: ${{ version }}\n\nsource:\n  url: https://github.com/xtensor-stack/xtensor/archive/${{ version }}.tar.gz\n  sha256: f87259b51aabafdd1183947747edfff4cff75d55375334f2e81cee6dc68ef655\n\nbuild:\n  number: 0\n  script:\n    - if: win # (1)!\n      then: |\n        cmake -GNinja \\\n            -D BUILD_TESTS=OFF -DCMAKE_INSTALL_PREFIX=%LIBRARY_PREFIX% \\\n            %SRC_DIR%\n        ninja install\n      else: |\n        cmake ${CMAKE_ARGS} -GNinja -DBUILD_TESTS=OFF \\\n              -DCMAKE_INSTALL_PREFIX=$PREFIX \\\n              $SRC_DIR\n        ninja install\n\nrequirements:\n  build:\n    - ${{ compiler('cxx') }} # (2)!\n    - cmake\n    - ninja\n  host:\n    - xtl >=0.7,<0.8\n  run:\n    - xtl >=0.7,<0.8\n  run_constraints: # (3)!\n    - xsimd >=8.0.3,<10 \n\ntests:\n  - package_contents:\n      include: # (4)!\n        - xtensor/xarray.hpp\n      files: # (5)!\n        - share/cmake/xtensor/xtensorConfig.cmake\n        - share/cmake/xtensor/xtensorConfigVersion.cmake\n\nabout:\n  homepage: https://github.com/xtensor-stack/xtensor\n  license: BSD-3-Clause\n  license_file: LICENSE\n  summary: The C++ tensor algebra library\n  description: Multi dimensional arrays with broadcasting and lazy computing\n  documentation: https://xtensor.readthedocs.io\n  repository: https://github.com/xtensor-stack/xtensor\n\nextra:\n  recipe-maintainers:\n    - some-maintainer\n
  1. The if: condition allows the user to switch the behavior of the build based on checks such as the operating system.
  2. The compiler function is used to get the C++ compiler for the build system.
  3. The run_constraints section specifies a version range of a package that this package can run \"with\", but does not itself depend on.
  4. The include section specifies the header files to be tested for existence.
  5. The files section specifies the files to be tested for existence, using a glob pattern.
"},{"location":"tutorials/cpp/#building-a-c-application","title":"Building A C++ application","text":"

In this example, we'll build poppler, a C++ application for manipulating PDF files from the command line. The final package will install several tools into the bin/ folder. We'll use external build scripts and run actual scripts in the test.

"},{"location":"tutorials/cpp/#key-steps_1","title":"Key Steps","text":"
  1. Dependencies:

    • Build Dependencies: These are necessary for the building process, including cmake, ninja, and pkg-config.
    • Host Dependencies: These are the libraries poppler links against, such as cairo, fontconfig, freetype, glib, and others.
  2. Compiler Setup: We use the compiler function to obtain the appropriate C and C++ compilers.

  3. Build Script: The build.script field points to an external script (poppler-build.sh) which contains the build commands.

  4. Testing: Simple tests are included to verify that the installed tools (pdfinfo, pdfunite, pdftocairo) work correctly by running them and expecting an exit code of 0.

"},{"location":"tutorials/cpp/#recipe_1","title":"Recipe","text":"recipe.yaml
context:\n  version: \"24.01.0\"\n\npackage:\n  name: poppler\n  version: ${{ version }}\n\nsource:\n  url: https://poppler.freedesktop.org/poppler-${{ version }}.tar.xz\n  sha256: c7def693a7a492830f49d497a80cc6b9c85cb57b15e9be2d2d615153b79cae08\n\nbuild:\n  script: poppler-build.sh\n\nrequirements:\n  build:\n    - ${{ compiler('c') }} # (1)!\n    - ${{ compiler('cxx') }}\n    - pkg-config\n    - cmake\n    - ninja\n  host:\n    - cairo # (2)!\n    - fontconfig\n    - freetype\n    - glib\n    - libboost-headers\n    - libjpeg-turbo\n    - lcms2\n    - libiconv\n    - libpng\n    - libtiff\n    - openjpeg\n    - zlib\n\ntests:\n  - script:\n      - pdfinfo -listenc  # (3)!\n      - pdfunite --help\n      - pdftocairo --help\n
  1. The compiler Jinja function selects the correct C and C++ compilers for the build system.
  2. These are all the dependencies that the library links against.
  3. The script test simply executes some of the installed tools to check that they work; the scripts run in bash or cmd.exe and can be as complex as you want.
"},{"location":"tutorials/cpp/#external-build-script","title":"External Build Script","text":"

We've defined an external build script in the recipe. The script is looked up next to the recipe under the given file name; if no name is given, the default build.sh on Unix or build.bat on Windows is searched for.

poppler-build.sh
#! /bin/bash\n\nextra_cmake_args=(\n    -GNinja\n    -DCMAKE_INSTALL_LIBDIR=lib\n    -DENABLE_UNSTABLE_API_ABI_HEADERS=ON\n    -DENABLE_GPGME=OFF\n    -DENABLE_LIBCURL=OFF\n    -DENABLE_LIBOPENJPEG=openjpeg2\n    -DENABLE_QT6=OFF\n    -DENABLE_QT5=OFF\n    -DENABLE_NSS3=OFF\n)\n\nmkdir build && cd build\n\ncmake ${CMAKE_ARGS} \"${extra_cmake_args[@]}\" \\\n    -DCMAKE_PREFIX_PATH=$PREFIX \\\n    -DCMAKE_INSTALL_PREFIX=$PREFIX \\\n    -DTIFF_INCLUDE_DIR=$PREFIX/include \\\n    $SRC_DIR\n\nninja\n\n# The `install` command will take care of copying the files to the right place\nninja install\n
"},{"location":"tutorials/cpp/#parsing-the-rattler-build-build-output","title":"Parsing the rattler-build build Output","text":"

When running the rattler-build command, you might notice some interesting information in the output. Our package will have some run dependencies, even if we didn't specify any.

These come from the run-exports of the packages listed in the host section of the recipe. This is indicated by \"RE of [host: package]\" in the output.

For example, libcurl specifies that if you depend on it in the host section, you should also depend on it during runtime with specific version ranges. This ensures proper linking to shared libraries.
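
Run-exports are declared in the recipe of the library itself. As a rough sketch (the pinned package name here is illustrative, and the exact pin is up to the library's maintainers), a library recipe declares them with the pin_subpackage function, as also shown in the numpy recipe later in this document:

requirements:\n  run_exports:\n    - ${{ pin_subpackage(\"libcurl\") }}\n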

Run dependencies:\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Name                  \u2506 Spec                                         \u2502\n\u255e\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u256a\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2561\n\u2502 libcurl               \u2506 >=8.5.0,<9.0a0 (RE of [host: libcurl])       \u2502\n\u2502 fontconfig            \u2506 >=2.14.2,<3.0a0 (RE of [host: fontconfig])   \u2502\n\u2502 fonts-conda-ecosystem \u2506 (RE of [host: fontconfig])                   \u2502\n\u2502 lcms2                 \u2506 >=2.16,<3.0a0 (RE of [host: lcms2])          \u2502\n\u2502 gettext               \u2506 >=0.21.1,<1.0a0 (RE of [host: gettext])      \u2502\n\u2502 freetype              \u2506 >=2.12.1,<3.0a0 (RE of [host: freetype])     \u2502\n\u2502 openjpeg              \u2506 >=2.5.0,<3.0a0 (RE of [host: openjpeg])      \u2502\n\u2502 libiconv              \u2506 >=1.17,<2.0a0 (RE of [host: libiconv])       \u2502\n\u2502 cairo                 \u2506 >=1.18.0,<2.0a0 (RE of [host: cairo])        \u2502\n\u2502 libpng                \u2506 >=1.6.42,<1.7.0a0 (RE of [host: libpng])     \u2502\n\u2502 libzlib               \u2506 >=1.2.13,<1.3.0a0 (RE of [host: zlib])       \u2502\n\u2502 libtiff               \u2506 >=4.6.0,<4.7.0a0 (RE of [host: libtiff])     \u2502\n\u2502 libjpeg-turbo         \u2506 >=3.0.0,<4.0a0 (RE of [host: libjpeg-turbo]) \u2502\n\u2502 libglib               \u2506 >=2.78.3,<3.0a0 (RE of [host: glib])         \u2502\n\u2502 libcxx                \u2506 >=16 (RE of [build: clangxx_osx-arm64])      \u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n

You can also see \"linking\" information in the output, for example on macOS:

[lib/libpoppler-glib.8.26.0.dylib] links against:\n \u251c\u2500 @rpath/libgio-2.0.0.dylib\n \u251c\u2500 @rpath/libgobject-2.0.0.dylib\n \u251c\u2500 /usr/lib/libSystem.B.dylib\n \u251c\u2500 @rpath/libglib-2.0.0.dylib\n \u251c\u2500 @rpath/libpoppler.133.dylib\n \u251c\u2500 @rpath/libfreetype.6.dylib\n \u251c\u2500 @rpath/libc++.1.dylib\n \u251c\u2500 @rpath/libpoppler-glib.8.dylib\n \u2514\u2500 @rpath/libcairo.2.dylib\n

rattler-build ensures that:

  1. All shared libraries linked against are present in the run dependencies. Missing libraries trigger an overlinking warning.
  2. You don't require any packages in the host environment that you are not linking against; unused host dependencies trigger an overdepending warning.
"},{"location":"tutorials/python/","title":"Writing a Python package","text":"

Writing a Python package is fairly straightforward, especially for \"Python-only\" packages. In the second example we will build a package for numpy which contains compiled code.

"},{"location":"tutorials/python/#a-python-only-package","title":"A Python-only package","text":"

The following recipe uses the noarch: python setting to build a noarch package that can be installed on any platform without modification. This is very handy for packages that are pure Python and do not contain any compiled extensions.

Additionally, noarch: python packages work with a range of Python versions (in contrast to packages with compiled extensions, which are tied to a specific Python version).

recipe.yaml
context:\n  version: \"8.1.2\"\n\npackage:\n  name: ipywidgets\n  version: ${{ version }}\n\nsource:\n  url: https://pypi.io/packages/source/i/ipywidgets/ipywidgets-${{ version }}.tar.gz\n  sha256: d0b9b41e49bae926a866e613a39b0f0097745d2b9f1f3dd406641b4a57ec42c9\n\nbuild:\n  noarch: python # (1)!\n  script: pip install . -v\n\nrequirements:\n  # note that there is no build section\n  host:\n    - pip\n    - python >=3.7\n    - setuptools\n    - wheel\n  run:\n    - comm >=0.1.3\n    - ipython >=6.1.0\n    - jupyterlab_widgets >=3.0.10,<3.1.0\n    - python >=3.7\n    - traitlets >=4.3.1\n    - widgetsnbextension >=4.0.10,<4.1.0\n\ntests:\n  - python:\n      imports:\n        - ipywidgets # (2)!\n\nabout:\n  homepage: https://github.com/ipython/ipywidgets\n  license: BSD-3-Clause\n  license_file: LICENSE\n  summary: Jupyter Interactive Widgets\n  description: |\n    ipywidgets are interactive HTML widgets for Jupyter notebooks and the IPython kernel.\n  documentation: https://ipywidgets.readthedocs.io/en/latest/\n
  1. The noarch: python line tells rattler-build that this package is pure Python and can be one-size-fits-all. noarch packages can be installed on any platform without modification, which is very handy.
  2. The imports section in the tests is used to check that the package is installed correctly and can be imported.
"},{"location":"tutorials/python/#running-the-recipe","title":"Running the recipe","text":"

To build this recipe, simply run:

rattler-build build --recipe ./ipywidgets\n
"},{"location":"tutorials/python/#a-python-package-with-compiled-extensions","title":"A Python package with compiled extensions","text":"

We will build a package for numpy \u2013 which contains compiled code. Since compiled code is Python version-specific, we need to specify the Python version explicitly. The best way to do this is with a \"variant_config.yaml\" file:

variant_config.yaml
python:\n  - 3.11\n  - 3.12\n

This will replace any unpinned python dependency found in the recipe with the versions specified in the variant_config.yaml file, building one variant per version.

recipe.yaml
context:\n  version: 1.26.4\n\npackage:\n  name: numpy\n  version: ${{ version }}\n\nsource:\n  - url: https://github.com/numpy/numpy/releases/download/v${{ version }}/numpy-${{ version }}.tar.gz\n    sha256: 2a02aba9ed12e4ac4eb3ea9421c420301a0c6460d9830d74a9df87efa4912010\n\nbuild:\n  python:\n    entry_points:\n      - f2py = numpy.f2py.f2py2e:main  # [win]\n\nrequirements:\n  build:\n    - ${{ compiler('c') }}\n    - ${{ compiler('cxx') }}\n  host:\n    # note: variant is injected here!\n    - python\n    - pip\n    - meson-python\n    - ninja\n    - pkg-config\n    - python-build\n    - cython\n    - libblas\n    - libcblas\n    - liblapack\n  run:\n    - python\n  run_exports:\n    - ${{ pin_subpackage(\"numpy\") }}\n\ntests:\n  - python:\n      imports:\n        - numpy\n        - numpy.array_api\n        - numpy.array_api.linalg\n        - numpy.ctypeslib\n\n  - script:\n    - f2py -h\n\nabout:\n  homepage: http://numpy.org/\n  license: BSD-3-Clause\n  license_file: LICENSE.txt\n  summary: The fundamental package for scientific computing with Python.\n  documentation: https://numpy.org/doc/stable/\n  repository: https://github.com/numpy/numpy\n

The build script for Unix:

build.sh
mkdir builddir\n\n$PYTHON -m build -w -n -x \\\n    -Cbuilddir=builddir \\\n    -Csetup-args=-Dblas=blas \\\n    -Csetup-args=-Dlapack=lapack\n\n$PYTHON -m pip install dist/numpy*.whl\n

The build script for Windows:

build.bat
mkdir builddir\n\n%PYTHON% -m build -w -n -x ^\n    -Cbuilddir=builddir ^\n    -Csetup-args=-Dblas=blas ^\n    -Csetup-args=-Dlapack=lapack\nif %ERRORLEVEL% neq 0 exit 1\n\n:: `pip install dist\\numpy*.whl` does not work on windows,\n:: so use a loop; there's only one wheel in dist/ anyway\nfor /f %%f in ('dir /b /S .\\dist') do (\n    pip install %%f\n    if %ERRORLEVEL% neq 0 exit 1\n)\n
"},{"location":"tutorials/python/#running-the-recipe_1","title":"Running the recipe","text":"

Running this recipe with the variant config file will build a total of 2 numpy packages:

rattler-build build --recipe ./numpy \\\n  --variant-config ./numpy/variant_config.yaml\n

At the beginning of the build process, rattler-build will print the following message to show you the variants it found:

Found variants:\n\nnumpy-1.26.4-py311h5f8ada8_0\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Variant         \u2506 Version   \u2502\n\u255e\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u256a\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2561\n\u2502 python          \u2506 3.11      \u2502\n\u2502 target_platform \u2506 osx-arm64 \u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n\nnumpy-1.26.4-py312h440f24a_0\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Variant         \u2506 Version   \u2502\n\u255e\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u256a\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2561\n\u2502 python          \u2506 3.12      \u2502\n\u2502 target_platform \u2506 osx-arm64 \u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n
"},{"location":"tutorials/rust/","title":"Building a Rust package","text":"

We're using rattler-build to build a Rust package for the cargo-edit utility. This utility manages Cargo dependencies from the command line.

To configure the Rust compiler, we add a variant_config.yaml file to the package:

variant_config.yaml
rust_compiler: rust\n

This will tell rattler-build what to insert for the ${{ compiler('rust') }} Jinja function.

Note

The ${{ compiler(...) }} functions are very useful in the context of cross-compilation. When the function is evaluated, it will insert the correct compiler (as selected with the variant config) as well as the target_platform. The \"rendered\" compiler will look like rust_linux-64 when you are targeting the linux-64 platform.

You can read more about this in the cross-compilation section.

recipe.yaml
context:\n  version: \"0.11.9\"\n\npackage:\n  name: cargo-edit\n  version: ${{ version }}\n\nsource:\n  url: https://github.com/killercup/cargo-edit/archive/refs/tags/v${{ version }}.tar.gz\n  sha256: 46670295e2323fc2f826750cdcfb2692fbdbea87122fe530a07c50c8dba1d3d7\n\nbuild:\n  script:\n    - cargo-bundle-licenses --format yaml --output ${SRC_DIR}/THIRDPARTY.yml  # (1)!\n    - $BUILD_PREFIX/bin/cargo install --locked --bins --root ${PREFIX} --path .\n\nrequirements:\n  build:\n    - ${{ compiler('rust') }}\n    - cargo-bundle-licenses\n\ntests:\n  - script:\n      - cargo-upgrade --help # (2)!\n\nabout:\n  homepage: https://github.com/killercup/cargo-edit\n  license: MIT\n  license_file:\n    - LICENSE\n    - THIRDPARTY.yml\n  description: \"A utility for managing cargo dependencies from the command line.\"\n  summary: \"A utility for managing cargo dependencies from the command line.\"\n
  1. The cargo-bundle-licenses utility is used to bundle all the licenses of the dependencies into a THIRDPARTY.yml file. This file is then included in the package. You should always include this file in your package when you are redistributing it.
  2. The test scripts run in bash or cmd.exe to check that the package built well; an exit code of 0 is required for the test to pass.

To build this recipe, simply run:

rattler-build build \\\n    --recipe ./cargo-edit/recipe.yaml \\\n    --variant-config ./cargo-edit/variant_config.yaml\n
"}]} \ No newline at end of file +{"config":{"lang":["en"],"separator":"[\\s\\-]+","pipeline":["stopWordFilter"]},"docs":[{"location":"","title":"Home","text":""},{"location":"#rattler-build-a-fast-conda-package-builder","title":"rattler-build: A Fast Conda Package Builder","text":"

The rattler-build tooling and library creates cross-platform relocatable binaries / packages from a simple recipe format. The recipe format is heavily inspired by conda-build and boa, and the output of a regular rattler-build run is a package that can be installed using mamba, rattler or conda.

rattler-build does not have any dependencies on conda-build or Python and works as a standalone binary.

"},{"location":"#installation","title":"Installation","text":"

You can grab a prerelease version of rattler-build from the Github Releases.

It is (of course) also available from conda-forge:

pixi global install rattler-build\n# or with micromamba\nmicromamba install rattler-build -c conda-forge\n

Alternatively, you can install rattler-build via Homebrew:

brew install rattler-build\n

rattler-build is also available on Arch Linux in the extra repository:

pacman -S rattler-build\n
"},{"location":"#dependencies","title":"Dependencies","text":"

Currently rattler-build needs some dependencies on the host system which are executed as subprocess. We plan to reduce the number of external dependencies over time by writing what we need in Rust to make rattler-build fully self-contained.

On Windows, to obtain these dependencies from conda-forge, one can install m2-patch, m2-bzip2, m2-gzip, m2-tar.

"},{"location":"#documentation","title":"Documentation","text":"

We have extensive documentation for rattler-build. You can find the book here.

"},{"location":"#github-action","title":"GitHub Action","text":"

There is a GitHub Action for rattler-build. It can be used to install rattler-build in CI/CD workflows and run a build command. Please check out the GitHub Action documentation for more information.

"},{"location":"#usage","title":"Usage","text":"

rattler-build comes with two commands: build and test.

The build command takes a --recipe recipe.yaml as input and produces a package as output. The test subcommand can be used to test existing packages (tests are shipped with the package).

"},{"location":"#the-recipe-format","title":"The recipe format","text":"

Note You can find all examples below in the examples folder in the codebase and run them with rattler-build.

A simple example recipe for the xtensor header-only C++ library:

# yaml-language-server: $schema=https://raw.githubusercontent.com/prefix-dev/recipe-format/main/schema.json\n\ncontext:\n  name: xtensor\n  version: 0.24.6\n\npackage:\n  name: ${{ name|lower }}\n  version: ${{ version }}\n\nsource:\n  url: https://github.com/xtensor-stack/xtensor/archive/${{ version }}.tar.gz\n  sha256: f87259b51aabafdd1183947747edfff4cff75d55375334f2e81cee6dc68ef655\n\nbuild:\n  number: 0\n  script:\n    - if: win\n      then: |\n        cmake -G \"NMake Makefiles\" -D BUILD_TESTS=OFF -D CMAKE_INSTALL_PREFIX=%LIBRARY_PREFIX% %SRC_DIR%\n        nmake\n        nmake install\n      else: |\n        cmake ${CMAKE_ARGS} -DBUILD_TESTS=OFF -DCMAKE_INSTALL_PREFIX=$PREFIX $SRC_DIR -DCMAKE_INSTALL_LIBDIR=lib\n        make install\n\nrequirements:\n  build:\n    - ${{ compiler('cxx') }}\n    - cmake\n    - if: unix\n      then: make\n  host:\n    - xtl >=0.7,<0.8\n  run:\n    - xtl >=0.7,<0.8\n  run_constraints:\n    - xsimd >=8.0.3,<10\n\ntests:\n  - script:\n    - if: unix or emscripten\n      then:\n        - test -d ${PREFIX}/include/xtensor\n        - test -f ${PREFIX}/include/xtensor/xarray.hpp\n        - test -f ${PREFIX}/share/cmake/xtensor/xtensorConfig.cmake\n        - test -f ${PREFIX}/share/cmake/xtensor/xtensorConfigVersion.cmake\n    - if: win\n      then:\n        - if not exist %LIBRARY_PREFIX%\\include\\xtensor\\xarray.hpp (exit 1)\n        - if not exist %LIBRARY_PREFIX%\\share\\cmake\\xtensor\\xtensorConfig.cmake (exit 1)\n        - if not exist %LIBRARY_PREFIX%\\share\\cmake\\xtensor\\xtensorConfigVersion.cmake (exit 1)\n\nabout:\n  homepage: https://github.com/xtensor-stack/xtensor\n  license: BSD-3-Clause\n  license_file: LICENSE\n  summary: The C++ tensor algebra library\n  description: Multi dimensional arrays with broadcasting and lazy computing\n  documentation: https://xtensor.readthedocs.io\n  repository: https://github.com/xtensor-stack/xtensor\n\nextra:\n  recipe-maintainers:\n    - some-maintainer\n

A recipe for the rich Python package (using noarch):

context:\n  version: \"13.4.2\"\n\npackage:\n  name: \"rich\"\n  version: ${{ version }}\n\nsource:\n  - url: https://pypi.io/packages/source/r/rich/rich-${{ version }}.tar.gz\n    sha256: d653d6bccede5844304c605d5aac802c7cf9621efd700b46c7ec2b51ea914898\n\nbuild:\n  # Thanks to `noarch: python` this package works on all platforms\n  noarch: python\n  script:\n    - python -m pip install . -vv --no-deps --no-build-isolation\n\nrequirements:\n  host:\n    - pip\n    - poetry-core >=1.0.0\n    - python 3.10\n  run:\n    # sync with normalized deps from poetry-generated setup.py\n    - markdown-it-py >=2.2.0\n    - pygments >=2.13.0,<3.0.0\n    - python 3.10\n    - typing_extensions >=4.0.0,<5.0.0\n\ntests:\n  - python:\n      imports:\n        - rich\n      pip_check: true\n\nabout:\n  homepage: https://github.com/Textualize/rich\n  license: MIT\n  license_file: LICENSE\n  summary: Render rich text, tables, progress bars, syntax highlighting, markdown and more to the terminal\n  description: |\n    Rich is a Python library for rich text and beautiful formatting in the terminal.\n\n    The Rich API makes it easy to add color and style to terminal output. Rich\n    can also render pretty tables, progress bars, markdown, syntax highlighted\n    source code, tracebacks, and more \u2014 out of the box.\n  documentation: https://rich.readthedocs.io\n  repository: https://github.com/Textualize/rich\n

A recipe for the curl library:

context:\n  version: \"8.0.1\"\n\npackage:\n  name: curl\n  version: ${{ version }}\n\nsource:\n  url: http://curl.haxx.se/download/curl-${{ version }}.tar.bz2\n  sha256: 9b6b1e96b748d04b968786b6bdf407aa5c75ab53a3d37c1c8c81cdb736555ccf\n\nbuild:\n  number: 0\n\nrequirements:\n  build:\n    - ${{ compiler('c') }}\n    - if: win\n      then:\n        - cmake\n        - ninja\n    - if: unix\n      then:\n        - make\n        - perl\n        - pkg-config\n        - libtool\n  host:\n    - if: linux\n      then:\n        - openssl\n\nabout:\n  homepage: http://curl.haxx.se/\n  license: MIT/X derivate (http://curl.haxx.se/docs/copyright.html)\n  license_file: COPYING\n  summary: tool and library for transferring data with URL syntax\n  description: |\n    Curl is an open source command line tool and library for transferring data\n    with URL syntax. It is used in command lines or scripts to transfer data.\n  documentation: https://curl.haxx.se/docs/\n  repository: https://github.com/curl/curl\n

For the curl library recipe, two additional script files (build.sh and build.bat) are needed.

build.sh

#!/bin/bash\n\n# Get an updated config.sub and config.guess\ncp $BUILD_PREFIX/share/libtool/build-aux/config.* .\n\nif [[ $target_platform =~ linux.* ]]; then\n    USESSL=\"--with-openssl=${PREFIX}\"\nelse\n    USESSL=\"--with-secure-transport\"\nfi;\n\n./configure \\\n    --prefix=${PREFIX} \\\n    --host=${HOST} \\\n    ${USESSL} \\\n    --with-ca-bundle=${PREFIX}/ssl/cacert.pem \\\n    --disable-static --enable-shared\n\nmake -j${CPU_COUNT} ${VERBOSE_AT}\nmake install\n\n# Includes man pages and other miscellaneous.\nrm -rf \"${PREFIX}/share\"\n

build.bat

mkdir build\n\ncmake -GNinja ^\n      -DCMAKE_BUILD_TYPE=Release ^\n      -DBUILD_SHARED_LIBS=ON ^\n      -DCMAKE_INSTALL_PREFIX=%LIBRARY_PREFIX% ^\n      -DCMAKE_PREFIX_PATH=%LIBRARY_PREFIX% ^\n      -DCURL_USE_SCHANNEL=ON ^\n      -DCURL_USE_LIBSSH2=OFF ^\n      -DUSE_ZLIB=ON ^\n      -DENABLE_UNICODE=ON ^\n      %SRC_DIR%\n\nIF %ERRORLEVEL% NEQ 0 exit 1\n\nninja install --verbose\n
"},{"location":"authentication_and_upload/","title":"Server authentication","text":""},{"location":"authentication_and_upload/#authenticating-with-a-server","title":"Authenticating with a server","text":"

You may want to use private channels for which you need to be authenticated. To do this ephemerally you can use the RATTLER_AUTH_FILE environment variable to point to a JSON file with the following structure:

{\n    \"*.prefix.dev\": {\n        \"BearerToken\": \"your_token\"\n    },\n    \"otherhost.com\": {\n        \"BasicHttp\": {\n            \"username\": \"your_username\",\n            \"password\": \"your_password\"\n        }\n    },\n    \"anaconda.org\": {\n        \"CondaToken\": \"your_token\"\n    }\n}\n

The keys are the host names. You can use wildcard specifiers here (e.g. *.prefix.dev to match all subdomains of prefix.dev, such as repo.prefix.dev). This will allow you to also obtain packages from any private channels that you have access to.

The following known authentication methods are supported: BearerToken, BasicHttp, and CondaToken (as used in the example above).
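
A minimal usage sketch: point the environment variable at your auth file and build against a private channel (the file path and channel URL below are placeholders):

export RATTLER_AUTH_FILE=$HOME/.rattler-auth.json\nrattler-build build --recipe ./recipe.yaml -c https://repo.prefix.dev/my-private-channel\n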

"},{"location":"authentication_and_upload/#uploading-packages","title":"Uploading packages","text":"

If you want to upload packages, then rattler-build comes with a built-in upload command. There are 4 options: prefix.dev, Quetz, Artifactory, and anaconda.org.

The command is:

rattler-build upload <server> <package_files>\n

Note: you can also use the RATTLER_AUTH_FILE environment variable to authenticate with the server.

"},{"location":"authentication_and_upload/#prefixdev","title":"prefix.dev","text":"

To upload to prefix.dev, you need to have an account and a token. You can create a token in the settings of your account. The token is used to authenticate the upload.

export PREFIX_API_KEY=<your_token>\nrattler-build upload prefix -c <channel> <package_files>\n

You can also use the --api-key=$PREFIX_API_KEY option to pass the token directly to the command. Note that you need to have created the channel on the prefix.dev website before you can upload to it.

"},{"location":"authentication_and_upload/#quetz","title":"Quetz","text":"

You need to pass an API key to upload to a channel on your own Quetz server; the key is used to authenticate the upload.

export QUETZ_API_KEY=<your_token>\nrattler-build upload quetz -u <url> -c <channel> <package_files>\n
"},{"location":"authentication_and_upload/#artifactory","title":"Artifactory","text":"

To upload to an Artifactory server, you need to pass a username and password. The username and password are used to authenticate the upload.

export ARTIFACTORY_USERNAME=<your_username>\nexport ARTIFACTORY_PASSWORD=<your_password>\nrattler-build upload artifactory -u <url> -c <channel> <package_files>\n
"},{"location":"authentication_and_upload/#anacondaorg","title":"anaconda.org","text":"

To upload to anaconda.org, you need to specify the owner and API key. The API key is used to authenticate the upload.

The owner is the owner of the distribution, for example, your user name or organization.

One can also specify a label such as dev for release candidates using the -c flag. The default value is main.

You can also add the --force argument to forcibly upload a new package (and overwrite any existing ones).

export ANACONDA_API_KEY=<your_token>\nrattler-build upload anaconda -o <your_username> -c <label> <package_files>\n
"},{"location":"automatic_linting/","title":"Enabling Automatic Linting in VSCode","text":"

Our new recipe format adheres to a strict JSON schema, which you can access here.

This schema is implemented using pydantic and can be rendered into a JSON schema file. The YAML language server extension in VSCode is capable of recognizing this schema, providing useful hints during the editing process.

To enable automatic linting with the YAML language server, you need to add the following line at the beginning of your recipe file:

# yaml-language-server: $schema=https://raw.githubusercontent.com/prefix-dev/recipe-format/main/schema.json\n

Alternatively, if you prefer not to add this line to your file, you can install the JSON Schema Store Catalog extension. This extension will also enable automatic linting for your recipe files.

"},{"location":"build_options/","title":"Advanced build options","text":"

There are some specialized build options to control various features.

These are all found under the build key in the recipe.yaml.

"},{"location":"build_options/#always-include-and-always-copy-files","title":"Always include and always copy files","text":"

There are some options that control the inclusion of files in the final package.

The always_include_files option can be used to include files even if they are already in the environment as part of some other host dependency. This is normally \"clobbering\" and should be used with caution (since packages should not have any overlapping files).

The always_copy_files option can be used to copy files instead of linking them. This is useful for files that might be modified inside the environment (e.g. configuration files). Normally, files are linked from a central cache into the environment to save space \u2013 that means that files modified in one environment will be modified in all environments. This is not always desirable, and in that case you can use the always_copy_files option.

How always_copy_files works

The always_copy_files option works by setting the no_link option in the info/paths.json to true for the files in question. This means that the files are copied instead of linked when the package is installed.

recipe.yaml
build:\n  # include files even if they are already in the environment\n  # as part of some other host dependency\n  always_include_files: list of globs\n\n  # do not soft- or hard-link these files, but always copy them\n  # (this option was called `no_link` in conda-build)\n  always_copy_files: list of globs\n
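
A filled-in example might look as follows; the glob patterns are hypothetical and should be adapted to the files your package actually ships:

build:\n  # hypothetical file that deliberately clobbers another package's copy\n  always_include_files:\n    - bin/example-tool\n  # hypothetical configuration files that users may edit in place\n  always_copy_files:\n    - etc/example/*.conf\n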

Glob patterns

Glob patterns are used throughout the build options to specify files. The patterns are matched against the relative path of the file in the build directory. Patterns can contain * to match any number of characters, ? to match a single character, and ** to match any number of directories.

For example, *.txt matches all .txt files in the top-level directory, while **/*.txt matches .txt files in any directory.

"},{"location":"build_options/#merge-build-and-host-environments","title":"Merge build and host environments","text":"

In very rare cases you might want to merge the build and host environments to obtain the \"legacy\" behavior of conda-build.

recipe.yaml
build:\n  # merge the build and host environments (used in many R packages on Windows)\n  merge_build_and_host_envs: bool (defaults to false)\n
"},{"location":"build_options/#prefix-detection-replacement-options","title":"Prefix detection / replacement options","text":"

During installation, the \"install\" prefix is injected into text and binary files. Sometimes this is not desired, and sometimes the user might want closer control over the automatic text/binary detection.

The main difference between prefix replacement for text and binary files is that for binary files, the prefix string is padded with null bytes to match the length of the original prefix. The original prefix is the very long placeholder string that you might have seen in the build process.

On Windows, binary prefix replacement is never performed.

recipe.yaml
package:\n  name: mypackage\n  version: 1.0\n\nbuild:\n  # settings concerning the prefix detection in files\n  prefix_detection:\n    # force the file type of the given files to be TEXT or BINARY\n    # for prefix replacement\n    force_file_type:\n      # force TEXT file type (list of globs)\n      text: list of globs\n      # force binary file type (list of globs)\n      binary: list of globs\n\n    # ignore all or specific files for prefix replacement\n    ignore: bool | [path] (defaults to false)\n\n    # whether to detect binary files with prefix or not\n    # defaults to true on Unix and (always) false on Windows\n    ignore_binary_files: bool\n
"},{"location":"build_options/#variant-configuration","title":"Variant configuration","text":"

To control the variant precisely you can use the \"variant configuration\" options.

A variant package has the same version number but a different \"hash\", and potentially different dependencies or build options. Variant keys are extracted from the variant_config.yaml file; usually, any Jinja variables that are used and any dependencies without a version specifier become variant keys.

Variant keys can also be forcibly set or ignored with the use_keys and ignore_keys options.

In order to decide which of the variant packages to prefer and install by default, the down_prioritize_variant option can be used. The higher the value, the less preferred the variant is.

More about variants can be found in the variant documentation.

The following options are available in the build section to control the variant configuration:

recipe.yaml
build:\n  # settings for the variant\n  variant:\n    # Keys to forcibly use for the variant computation\n    # even if they are not in the dependencies\n    use_keys: list of strings\n\n    # Keys to forcibly ignore for the variant computation\n    # even if they are in the dependencies\n    ignore_keys: list of strings\n\n    # used to prefer this variant less\n    down_prioritize_variant: integer (defaults to 0, higher is less preferred)\n
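
For example, a (hypothetical) recipe that wants to exclude openssl from the variant computation and make this variant less preferred could use:

build:\n  variant:\n    ignore_keys:\n      - openssl\n    down_prioritize_variant: 1\n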
"},{"location":"build_options/#dynamic-linking-configuration","title":"Dynamic linking configuration","text":"

After the package is built, rattler-build performs some \"post-processing\" on the binaries and libraries.

This entails making the shared libraries relocatable and checking that all linked libraries are present in the run requirements. The following settings control this behavior.

With the rpath option you can forcibly set the rpath of the shared libraries. The path is relative to the install prefix. Any rpath setting is ignored on Windows.

The rpath_allowlist option can be used to allow the rpath to point to locations outside of the environment. This is useful if you want to link against libraries that are not part of the conda environment (e.g. proprietary software).

If you want to stop rattler-build from relocating the binaries, you can set binary_relocation to false. If you want to only relocate some binaries, you can select the relevant ones with a glob pattern.

To read more about rpaths and how rattler-build creates relocatable binary packages, see the internals docs.

If you link against some libraries (possibly even outside of the prefix, in a system location), then you can use the missing_dso_allowlist to allow linking against these and suppress any warnings. This list is pre-populated with a list of known system libraries on the different operating systems.

As part of the post-processing, rattler-build checks for overlinking and overdepending. \"Overlinking\" is when a binary links against a library that is not specified in the run requirements. This is usually a mistake because the library would not be present in the environment when the package is installed.

Conversely, \"overdepending\" is when a library is part of the run requirements, but is not actually used by any of the binaries/libraries in the package.

recipe.yaml
build:\n  # settings for shared libraries and executables\n  dynamic_linking:\n    # linux only, list of rpaths relative to the installation prefix\n    rpaths: list of paths (defaults to ['lib/'])\n\n    # Allow runpath / rpath to point to these locations\n    # outside of the environment\n    rpath_allowlist: list of globs\n\n    # whether to relocate binaries or not. If this is a list of paths, then\n    # only the listed paths are relocated\n    binary_relocation: bool (defaults to true) | list of globs\n\n    # Allow linking against libraries that are not in the run requirements\n    missing_dso_allowlist: list of globs\n\n    # what to do when detecting overdepending\n    overdepending_behavior: \"ignore\" or \"error\" # (defaults to \"error\")\n\n    # what to do when detecting overlinking\n    overlinking_behavior: \"ignore\" or \"error\" # (defaults to \"error\")\n
"},{"location":"build_script/","title":"Build scripts","text":"

The build.sh file is the build script for Linux and macOS and build.bat is the build script for Windows. These scripts contain the logic that carries out your build steps. Anything that your build script copies into the $PREFIX or %PREFIX% folder will be included in your output package.

For example, this build.sh:

mkdir -p $PREFIX/bin\ncp $RECIPE_DIR/my_script_with_recipe.sh $PREFIX/bin/super-cool-script.sh\n

There are many environment variables defined for you to use in build.sh and build.bat. Please see environment variables for more information.

build.sh and build.bat are optional. You can instead use the build/script key in your recipe.yaml, with each value being either a string command or a list of string commands. Any commands you put there must be able to run on every platform for which you build. For example, you can't use the cp command because cmd.exe won't understand it on Windows.

build.sh is run with bash and build.bat is run with cmd.exe.
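
As a minimal sketch of the inline alternative, the script key can hold one or more commands; this example assumes a pure Python package so that the command works on every platform:

build:\n  script:\n    - pip install . -v\n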

"},{"location":"build_script/#environment-variables","title":"Environment variables","text":""},{"location":"build_script/#environment-variables-set-during-the-build-process","title":"Environment variables set during the build process","text":"

During the build process, the following environment variables are set, on Windows with build.bat and on macOS and Linux with build.sh. By default, these are the only variables available to your build script. Unless otherwise noted, no variables are inherited from the shell environment in which you invoke rattler-build. To override this behavior, see the variables forwarded from the environment below.

ARCH: Either 32 or 64, to specify whether the build is 32-bit or 64-bit. The value depends on the ARCH environment variable and defaults to the architecture the interpreter running conda was compiled with.
CMAKE_GENERATOR: The CMake generator string for the current build environment. On Linux systems, this is always Unix Makefiles. On Windows, it is generated according to the Visual Studio version activated at build time, for example, Visual Studio 9 2008 Win64.
CONDA_BUILD=1: Always set to indicate that the conda-build process is running.
CPU_COUNT: Represents the number of CPUs on the system.
SHLIB_EXT: Denotes the shared library extension specific to the operating system (e.g. .so for Linux, .dylib for macOS, and .dll for Windows).
HTTP_PROXY: Inherited from the user's shell environment, specifying the HTTP proxy settings.
HTTPS_PROXY: Similar to HTTP_PROXY, inherited from the user's shell environment, specifying the HTTPS proxy settings.
LANG: Inherited from the user's shell environment, defining the system language and locale settings.
MAKEFLAGS: Inherited from the user's shell environment. Can be used to set additional arguments for the make command, such as -j2 to utilize 2 CPU cores for building the recipe.
PY_VER: Specifies the Python version against which the build is occurring. Can be modified with a variant_config.yaml file.
PATH: Inherited from the user's shell environment and augmented with the activated host and build prefixes.
PREFIX: The build prefix to which the build script should install the software.
PKG_BUILDNUM: Indicates the build number of the package currently being built.
PKG_NAME: The name of the package that is being built.
PKG_VERSION: The version of the package currently under construction.
PKG_BUILD_STRING: The complete build string of the package being built, including the hash (e.g. py311h21422ab_0).
PKG_HASH: Represents the hash of the package being built, excluding the leading 'h' (e.g. 21422ab). This is applicable from conda-build 3.0 onwards.
PYTHON: The path to the Python executable in the host prefix. Python is installed in the host prefix only when it is listed as a host requirement.
R: The path to the R executable in the build prefix. R is installed in the build prefix only when it is listed as a build requirement.
RECIPE_DIR: The directory where the recipe is located.
SP_DIR: The location of Python's site-packages, where Python libraries are installed.
SRC_DIR: The path to where the source code is unpacked or cloned. If the source file is not a recognized archive format, this directory contains a copy of the source file.
STDLIB_DIR: The location of Python's standard library.
build_platform: Represents the native subdirectory of the conda executable, indicating the platform for which the build is occurring.

Removed from conda-build are: NPY_VER and PY3K.

"},{"location":"build_script/#windows","title":"Windows","text":"

Unix-style packages on Windows are built in a special Library directory under the build prefix. The environment variables listed in the following table are defined only on Windows.

LIBRARY_BIN: <build prefix>\Library\bin
LIBRARY_INC: <build prefix>\Library\include
LIBRARY_LIB: <build prefix>\Library\lib
LIBRARY_PREFIX: <build prefix>\Library
SCRIPTS: <build prefix>\Scripts

Not yet supported in rattler-build:

Additionally, the following variables are forwarded from the environment:

"},{"location":"build_script/#unix","title":"Unix","text":"

The environment variables listed in the following table are defined only on macOS and Linux.

HOME: Standard $HOME environment variable.
PKG_CONFIG_PATH: Path to the pkgconfig directory; defaults to $PREFIX/lib/pkgconfig.
SSL_CERT_FILE: Path to the SSL_CERT_FILE file.
CFLAGS: Empty; can be forwarded from the environment to set additional arguments for the C compiler.
CXXFLAGS: Same as CFLAGS, for the C++ compiler.
LDFLAGS: Empty; additional flags to be passed to the linker when linking object files into an executable or shared object.
"},{"location":"build_script/#macos","title":"macOS","text":"

The environment variables listed in the following table are defined only on macOS.

MACOSX_DEPLOYMENT_TARGET: Same as the Anaconda Python macOS deployment target. Currently 10.9 for Intel 32- and 64-bit macOS, and 11.0 for arm64.
OSX_ARCH: i386, x86_64, or arm64, depending on the target platform.
"},{"location":"build_script/#linux","title":"Linux","text":"

The environment variables listed in the following table are defined only on Linux.

LD_RUN_PATH: Defaults to <build prefix>/lib.
QEMU_LD_PREFIX: The prefix used by QEMU's user mode emulation for library paths.
QEMU_UNAME: Set the qemu uname release string to 'uname'.
DEJAGNU: The path to the dejagnu testing framework used by the GCC test suite.
DISPLAY: The X11 display to use for graphical applications.
BUILD: Target triple ({build_arch}-conda_{build_distro}-linux-gnu) where build_distro is one of cos6 or cos7, for CentOS 6 or 7.
"},{"location":"cli_usage/","title":"CLI usage","text":""},{"location":"cli_usage/#shell-completions","title":"Shell Completions","text":"

We support shell completions through clap_complete. You can generate them for your shell using the completion command.

You can add the completions to your shell by adding the following to your shell's configuration file:

# For bash (add this to ~/.bashrc)\neval \"$(rattler-build completion --shell=bash)\"\n# For zsh (add this to ~/.zshrc)\neval \"$(rattler-build completion --shell=zsh)\"\n# For fish (add this to ~/.config/fish/config.fish)\nrattler-build completion --shell=fish | source\n

Ensure that the directory containing rattler-build is on your PATH, after which point you can use TAB or any configured completion key of choice.

$ rattler-build <TAB>\nbuild    -- Build a package\nhelp     -- Print this message or the help of the given subcommand(s)\nrebuild  -- Rebuild a package\ntest     -- Test a package\n

Package format

You can specify the package format (either .tar.bz2 or .conda) by using the --package-format flag. You can also set the compression level with :<level> after the package format. The <level> can be max, min, default, or a number corresponding to the compression level. .tar.bz2 supports compression levels between 1 and 9, while .conda supports compression levels between -7 and 22. For .conda, you can also set the --compression-threads flag to specify the number of threads to use for compression.

# default\nrattler-build build --package-format tarbz2 -r recipe/recipe.yaml\n# maximum compression with 10 threads\nrattler-build build --package-format conda:max --compression-threads 10 -r recipe/recipe.yaml\n
"},{"location":"cli_usage/#logs","title":"Logs","text":"

rattler-build knows three different log styles: fancy, plain, and json. You can configure them with the --log-style=<style> flag:

# default\nrattler-build build --log-style fancy -r recipe/recipe.yaml\n
"},{"location":"cli_usage/#github-integration","title":"GitHub integration","text":"

rattler-build also has a GitHub integration. With this integration, warnings are automatically emitted in the GitHub Actions log and a summary is generated and posted to the GitHub Actions summary page.

To make use of this integration, we recommend using our custom GitHub action: rattler-build-action. To manually enable it, you can set the environment variable RATTLER_BUILD_ENABLE_GITHUB_INTEGRATION=true.
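
To enable the integration manually, for example in a workflow step:

export RATTLER_BUILD_ENABLE_GITHUB_INTEGRATION=true\nrattler-build build --recipe ./recipe.yaml\n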

"},{"location":"compilers/","title":"Compilers and cross-compilation","text":"

To use a compiler in your project, it's best to use the ${{ compiler('lang') }} template function. The compiler function works by taking a language, determining the configured compiler for that language, and adding some information about the target platform to the selected compiler. To configure a compiler for a specific language, the variant_config.yaml file can be used.

For example, in a recipe that uses a C-compiler, you can use the following code:

requirements:\n  build:\n    - ${{ compiler('c') }}\n

To set the compiler that you want to use, create a variant config that looks like the following:

c_compiler:\n  - gcc\n\n# optionally you can specify a version\nc_compiler_version:\n  - 9.3.0\n

When the template function is evaluated, it will look something like: gcc_linux-64 9.3.0. You can also define your own compilers: for Rust, for example, you can use ${{ compiler('rust') }} together with rust_compiler and rust_compiler_version in your variant config.

"},{"location":"compilers/#cross-compilation","title":"Cross-compilation","text":"

Cross-compilation is supported by rattler-build and the compiler template function is part of what makes it possible. When you want to cross-compile from linux-64 to linux-aarch64 (i.e. intel to ARM), you can pass --target-platform linux-aarch64 to the rattler-build command. This will cause the compiler template function to select a compiler that is configured for linux-aarch64. The above example would resolve to gcc_linux-aarch64 9.3.0. Provided that the package is available for linux-64 (your build platform), the compilation should succeed.

The distinction between the build and host sections begins to make sense when thinking about cross-compilation. The build environment is resolved to packages that need to run at compilation time. For example, cmake, gcc, and autotools are all tools that need to be executed. Therefore, the build environment resolves to packages for the linux-64 architecture (in our example). On the other hand, the host packages resolve to linux-aarch64 - those are packages that we want to link against.

# packages that need to run at build time (cmake, gcc, autotools, etc.)\n# in the platform that rattler-build is executed on (the build_platform)\nbuild:\n  - cmake\n  - ${{ compiler('c') }}\n# packages that we want to link against in the architecture we are\n# cross-compiling to the target_platform\nhost:\n  - libcurl\n  - openssl\n
"},{"location":"experimental_features/","title":"Experimental features","text":"

Warning

These are experimental features of rattler-build and may change or go away completely.

Currently only the build and rebuild commands support the following experimental features.

To enable them, use the --experimental flag with the command. Or, use the environment variable, RATTLER_BUILD_EXPERIMENTAL=1.
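
Both ways of enabling the experimental features look like this:

# via the command-line flag\nrattler-build build --experimental --recipe ./recipe.yaml\n# or via the environment variable\nRATTLER_BUILD_EXPERIMENTAL=1 rattler-build build --recipe ./recipe.yaml\n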

"},{"location":"experimental_features/#jinja-functions","title":"Jinja functions","text":""},{"location":"experimental_features/#load_from_filefile_path","title":"load_from_file(<file_path>)","text":"

The Jinja function load_from_file loads data from files. Files of type toml, json, and yaml are parsed into an object, so that values can be fetched directly from them; all other files are loaded as strings.

"},{"location":"experimental_features/#usage","title":"Usage","text":"

load_from_file is useful when there is a project description in a well-defined project file such as Cargo.toml, package.json, pyproject.toml, package.yaml, or stack.yaml. It keeps the recipe as simple as possible, since values do not have to be duplicated and kept in sync by hand; example use cases are CI/CD infrastructure or projects with a well-defined project file format.

Below is an example loading a Cargo.toml inside of the rattler-build GitHub repository:

recipe.yaml
context:\n  name: ${{ load_from_file(\"Cargo.toml\").package.name }}\n  version: ${{ load_from_file(\"Cargo.toml\").package.version }}\n  source_url: ${{ load_from_file(\"Cargo.toml\").package.homepage }}\n  rust_toolchain: ${{ load_from_file(\"rust-toolchains\") }}\n\npackage:\n  name: ${{ name }}\n  version: ${{ version }}\n\nsource:\n  git: ${{ source_url }}\n  tag: ${{ source_tag }}\n\nrequirements:\n  build:\n    - rust ==${{ rust_toolchain }}\n\nbuild:\n  script: cargo build --release -p ${{ name }}\n\ntests:\n  - script: cargo test -p ${{ name }}\n  - script: cargo test -p rust-test -- --test-threads=1\n\nabout:\n  home: ${{ source_url }}\n  repository: ${{ source_url }}\n  documentation: ${{ load_from_file(\"Cargo.toml\").package.documentation }}\n  summary: ${{ load_from_file(\"Cargo.toml\").package.description }}\n  license: ${{ load_from_file(\"Cargo.toml\").package.license }}\n
"},{"location":"experimental_features/#git-functions","title":"git functions","text":"

git functions are useful for getting the latest tag and commit hash. These can be used in the context section of the recipe, to fetch version information from a repository.

Examples
# latest tag in the repo\ngit.latest_tag(<git_repo_url>)\n\n# latest tag revision (i.e. the hash of the tag commit) in the repo\ngit.latest_tag_rev(<git_repo_url>)\n\n# latest commit revision (i.e. the hash of the head commit) in the repo\ngit.head_rev(<git_repo_url>)\n
"},{"location":"experimental_features/#usage_1","title":"Usage","text":"

These can be useful for automating minor things inside the recipe itself, such as checking whether the current version is the latest version, or whether the current hash is the latest hash.

recipe.yaml
context:\n  git_repo_url: \"https://github.com/prefix-dev/rattler-build\"\n  latest_tag: ${{ git.latest_tag( git_repo_url ) }}\n\npackage:\n  name: \"rattler-build\"\n  version: ${{ latest_tag }}\n\nsource:\n  git: ${{ git_repo_url }}\n  tag: ${{ latest_tag }}\n

There is currently no guarantee of caching for repo fetches when using git functions. This may lead to some performance issues.

"},{"location":"highlevel/","title":"What is rattler-build?","text":"

rattler-build is a tool to build and package software so that it can be installed on any operating system \u2013 with any compatible package manager such as mamba, conda, or rattler. We are also intending for rattler-build to be used as a library to drive builds of packages from any other recipe format in the future.

"},{"location":"highlevel/#how-does-rattler-build-work","title":"How does rattler-build work?","text":"

Building of packages consists of several steps. It all begins with a recipe.yaml file that specifies how the package is to be built and what the dependencies are. From the recipe file, rattler-build executes several steps:

  1. Rendering:

Parse the recipe file and evaluate conditionals, Jinja expressions, variables, and variants.

  2. Fetch source:

Retrieve the specified source files, such as .tar.gz archives, git repositories, or local paths. Additionally, this step applies any patches that are specified alongside the source.

  3. Install build environments:

Download and install dependencies into temporary \"host\" and \"build\" workspaces. Any dependencies that are needed at build time are installed in this step.

  4. Build source:

Execute the build script to build/compile the source code and install it into the host environment.

  5. Prepare package files:

Collect all files that are new in the \"host\" environment and apply some transformations if necessary; specifically, we edit the rpath on Linux and macOS to make binaries relocatable.

  6. Package:

Bundle all the files in a package and write out any additional metadata into the info/index.json, info/about.json, and info/paths.json files. This also creates the test files that are bundled with the package.

  7. Test:

Run any tests specified in the recipe. The package is considered done if it passes all the tests; otherwise it's moved to broken/ in the output directory.

After this process, a package is created. This package can be uploaded to a channel, for example a custom private or public channel on prefix.dev.

"},{"location":"highlevel/#how-to-run-rattler-build","title":"How to run rattler-build","text":"

Running rattler-build is straightforward. It can be done on the command line:

rattler-build build --recipe myrecipe/recipe.yaml\n

A custom channel that is not conda-forge (the default) can be specified like so:

rattler-build build -c robostack --recipe myrecipe/recipe.yaml\n

You can also use the --recipe-dir argument if you want to build all the packages in a directory:

rattler-build build --recipe-dir myrecipes/\n
"},{"location":"highlevel/#overview-of-a-recipeyaml","title":"Overview of a recipe.yaml","text":"

A recipe.yaml file is separated into multiple sections and can conditionally include or exclude sections. Recipe files also support a limited amount of string interpolation with Jinja (specifically minijinja in our case).

A simple example of a recipe file for the zlib package would look as follows:

recipe.yaml
# variables from the context section can be used in the rest of the recipe\n# in jinja expressions\ncontext:\n  version: 1.2.13\n\npackage:\n  name: zlib\n  version: ${{ version }}\n\nsource:\n  url: http://zlib.net/zlib-${{ version }}.tar.gz\n  sha256: b3a24de97a8fdbc835b9833169501030b8977031bcb54b3b3ac13740f846ab30\n\nbuild:\n  # build numbers can be set arbitrarily\n  number: 0\n  script:\n    # build script to install the package into the $PREFIX (host prefix)\n    - if: unix\n      then:\n      - ./configure --prefix=$PREFIX\n      - make -j$CPU_COUNT\n    - if: win\n      then:\n      - cmake -G \"Ninja\" -DCMAKE_BUILD_TYPE=Release -DCMAKE_PREFIX_PATH=%LIBRARY_PREFIX%\n      - ninja install\n\nrequirements:\n  build:\n    # compiler is a special function.\n    - ${{ compiler(\"c\") }}\n    # The following two dependencies are only needed on Windows,\n    # and thus conditionally selected\n    - if: win\n      then:\n        - cmake\n        - ninja\n    - if: unix\n      then:\n        - make\n

The sections of a recipe are:

context: Defines variables that can be used in the Jinja context later in the recipe (e.g. name and version are commonly interpolated in strings)
package: Defines the name and version of the package you are currently building, which will be the name of the final output
source: Defines where the source code is going to be downloaded from, along with checksums
build: Settings for the build and the build script
requirements: Allows the definition of build, host, run and run-constrained dependencies
"},{"location":"internals/","title":"Internals of rattler-build","text":""},{"location":"internals/#making-packages-relocatable-with-rattler-build","title":"Making Packages Relocatable with rattler-build","text":"

Often, the most challenging aspect of building a package using rattler-build is making it relocatable. A relocatable package can be installed into any prefix, allowing it to be used outside the environment in which it was built. This is in contrast to a non-relocatable package, which can only be utilized within its original build environment.

rattler-build automatically performs the following actions to make packages relocatable:

  1. Binary object file conversion: Binary object files are converted to use relative paths using install_name_tool on macOS and patchelf on Linux. This uses $ORIGIN for ELF files on Linux and @loader_path for Mach-O files on macOS to make the rpath relative to the executable or shared library (see the sketch after this list).
  2. Text file prefix registration: Any text file (i.e. a file without NULL bytes) that contains the placeholder prefix is registered, and the registered prefix is replaced with the install prefix at installation time.
  3. Binary file prefix detection and registration: Binary files containing the build prefix can be automatically registered. The registered files will have their build prefix replaced with the install prefix at install time. This works by padding the install prefix with null terminators, such that the length of the binary file remains the same. The build prefix must be long enough to accommodate any reasonable installation prefix. On macOS and Linux, rattler-build pads the build prefix to 255 characters by appending _placehold to the end of the build directory name.
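To see the effect of these transformations, you can inspect the rewritten rpaths of a built library with standard tools (a sketch; the library name libfoo is hypothetical):

# Linux: print the rpath entries rewritten by rattler-build\npatchelf --print-rpath $PREFIX/lib/libfoo.so   # e.g. $ORIGIN/../lib\n# macOS: list the LC_RPATH load commands\notool -l $PREFIX/lib/libfoo.dylib | grep -A 2 LC_RPATH\n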
"},{"location":"internals/#what-goes-into-a-package","title":"What goes into a package?","text":"

Generally speaking, any new files that are copied into the $PREFIX directory at build time are part of the new package. However, there is some filtering going on to exclude unwanted files, and noarch: python packages have special handling as well. The rules are as follows:

"},{"location":"internals/#filtering","title":"Filtering","text":""},{"location":"internals/#general-file-filtering","title":"General File Filtering","text":"

Certain files are filtered out to prevent them from being included in the package. These include:

"},{"location":"internals/#special-handling-for-noarch-python-packages","title":"Special Handling for noarch: python Packages","text":"

For packages marked as noarch: python, special transformations are applied to ensure compatibility across different platforms:

"},{"location":"internals/#symlink-handling","title":"Symlink Handling","text":"

Symlinks are carefully managed to ensure they are relative rather than absolute, which aids in making the package relocatable:

"},{"location":"package_spec/","title":"Package specification","text":"

rattler-build produces \"conda\" packages. These packages work with the mamba and conda package managers, and they work cross-platform on Windows, Linux and macOS.

By default, a conda package is a tar.bz2 archive which contains:

The format is identical across platforms and operating systems. During the install process, all files are extracted into the install prefix, except the ones in info/. Installing a conda package into an environment is similar to executing the following commands:

cd <environment prefix>\ntar xjf mypkg-1.0.0-h2134.tar.bz2\n

Only files, including symbolic links, are part of a conda package. Directories are not included. Directories are created and removed as needed, but you cannot create an empty directory from the tar archive directly.

There is also a newer archive type, suffixed with .conda. This format consists of an uncompressed outer \"zip\" archive containing two inner archives that are compressed with zstd, which is very fast to decompress.

The inner archives are split into info and pkg files, which makes it possible to extract only the info part of the archive (only the metadata), which is often smaller in size.
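For illustration, listing a .conda file with a standard zip tool shows the split (a sketch; the package name is hypothetical):

unzip -l mypkg-1.0.0-h2134.conda\n# metadata.json\n# info-mypkg-1.0.0-h2134.tar.zst   <- metadata only\n# pkg-mypkg-1.0.0-h2134.tar.zst    <- package contents\n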

"},{"location":"package_spec/#package-filename","title":"Package filename","text":"

A conda package conforms to the following filename:

<name>-<version>-<hash>.tar.bz2 OR <name>-<version>-<hash>.conda\n
"},{"location":"package_spec/#special-files-in-packages","title":"Special files in packages","text":"

There are some special files in a package:

You can read more about these files in the activation scripts and other special files section.

"},{"location":"package_spec/#package-metadata","title":"Package metadata","text":"

The info/ directory contains all metadata about a package. Files in this location are not installed under the install prefix. Although you are free to add any file to this directory, conda only inspects the content of the files discussed below:

"},{"location":"package_spec/#infoindexjson","title":"info/index.json","text":"

This file contains basic information about the package, such as name, version, build string, and dependencies. The content of this file is stored in repodata.json, which is the repository index file, hence the name index.json. The JSON object is a dictionary containing the keys shown below.

name: string

The lowercase name of the package. May contain lowercase characters, digits, underscores, dashes, and dots.

version: string

The package version. May not contain \"-\". Follows PEP 440.

build: string

The build string. May not contain \"-\". Differentiates builds of packages with otherwise identical names and versions, such as:

build_number: integer

A non-negative integer representing the build number of the package. Unlike the build string, the build_number is inspected by conda. Conda uses it to sort packages that have otherwise identical names and versions to determine the latest one. This is important because new builds that contain bug fixes for the way a package is built may be added to a repository.

depends: list of match specs

A list of dependency specifications, where each element is a string. These come from the run section of the recipe or any run exports of dependencies.

constrains: list of match specs

A list of optional dependency constraints. The packages listed under constrains are not installed by default, but if they are installed they have to respect the constraints.

subdir: string

The subdir (like linux-64) of this package.

arch: string

Optional. The architecture the package is built for, e.g. x86_64. This key is generally not used (duplicate information from subdir).

platform: string

Optional. The OS that the package is built for, e.g. osx. This key is generally not used (duplicate information from subdir).
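Putting these keys together, a minimal info/index.json might look like the following (all values are hypothetical):

{\n    \"name\": \"mypkg\",\n    \"version\": \"1.0.0\",\n    \"build\": \"h2134_0\",\n    \"build_number\": 0,\n    \"depends\": [\"libzlib >=1.2.13,<2.0a0\"],\n    \"constrains\": [],\n    \"subdir\": \"linux-64\"\n}\n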

"},{"location":"package_spec/#infopathsjson","title":"info/paths.json","text":"

The paths.json file lists all files that are installed into the environment.

It consists of a list of path entries, each with the following keys:

_path: string

The relative path of the file

path_type: optional, string

The type of linking, can be hardlink, softlink, or directory. Default is hardlink.

file_mode: optional, string

The file mode can be binary or text. This is only relevant for prefix replacement.

prefix_placeholder: optional, string

The prefix placeholder string that is encoded in the text or binary file, which is replaced at installation time. Note that this prefix placeholder uses / even on Windows.

no_link: bool, optional

Determines whether this file should be linked or not when installing the package (linking the file from the cache into the environment). Defaults to false.

sha256: string

The SHA256 hash of the file. For symbolic links it contains the SHA256 hash of the file pointed to.

size_in_bytes: number

The size, in bytes, of the file. For symbolic links, it contains the file size of the file pointed to.

Due to the way the binary replacement works, the placeholder prefix must be longer than the install prefix.
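A sketch of a paths.json with a single binary entry (the hash and placeholder values shown are placeholders, not real values):

{\n    \"paths\": [\n        {\n            \"_path\": \"lib/libfoo.so\",\n            \"path_type\": \"hardlink\",\n            \"file_mode\": \"binary\",\n            \"prefix_placeholder\": \"/path/to/build_env_placehold...\",\n            \"sha256\": \"<sha256 of the file>\",\n            \"size_in_bytes\": 12345\n        }\n    ],\n    \"paths_version\": 1\n}\n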

"},{"location":"package_spec/#infolicense","title":"info/license/<...>","text":"

All licenses mentioned in the recipe are copied to this folder.

"},{"location":"package_spec/#infoaboutjson","title":"info/about.json","text":"

Optional file. Contains the entries of the about section of the recipe.yaml file. The following keys are added to info/about.json if present in the build recipe:

Renamed fields

The new recipe spec renamed a few fields (from conda-build's original implementation). This means that some fields in the about.json file still have the old names (for backwards compatibility), while you would generally use different names in the recipe.

home: url (from about.homepage)

The URL of the homepage of the package.

dev_url: url (from about.repository)

The URL of the development repository of the package.

doc_url: url (from about.documentation)

The URL of the documentation of the package.

license_url: url

The URL of the license of the package.

license: string (from about.license)

The SPDX license identifier of the package.

summary: string

A short summary of the package.

description: string

A longer description of the package.

license_family: string

(this field is not used anymore as we rely on SPDX license identifiers)
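As a sketch, an about.json using the renamed fields (values borrowed from the bsdiff4 about example later in this document) might look like:

{\n    \"home\": \"https://example.com/bsdiff4\",\n    \"dev_url\": \"https://github.com/ilanschnell/bsdiff4\",\n    \"doc_url\": \"https://docs.com\",\n    \"license\": \"BSD-3-Clause\",\n    \"summary\": \"binary diff and patch using the BSDIFF4-format\"\n}\n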

"},{"location":"package_spec/#inforecipe","title":"info/recipe/<...>","text":"

A directory containing the full contents of the build recipe. This folder also contains a rendered version of the recipe (rendered_recipe.yaml). This rendered version is used for the rebuild command. However, note that currently this format is still in flux and can change at any time.

You can also use --no-include-recipe to disable the inclusion of the recipe in the package.

"},{"location":"rebuild/","title":"Rebuilding a package","text":"

The rebuild command allows you to rebuild a package from an existing package. The main use case is to examine if a package can be rebuilt in a reproducible manner. You can read more about reproducible builds here.

"},{"location":"rebuild/#usage","title":"Usage","text":"
rattler-build rebuild ./mypkg-0.1.0-h60d57d3_0.tar.bz2\n
"},{"location":"rebuild/#how-it-works","title":"How it works","text":"

The recipe is \"rendered\" and stored into the package. The way the recipe is rendered is subject to change. For the moment, the rendered recipe is stored as info/recipe/rendered_recipe.yaml. It includes the exact package versions that were used at build time. When rebuilding, we use the package resolutions from the rendered recipe, and execute the same build script as the original package.

We also take great care to sort files in a deterministic manner and to erase any timestamps. The SOURCE_DATE_EPOCH environment variable is set to the same timestamp as the original build for additional determinism (some build tools use this variable to set timestamps).

"},{"location":"rebuild/#how-to-check-the-reproducibility-of-a-package","title":"How to check the reproducibility of a package","text":"

There is an excellent tool called diffoscope that allows you to compare two packages and see the differences. You can install it with pixi:

pixi global install diffoscope\n

To compare two packages, you can use the following command:

rattler-build rebuild ./build0.tar.bz2\ndiffoscope ./build0.tar.bz2 ./mypkg-0.1.0-h60d57d3_0.tar.bz2\n
"},{"location":"selectors/","title":"Selectors in recipes","text":"

Recipe and variant configuration files can utilize selectors to conditionally add, remove, or modify dependencies, configuration options, or even skip recipe execution based on specific conditions.

Selectors are implemented using a simple if / then / else map, which is a valid YAML dictionary. The condition is evaluated using minijinja and follows the same syntax as a Python expression.

During rendering, several variables are set based on the platform and variant being built. For example, the unix variable is true for macOS and Linux, while win is true for Windows. Consider the following recipe executed on Linux:

requirements:\n  host:\n    - if: unix\n      then: unix-tool\n    - if: win\n      then: win-tool\n

This will be evaluated as:

requirements:\n  host:\n    - unix-tool\n

The line containing the Windows-specific configuration is removed. Multiple items can also be selected, such as:

host:\n  - if: linux\n    then:\n    - linux-tool-1\n    - linux-tool-2\n    - linux-tool-3\n

For Linux, this will result in:

host:\n  - linux-tool-1\n  - linux-tool-2\n  - linux-tool-3\n

Other examples often found in the wild:

if: build_platform != target_platform ... # true if cross-platform build\nif: osx and arm64 ... # true for Apple Silicon (osx-arm64)\nif: linux and (aarch64 or ppc64le) ... # true for linux-aarch64 or linux-ppc64le\n
"},{"location":"selectors/#available-variables","title":"Available variables","text":"

The following variables are available during the initial rendering and afterward:

| Variable | Description |
| --- | --- |
| target_platform | the configured target_platform for the build |
| build_platform | the platform the build is running on |
| linux | \"true\" if target_platform is Linux |
| osx | \"true\" if target_platform is OSX / macOS |
| win | \"true\" if target_platform is Windows |
| unix | \"true\" if target_platform is a Unix (macOS or Linux) |
| x86_64, x86, arm64, ... | the architecture (\"x86_64\" for 64 bit, \"x86\" for 32 bit, otherwise arm64, aarch64, ppc64le, etc.) |

After the initial phase, when the variant configuration is selected, the variant values are also available in selectors. For example, if the build uses python: 3.8 as a variant, we can use if: python == \"3.8\" to enable a dependency only when the Python version is 3.8.
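For example (mydep is a placeholder dependency name):

requirements:\n  host:\n    - if: python == \"3.8\"\n      then:\n        - mydep\n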

"},{"location":"selectors/#the-cmp-function","title":"The cmp function","text":"

Inside selectors, one can use the special cmp function to test whether the selected variant version matches a version specification. For example, if we again have a python: 3.8 variant, we could use the following tests:

- if: cmp(python, \"3.8\")    # true\n  then: mydep\n- if: cmp(python, \">=3.8\")  # true\n  then: mydep\n- if: cmp(python, \"<3.8\")   # false\n  then: mydep\n

This function eliminates the need to implement any Python-specific conda-build selectors (such as py3k, py38, etc.) or the py and npy integers.

Please note that during the initial phase of rendering we do not know the variant, and thus the cmp condition always evaluates to true.

"},{"location":"special_files/","title":"Activation scripts and other special files","text":"

A conda package can contain \"special\" files in the prefix. These files are scripts that are executed during the activation, installation, or uninstallation process.

If possible, they should be avoided since they execute arbitrary code at installation time and slow down the installation and activation process.

"},{"location":"special_files/#activation-scripts","title":"Activation scripts","text":"

The activation scripts are executed when the environment containing the package is activated (e.g. when doing micromamba activate myenv or pixi run ...).

The scripts are located in special folders:

The scripts are executed in lexicographical order, so you can prefix them with numbers to control the order of execution.

To add a script to the package, just make sure that you install the file in this folder. For example, on Linux:

mkdir -p $PREFIX/etc/conda/activate.d\ncp activate-mypkg.sh $PREFIX/etc/conda/activate.d/10-activate-mypkg.sh\n\nmkdir -p $PREFIX/etc/conda/deactivate.d\ncp deactivate-mypkg.sh $PREFIX/etc/conda/deactivate.d/10-deactivate-mypkg.sh\n
"},{"location":"special_files/#post-link-and-pre-unlink-scripts","title":"Post-link and pre-unlink scripts","text":"

The post-link and pre-unlink scripts are executed when the package is installed or uninstalled. They are both heavily discouraged and currently not implemented in rattler, rattler-build, and pixi.

To create a post-link script for your package, you need to add <package_name>-post-link.{sh/bat} to the bin/ folder of your package. The same applies to pre-unlink scripts, just with the name <package_name>-pre-unlink.{sh/bat}.

For example, for mypkg, you would add mypkg-post-link.sh to the bin/ folder of your package.
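A minimal sketch of installing such a script from a build script on Unix (file names hypothetical):

mkdir -p $PREFIX/bin\ncp mypkg-post-link.sh $PREFIX/bin/mypkg-post-link.sh\nchmod +x $PREFIX/bin/mypkg-post-link.sh\n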

"},{"location":"testing/","title":"Testing packages","text":"

When you are developing a package, you should write tests for it. The tests are automatically executed right after the package build has finished.

The tests from the test section are actually packaged into your package and can also be executed straight from the existing package.

The idea behind adding the tests into the package is that you can execute the tests independently from building the package. That is also why we are shipping a test subcommand that takes as input an existing package and executes the tests:

rattler-build test --package-file ./xtensor-0.24.6-h60d57d3_0.tar.bz2\n

Running the above command will extract the package and create a clean environment where the package and dependencies are installed. Then the tests are executed in this newly-created environment.

If you inspect the package contents, you will find the test files under info/tests/*.

"},{"location":"testing/#how-tests-are-translated","title":"How tests are translated","text":"

The tests section allows you to specify the following things:

tests:\n  - script:\n      # commands to run to test the package. If any of the commands\n      # returns with an error code, the test is considered failed.\n      - echo \"Hello world\"\n      - pytest ./tests\n\n    # additional requirements at test time\n    requirements:\n      run:\n        - pytest\n\n    files:\n      # Extra files to be copied to the test directory from the \"work directory\"\n      source:\n        - tests/\n        - test.py\n        - \"*.sh\"\n      recipe:\n        - more_tests/*.py\n\n  # This test section tries to import the Python modules and errors if it can't\n  - python:\n      imports:\n        - mypkg\n        - mypkg.subpkg\n

When you are writing a test for your package, additional files are created and added to your package. These files are placed under the info/tests/{index}/ folder for each test.

For a script test:

For a Python import test:

For a downstream test:

"},{"location":"testing/#legacy-tests","title":"Legacy tests","text":"

Legacy tests (from conda-build) are still supported for execution. These tests are stored as files under the info/test/ folder.

The files are:

Additionally, the info/test/ folder contains all the files specified in the test section as source_files and files. The tests are executed pointing to this directory as the current working directory.

"},{"location":"tui/","title":"Terminal User Interface","text":"

rattler-build offers a terminal user interface for building multiple packages and viewing the logs.

To launch the TUI, run the build command with the --tui flag as shown below:

$ rattler-build build -r recipe.yaml --tui\n

Note

rattler-build-tui is gated behind the tui feature flag to avoid extra dependencies. Build the project with the --features tui argument to enable the TUI functionality.

"},{"location":"tui/#key-bindings","title":"Key Bindings","text":"Key Action \u23ce Build a Build all j/k Next/previous package up/down/left/right Scroll logs e Edit recipe (via $EDITOR) c, : Open command prompt (available commands: edit) q, ctrl-c, esc, Quit"},{"location":"variants/","title":"Variant configuration","text":"

rattler-build can automatically build multiple variants of a given package. For example, a Python package might need multiple variants per Python version (especially if it is a binary package such as numpy).

For this use case, one can specify variant configuration files. A variant configuration file has 2 special entries and a list of packages with variants. For example:

variants.yaml
# special entry #1, the zip keys\nzip_keys:\n- [python, numpy]\n\n# special entry #2, the pin_run_as_build key\npin_run_as_build:\n  numpy:\n    max_pin: 'x.x'\n\n# entries per package version that users are interested in\npython:\n# Note that versions are _strings_ (not numbers)\n- \"3.8\"\n- \"3.9\"\n- \"3.10\"\n\nnumpy:\n- \"1.12\"\n- \"1.12\"\n- \"1.20\"\n

We can pass a variant configuration file to rattler-build using a command like the following:

rattler-build build --variant-config ./variants.yaml --recipe myrecipe.yaml\n

If we have a recipe that has a build, host, or run dependency on python, we will build multiple variants of this package, one for each configured python version (\"3.8\", \"3.9\" and \"3.10\").

For example:

# ...\nrequirements:\n  host:\n  - python\n

... will be rendered as (for the first variant):

# ...\nrequirements:\n  host:\n  - python 3.8*\n

Note that variants are only applied if the requirement doesn't specify any constraints. If the requirement would be python >3.8,<3.10 then the variant entry would be ignored.

"},{"location":"variants/#package-hash-from-variant","title":"Package hash from variant","text":"

You might have wondered what the role of the build string is. The build string is (if not explicitly set) computed from the variant configuration. It serves as a mechanism to discern different build configurations that produce a package with the same name and version.

The hash is computed by dumping all of the variant configuration values that are used by a given recipe into a JSON file, and then hashing that JSON file.

For example, in our python example, we would get a variant configuration file that looks something like:

{\n    \"python\": \"3.8\"\n}\n

This JSON string is then hashed with the MD5 hash algorithm to produce the hash. For certain packages (such as Python packages) special rules exist, and the py<Major.Minor> version is prepended to the hash, so that the final hash looks something like py38h123123.

"},{"location":"variants/#zip-keys","title":"Zip keys","text":"

Zip keys modify how variants are combined. Usually, each variant key that has multiple entries is expanded to a build matrix. For example, if we have:

python: [\"3.8\", \"3.9\"]\nnumpy: [\"1.12\", \"1.14\"]\n

...then we obtain 4 variants for a recipe that uses both numpy and python:

- python 3.8, numpy 1.12\n- python 3.8, numpy 1.14\n- python 3.9, numpy 1.12\n- python 3.9, numpy 1.14\n

However, if we use the zip_keys and specify:

zip_keys: [\"python\", \"numpy\"]\npython: [\"3.8\", \"3.9\"]\nnumpy: [\"1.12\", \"1.14\"]\n

...then the versions are \"zipped up\" and we only get 2 variants. Note that both python and numpy need to specify the exact same number of versions to make this work.

The resulting variants with the zip applied are:

- python 3.8, numpy 1.12\n- python 3.9, numpy 1.14\n
"},{"location":"variants/#pin-run-as-build","title":"Pin run as build","text":"

The pin_run_as_build key allows the user to inject additional pins. Usually, the run_exports mechanism is used to specify constraints for runtime dependencies from build time dependencies, but pin_run_as_build offers a mechanism to override that if the package does not contain a run exports file.

For example:

pin_run_as_build:\n  libcurl:\n    min_pin: 'x'\n    max_pin: 'x'\n

If we now have a recipe that uses libcurl in the host and run dependencies like:

requirements:\n  host:\n  - libcurl\n  run:\n  - libcurl\n

During resolution, libcurl might be evaluated to libcurl 8.0.1 h13284. Our new runtime dependency then looks like:

requirements:\n  host:\n  - libcurl 8.0.1 h13284\n  run:\n  - libcurl >=8,<9\n
"},{"location":"variants/#prioritizing-variants","title":"Prioritizing variants","text":"

You might produce multiple variants for a package, but want to define a priority for a given variant. The variant with the highest priority would be the default package that is selected by the resolver.

There are two mechanisms to make this possible: mutex packages and the down_prioritize_variant option in the recipe.

"},{"location":"variants/#the-down_prioritize_variant-option","title":"The down_prioritize_variant option","text":"

Note

It is not always necessary to use the down_prioritize_variant option - only if the solver has no other way to prefer a given variant. For example, if you have a package that has multiple variants for different Python versions, the solver will automatically prefer the variant with the highest Python version.

The down_prioritize_variant option allows you to specify a variant that should be down-prioritized. For example:

recipe.yaml
build:\n  variant_config:\n    use_keys:\n      # use cuda from the variant config, e.g. to build multiple CUDA variants\n      - cuda\n    # this will down-prioritize the cuda variant versus other variants of the package\n    down_prioritize_variant: ${{ 1 if cuda else 0 }}\n
"},{"location":"variants/#mutex-packages","title":"Mutex packages","text":"

Another way to make sure the right variants are selected is \"mutex\" packages. A mutex package is a package that is mutually exclusive with its other variants: since only one package of a given name can be installed at a time, the solver has to choose.

A mutex package might be useful to make sure that all packages that depend on BLAS are compiled against the same BLAS implementation. The mutex package ensures that \"openblas\" and \"mkl\" can never be installed at the same time.

We could define a BLAS mutex package like this:

variant_config.yaml
blas_variant:\n  - \"openblas\"\n  - \"mkl\"\n

And then the recipe.yaml for the mutex package could look like this:

recipe.yaml
package:\n  name: blas_mutex\n  version: 1.0\n\nbuild:\n  string: ${{ blas_variant }}${{ hash }}_${{ build_number }}\n  variant_config:\n    # make sure that `openblas` is preferred over `mkl`\n    down_prioritize_variant: ${{ 1 if blas_variant == \"mkl\" else 0 }}\n

This will create two packages: blas_mutex-1.0-openblas and blas_mutex-1.0-mkl. Because they share the same name, only one of them can be installed at a time, and the solver will select exactly one.

The blas package in turn should have a run_export for the blas_mutex package, so that any package that links against blas also has a dependency on the correct blas_mutex package:

recipe.yaml
package:\n  name: openblas\n  version: 1.0\n\nrequirements:\n  # any package depending on openblas should also depend on the correct blas_mutex package\n  run_export:\n    # Add a run export on _any_ version of the blas_mutex package whose build string starts with \"openblas\"\n    - blas_mutex * openblas*\n

Then the recipe of a package that wants to build two variants, one for openblas and one for mkl, could look like this:

recipe.yaml
package:\n  name: fastnumerics\n  version: 1.0\n\nrequirements:\n  host:\n    # build against both openblas and mkl\n    - ${{ blas_variant }}\n  run:\n    # implicitly adds the correct blas_mutex package through run exports\n    # - blas_mutex * ${{ blas_variant }}*\n
"},{"location":"reference/cli/","title":"Command-Line Help for rattler-build","text":"

This document contains the help content for the rattler-build command-line program.

"},{"location":"reference/cli/#rattler-build","title":"rattler-build","text":"

Usage: rattler-build [OPTIONS] [COMMAND]

"},{"location":"reference/cli/#subcommands","title":"Subcommands:","text":""},{"location":"reference/cli/#options","title":"Options:","text":""},{"location":"reference/cli/#build","title":"build","text":"

Build a package from a recipe

Usage: rattler-build build [OPTIONS]

"},{"location":"reference/cli/#options_1","title":"Options:","text":""},{"location":"reference/cli/#modifying-result","title":"Modifying result","text":""},{"location":"reference/cli/#test","title":"test","text":"

Run a test for a single package

This creates a temporary directory, copies the package file into it, and then runs the indexing. It then creates a test environment that installs the package and any extra dependencies specified in the package test dependencies file.

With the activated test environment, the packaged test files are run:

These test files are written at \"package creation time\" and are part of the package.

Usage: rattler-build test [OPTIONS] --package-file <PACKAGE_FILE>

"},{"location":"reference/cli/#options_2","title":"Options:","text":""},{"location":"reference/cli/#modifying-result_1","title":"Modifying result","text":""},{"location":"reference/cli/#rebuild","title":"rebuild","text":"

Rebuild a package from a package file instead of a recipe

Usage: rattler-build rebuild [OPTIONS] --package-file <PACKAGE_FILE>

"},{"location":"reference/cli/#options_3","title":"Options:","text":""},{"location":"reference/cli/#modifying-result_2","title":"Modifying result","text":""},{"location":"reference/cli/#upload","title":"upload","text":"

Upload a package

Usage: rattler-build upload [OPTIONS] [PACKAGE_FILES]... <COMMAND>

"},{"location":"reference/cli/#subcommands_1","title":"Subcommands:","text":""},{"location":"reference/cli/#arguments","title":"Arguments:","text":""},{"location":"reference/cli/#options_4","title":"Options:","text":""},{"location":"reference/cli/#modifying-result_3","title":"Modifying result","text":""},{"location":"reference/cli/#quetz","title":"quetz","text":"

Upload to a Quetz server. Authentication is taken from the keychain / auth-file

Usage: rattler-build upload quetz [OPTIONS] --url <URL> --channel <CHANNEL>

"},{"location":"reference/cli/#options_5","title":"Options:","text":""},{"location":"reference/cli/#artifactory","title":"artifactory","text":"

Options for uploading to an Artifactory channel. Authentication is taken from the keychain / auth-file

Usage: rattler-build upload artifactory [OPTIONS] --url <URL> --channel <CHANNEL>

"},{"location":"reference/cli/#options_6","title":"Options:","text":""},{"location":"reference/cli/#prefix","title":"prefix","text":"

Options for uploading to a prefix.dev server. Authentication is taken from the keychain / auth-file

Usage: rattler-build upload prefix [OPTIONS] --channel <CHANNEL>

"},{"location":"reference/cli/#options_7","title":"Options:","text":""},{"location":"reference/cli/#anaconda","title":"anaconda","text":"

Options for uploading to an Anaconda.org server

Usage: rattler-build upload anaconda [OPTIONS] --owner <OWNER>

"},{"location":"reference/cli/#options_8","title":"Options:","text":""},{"location":"reference/cli/#completion","title":"completion","text":"

Generate shell completion script

Usage: rattler-build completion [OPTIONS]

"},{"location":"reference/cli/#options_9","title":"Options:","text":""},{"location":"reference/cli/#generate-recipe","title":"generate-recipe","text":"

Generate a recipe from PyPI or CRAN

Usage: rattler-build generate-recipe <SOURCE> <PACKAGE>

"},{"location":"reference/cli/#arguments_1","title":"Arguments:","text":""},{"location":"reference/cli/#auth","title":"auth","text":"

Handle authentication to external channels

Usage: rattler-build auth <COMMAND>

"},{"location":"reference/cli/#subcommands_2","title":"Subcommands:","text":""},{"location":"reference/cli/#login","title":"login","text":"

Store authentication information for a given host

Usage: rattler-build auth login [OPTIONS] <HOST>

"},{"location":"reference/cli/#arguments_2","title":"Arguments:","text":""},{"location":"reference/cli/#options_10","title":"Options:","text":""},{"location":"reference/cli/#logout","title":"logout","text":"

Remove authentication information for a given host

Usage: rattler-build auth logout <HOST>

"},{"location":"reference/cli/#arguments_3","title":"Arguments:","text":"

This document was generated automatically by clap-markdown.

"},{"location":"reference/jinja/","title":"Jinja functions that can be used in the recipe","text":"

rattler-build comes with a couple of useful helpers that can be used in the recipe.

"},{"location":"reference/jinja/#functions","title":"Functions","text":""},{"location":"reference/jinja/#the-compiler-function","title":"The compiler function","text":"

The compiler function can be used to put together a compiler that works for the current platform and the compilation \"target_platform\". The syntax looks like: ${{ compiler('c') }} where 'c' signifies the programming language that is used.

This function evaluates to <compiler>_<target_platform> <compiler_version>. For example, when compiling on Linux for the linux-64 target, this function evaluates to gcc_linux-64.

The values can be influenced by the variant_configuration. The <lang>_compiler and <lang>_compiler_version variant keys control which compiler and version are selected. See below for an example:

"},{"location":"reference/jinja/#usage-in-a-recipe","title":"Usage in a recipe","text":"recipe.yaml
requirements:\n  build:\n    - ${{ compiler('c') }}\n

With a corresponding variant_configuration:

variant_configuration.yaml
c_compiler:\n- clang\nc_compiler_version:\n- 9.0\n

The variables shown above would select the clang compiler in version 9.0. Note that the final output will still contain the target_platform, so that the full compiler will read clang_linux-64 9.0 when compiling with --target-platform linux-64.

rattler-build defines some default compilers for the following languages (inherited from conda-build):

"},{"location":"reference/jinja/#the-stdlib-function","title":"The stdlib function","text":"

The stdlib function closely mirrors the compiler function. It can be used to put together a standard library that works for the current platform and the compilation \"target_platform\".

Usage: ${{ stdlib('c') }}

This results in <stdlib>_<target_platform> <stdlib_version>, and uses the variant variables <lang>_stdlib and <lang>_stdlib_version to influence the output.

"},{"location":"reference/jinja/#usage-in-a-recipe_1","title":"Usage in a recipe:","text":"recipe.yaml
requirements:\n  build:\n    # these are usually paired!\n    - ${{ compiler('c') }}\n    - ${{ stdlib('c') }}\n

With a corresponding variant_configuration:

variant_configuration.yaml
# these are the values `conda-forge` uses in their pinning file\n# found at https://github.com/conda-forge/conda-forge-pinning-feedstock/blob/main/recipe/conda_build_config.yaml\nc_stdlib:\n- sysroot\nc_stdlib_version:\n- 2.17\n
"},{"location":"reference/jinja/#the-pin-functions","title":"The pin functions","text":"

A pin is created based on the version input (from a subpackage or a package resolution).

The pin functions take the following five arguments:

These \"pins\" are applied to the version input to create the lower and upper bounds. For example, if the version is 3.10.5 with min_pin=\"x.x\", max_pin=\"x.x.x\", the lower bound will be 3.10 and the upper bound will be 3.10.6.0a0. The max_pin will increment the last selected segment of the version by 1, and append .0a0 to the end to prevent any alpha versions from being selected.

If the last segment of the version contains a letter (e.g. 9e or 1.1.1j), then incrementing the version sets that letter to a: 9e becomes 10a, and 1.1.1j becomes 1.1.2a. In this case, no .0a0 is appended to the end.

Sometimes you want to strongly connect your outputs. This can be achieved with the following input:

To override the lower or upper bound with a hard-coded value, you can use the following input:

Both lower_bound and upper_bound expect a valid version string (e.g. 1.2.3).
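As a sketch, the pin arguments described above could be combined like this (the package name mylib is hypothetical):

requirements:\n  run:\n    # version 3.10.5 with min_pin='x.x', max_pin='x.x' evaluates to >=3.10,<3.11.0a0\n    - ${{ pin_subpackage('mylib', min_pin='x.x', max_pin='x.x') }}\n    # hard-coded bounds instead of computed pins\n    - ${{ pin_subpackage('mylib', lower_bound='1.2.3', upper_bound='2.0') }}\n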

"},{"location":"reference/jinja/#the-pin_subpackage-function","title":"The pin_subpackage function","text":""},{"location":"reference/jinja/#the-pin_compatible-function","title":"The pin_compatible function","text":"

The pin_compatible function works exactly like the pin_subpackage function, but it pins the package in the run requirements based on the package that was resolved in the host or build environment.
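A sketch, assuming numpy is resolved in the host environment:

requirements:\n  host:\n    - numpy\n  run:\n    # pin to the numpy version that was resolved at build time\n    - ${{ pin_compatible('numpy', min_pin='x.x', max_pin='x') }}\n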

"},{"location":"reference/jinja/#the-cdt-function","title":"The cdt function","text":"

This function helps add Core Dependency Tree packages as dependencies by converting packages as required according to hard-coded logic. See below for an example of how this function can be used:

# on x86_64 system\ncdt('package-name') # outputs: package-name-cos6-x86_64\n# on aarch64 system\ncdt('package-name') # outputs: package-name-cos6-aarch64\n
"},{"location":"reference/jinja/#the-hash-variable","title":"The hash variable","text":""},{"location":"reference/jinja/#the-version_to_buildstring-function","title":"The version_to_buildstring function","text":""},{"location":"reference/jinja/#the-env-object","title":"The env object","text":"

You can use the env object to retrieve environment variables and forward them to your build script. There are two ways to do this:

You can also check for the existence of an environment variable:
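A sketch of both patterns (the variable names are hypothetical, and the exact accessor names are an assumption here; check the current documentation for your version):

context:\n  # retrieve an environment variable (fails if it is not set)\n  git_tag: ${{ env.get(\"GIT_TAG\") }}\n  # check whether an environment variable is defined\n  in_ci: ${{ env.exists(\"CI\") }}\n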

"},{"location":"reference/jinja/#default-jinja-filters","title":"Default Jinja filters","text":"

The following Jinja filters are available: lower, upper, and indexing into characters (e.g. https://myurl.com/${{ name[0] }}/${{ name | lower }}_${{ version }}.tar.gz).

Navigate to the Minijinja documentation for a list of all available built-in filters.

"},{"location":"reference/recipe_file/","title":"The recipe spec","text":"

rattler-build implements a new recipe spec, different from the traditional \"meta.yaml\" file used in conda-build. A recipe has to be stored as a recipe.yaml file.

"},{"location":"reference/recipe_file/#history","title":"History","text":"

A discussion was started on what a new recipe spec could or should look like. The fragments of this discussion can be found here.

The reasons for a new spec are:

"},{"location":"reference/recipe_file/#major-differences-from-conda-build","title":"Major differences from conda-build","text":""},{"location":"reference/recipe_file/#spec","title":"Spec","text":"

The recipe spec has the following parts:

"},{"location":"reference/recipe_file/#spec-reference","title":"Spec reference","text":"

The spec is also made available through a JSON Schema (which is used for validation). The schema (and pydantic source file) can be found in this repository: recipe-format

To use with VS Code (via the YAML plugin) and other IDEs:

Either start the document with the following line:

# yaml-language-server: $schema=https://raw.githubusercontent.com/prefix-dev/recipe-format/main/schema.json\n
Or, using yaml.schemas,
yaml.schemas: {\n  \"https://raw.githubusercontent.com/prefix-dev/recipe-format/main/schema.json\": \"**/recipe.yaml\",\n}\n
Read more about this here.

See more in the automatic linting chapter.

"},{"location":"reference/recipe_file/#examples","title":"Examples","text":"recipe.yaml
# this sets up \"context variables\" (in this case name and version) that\n# can later be used in Jinja expressions\ncontext:\n  version: 1.1.0\n  name: imagesize\n\n# top level package information (name and version)\npackage:\n  name: ${{ name }}\n  version: ${{ version }}\n\n# location to get the source from\nsource:\n  url: https://pypi.io/packages/source/${{ name[0] }}/${{ name }}/${{ name }}-${{ version }}.tar.gz\n  sha256: f3832918bc3c66617f92e35f5d70729187676313caa60c187eb0f28b8fe5e3b5\n\n# build number (should be incremented if a new build is made, but version is not incrementing)\nbuild:\n  number: 1\n  script: python -m pip install --no-deps --ignore-installed .\n\n# the requirements at build and runtime\nrequirements:\n  host:\n    - python\n    - pip\n  run:\n    - python\n\n# tests to validate that the package works as expected\ntests:\n  - python:\n      imports:\n        - imagesize\n\n# information about the package\nabout:\n  homepage: https://github.com/shibukawa/imagesize_py\n  license: MIT\n  summary: 'Getting image size from png/jpeg/jpeg2000/gif file'\n  description: |\n    This module analyzes jpeg/jpeg2000/png/gif image header and\n    return image size.\n  repository: https://github.com/shibukawa/imagesize_py\n  documentation: https://pypi.python.org/pypi/imagesize\n\n# the below is conda-forge specific!\nextra:\n  recipe-maintainers:\n    - somemaintainer\n
"},{"location":"reference/recipe_file/#package-section","title":"Package section","text":"

Specifies package information.

package:\n  name: bsdiff4\n  version: \"2.1.4\"\n
"},{"location":"reference/recipe_file/#source-section","title":"Source section","text":"

Specifies where the source code of the package is coming from. The source may come from a tarball file, git, hg, or svn. It may be a local path and it may contain patches.

"},{"location":"reference/recipe_file/#source-from-tarball-or-zip-archive","title":"Source from tarball or zip archive","text":"
source:\n  url: https://pypi.python.org/packages/source/b/bsdiff4/bsdiff4-1.1.4.tar.gz\n  md5: 29f6089290505fc1a852e176bd276c43\n  sha1: f0a2c9a30073449cfb7d171c57552f3109d93894\n  sha256: 5a022ff4c1d1de87232b1c70bde50afbb98212fd246be4a867d8737173cf1f8f\n

If an extracted archive contains only 1 folder at its top level, its contents will be moved 1 level up, so that the extracted package contents sit in the root of the work folder.

"},{"location":"reference/recipe_file/#source-from-git","title":"Source from git","text":"
source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  # branch: master # note: defaults to fetching the repo's default branch\n

You can use rev to pin the commit version directly:

source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  rev: \"50a1f7ed6c168eb0815d424cba2df62790f168f0\"\n

Or you can use the tag:

source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  tag: \"1.1.4\"\n

git can also be a relative path to the recipe directory:

source:\n  git: ../../bsdiff4/.git\n  tag: \"1.1.4\"\n

Furthermore, if you want to fetch just the current \"HEAD\" (this may result in non-deterministic builds), you can use depth.

source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  depth: 1 # note: the behaviour defaults to -1\n

Note: a tag or rev may not be reachable within the configured commit depth, so combining rev or tag with depth is not allowed unless depth is set to -1.

source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  tag: \"1.1.4\"\n  depth: 1 # error: use of `depth` with `tag` or `rev` is invalid, they are mutually exclusive\n

When you want to use git-lfs, you need to set lfs: true. This will also pull the lfs files from the repository.

source:\n  git: ../../bsdiff4/.git\n  tag: \"1.1.4\"\n  lfs: true # note: defaults to false\n
"},{"location":"reference/recipe_file/#source-from-a-local-path","title":"Source from a local path","text":"

If the path is relative, it is taken relative to the recipe directory. The source is copied to the work directory before building.

  source:\n    path: ../src\n    use_gitignore: false # note: defaults to true\n

By default, all files in the local path that are ignored by git are also ignored by rattler-build. You can disable this behavior by setting use_gitignore to false.

"},{"location":"reference/recipe_file/#patches","title":"Patches","text":"

Patches may optionally be applied to the source.

  source:\n    #[source information here]\n    patches:\n      - my.patch # the patch file is expected to be found in the recipe\n
"},{"location":"reference/recipe_file/#destination-path","title":"Destination path","text":"

Within rattler-build's work directory, you may specify a particular folder to place the source into. rattler-build will always drop you into the same folder ([build folder]/work), but it's up to you whether you want your source extracted into that folder, or nested deeper. This feature is particularly useful when dealing with multiple sources, but can apply to recipes with single sources as well.

source:\n  #[source information here]\n  target_directory: my-destination/folder\n
"},{"location":"reference/recipe_file/#source-from-multiple-sources","title":"Source from multiple sources","text":"

Some software is most easily built by aggregating several pieces.

The syntax is a list of source dictionaries. Each member of this list follows the same rules as the single source. All features for each member are supported.

Example:

source:\n  - url: https://package1.com/a.tar.bz2\n    target_directory: stuff\n  - url: https://package1.com/b.tar.bz2\n    target_directory: stuff\n  - git: https://github.com/mamba-org/boa\n    target_directory: boa\n

Here, the two URL tarballs will go into one folder, and the git repo is checked out into its own space. git will not clone into a non-empty folder.

"},{"location":"reference/recipe_file/#build-section","title":"Build section","text":"

Specifies build information.

Each field that expects a path can also handle a glob pattern. The matching is performed from the top of the build environment, so to match files inside your project you can use a pattern similar to the following one: \"**/myproject/**/*.txt\". This pattern will match any .txt file found in your project. Quotation marks (\"\") are required for patterns that start with a *.

Recursive globbing using ** is also supported.

"},{"location":"reference/recipe_file/#build-number-and-string","title":"Build number and string","text":"

The build number should be incremented for new builds of the same version. The number defaults to 0. The build string cannot contain \"-\". The string defaults to the default rattler-build build string plus the build number.

build:\n  number: 1\n  string: abc\n
"},{"location":"reference/recipe_file/#dynamic-linking","title":"Dynamic linking","text":"

This section contains settings for the shared libraries and executables.

build:\n  dynamic_linking:\n    rpath_allowlist: [\"/usr/lib/**\"]\n
"},{"location":"reference/recipe_file/#python-entry-points","title":"Python entry points","text":"

The following example creates a Python entry point named \"bsdiff4\" that calls bsdiff4.cli.main_bsdiff4().

build:\n  python:\n    entry_points:\n      - bsdiff4 = bsdiff4.cli:main_bsdiff4\n      - bspatch4 = bsdiff4.cli:main_bspatch4\n
"},{"location":"reference/recipe_file/#script","title":"Script","text":"

By default, rattler-build uses a build.sh file on Unix (macOS and Linux) and a build.bat file on Windows, if they exist in the same folder as the recipe.yaml file. With the script parameter you can either supply a different filename or write out short build scripts. You may need to use selectors to use different scripts for different platforms.

build:\n  # A very simple build script\n  script: pip install .\n\n  # The build script can also be a list\n  script:\n    - pip install .\n    - echo \"hello world\"\n    - if: unix\n      then:\n        - echo \"unix\"\n
"},{"location":"reference/recipe_file/#skipping-builds","title":"Skipping builds","text":"

Lists conditions under which rattler-build should skip the build of this recipe. Particularly useful for defining recipes that are platform-specific. By default, a build is never skipped.

build:\n  skip:\n    - win\n    ...\n
"},{"location":"reference/recipe_file/#architecture-independent-packages","title":"Architecture-independent packages","text":"

Allows you to specify \"no architecture\" when building a package, thus making it compatible with all platforms and architectures. Architecture-independent packages can be installed on any platform.

Setting the noarch key to generic tells conda not to perform any manipulation of the package contents.

build:\n  noarch: generic\n

noarch: generic is most useful for packages such as static JavaScript assets and source archives. For pure Python packages that can run on any Python version, you can use the noarch: python value instead:

build:\n  noarch: python\n

Note

At the time of this writing, noarch packages should not make use of preprocess selectors: noarch packages are built with the directives that evaluate to true on the platform they are built on, which will likely result in an incorrect or incomplete installation on other platforms.

"},{"location":"reference/recipe_file/#include-build-recipe","title":"Include build recipe","text":"

The recipe and rendered recipe.yaml file are included in the package metadata by default. You can disable this by passing --no-include-recipe on the command line.

Note

There are many more options in the build section. These additional options control how variants are computed, prefix replacements, and more. See the full build options for more information.

"},{"location":"reference/recipe_file/#requirements-section","title":"Requirements section","text":"

Specifies the build and runtime requirements. Dependencies of these requirements are included automatically.

Versions for requirements must follow the conda/mamba match specification. See build-version-spec.

"},{"location":"reference/recipe_file/#build","title":"Build","text":"

Tools required to build the package.

These packages are run on the build system and include things such as version control systems (git, svn), make tools (GNU make, Autotools, CMake), and compilers (real cross, pseudo-cross, or native when not cross-compiling), as well as any source pre-processors.

Packages which provide \"sysroot\" files, like the CDT packages (see below), also belong in the build section.

requirements:\n  build:\n    - git\n    - cmake\n
"},{"location":"reference/recipe_file/#host","title":"Host","text":"

Represents packages that need to be specific to the target platform when the target platform is not necessarily the same as the native build platform. For example, in order for a recipe to be \"cross-capable\", shared libraries requirements must be listed in the host section, rather than the build section, so that the shared libraries that get linked are ones for the target platform, rather than the native build platform. You should also include the base interpreter for packages that need one. In other words, a Python package would list python here and an R package would list mro-base or r-base.

requirements:\n  build:\n    - ${{ compiler('c') }}\n    - if: linux\n      then:\n        - ${{ cdt('xorg-x11-proto-devel') }}\n  host:\n    - python\n

Note

When both \"build\" and \"host\" sections are defined, the build section can be thought of as \"build tools\" - things that run on the native platform, but output results for the target platform (e.g. a cross-compiler that runs on linux-64, but targets linux-armv7).

The PREFIX environment variable points to the host prefix. With respect to activation during builds, both the host and build environments are activated. The build prefix is activated before the host prefix so that the host prefix has priority over the build prefix. Executables that don't exist in the host prefix should be found in the build prefix.

The build and host prefixes are always separate when both are defined, or when ${{ compiler() }} Jinja functions are used. The only time that build and host are merged is when the host section is absent and no ${{ compiler() }} Jinja functions are used in the recipe.

"},{"location":"reference/recipe_file/#run","title":"Run","text":"

Packages required to run the package.

These are the dependencies that are installed automatically whenever the package is installed. Package names should follow the package match specifications.

requirements:\n  run:\n    - python\n    - six >=1.8.0\n

To build a recipe against different versions of NumPy and ensure that each version is part of the package dependencies, list numpy as a requirement in recipe.yaml and use a variant configuration file with multiple NumPy versions.

"},{"location":"reference/recipe_file/#run-constraints","title":"Run constraints","text":"

Packages that are optional at runtime but must obey the supplied additional constraint if they are installed.

Package names should follow the package match specifications.

requirements:\n  run_constraints:\n    - optional-subpackage ==${{ version }}\n

For example, let's say we have an environment that has package \"a\" installed at version 1.0. If we install package \"b\" that has a run_constraints entry of \"a >1.0\", then mamba would need to upgrade \"a\" in the environment in order to install \"b\".

This is especially useful in the context of virtual packages, where the run_constraints dependency is not a package that mamba manages, but rather a virtual package that represents a system property that mamba can't change. For example, a package on Linux may impose a run_constraints dependency on __glibc >=2.12. This is the version bound consistent with CentOS 6. Software built against glibc 2.12 will be compatible with CentOS 6. This run_constraints dependency helps mamba, conda or pixi tell the user that a given package can't be installed if their system glibc version is too old.
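Expressed in a recipe, the glibc constraint from this example would look like:

requirements:\n  run_constraints:\n    - __glibc >=2.12\n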

"},{"location":"reference/recipe_file/#run-exports","title":"Run exports","text":"

Packages may have runtime requirements such as shared libraries (e.g. zlib), which are required for linking at build time and for resolving the link at run time. Such packages use run_exports to define their runtime requirements, so that dependent packages automatically pick them up.

Example from zlib:

  requirements:\n    run_exports:\n      - ${{ pin_subpackage('libzlib', exact=True) }}\n

Run exports are weak by default, but you can also define strong run_exports.

  requirements:\n    run_exports:\n      strong:\n        - ${{ pin_subpackage('libzlib', exact=True) }}\n
"},{"location":"reference/recipe_file/#ignore-run-exports","title":"Ignore run exports","text":"

There may be cases where an upstream package has a problematic run_exports constraint. You can ignore it in your recipe by listing the upstream package name in the ignore_run_exports section of requirements.

You can ignore them by package name, or by naming the runtime dependency directly.

  requirements:\n    ignore_run_exports:\n      from_package:\n        - zlib\n

Using a runtime dependency name:

  requirements:\n    ignore_run_exports:\n      from_name:\n        - libzlib\n

Note

ignore_run_exports only applies to runtime dependencies coming from an upstream package.

"},{"location":"reference/recipe_file/#tests-section","title":"Tests section","text":"

rattler-build supports four different types of tests. The \"script test\" installs the package and runs a list of commands. The \"Python test\" attempts to import a list of Python modules and runs pip check. The \"downstream test\" runs the tests of a downstream package that reverse depends on the package being built. And lastly, the \"package content test\" checks if the built package contains the mentioned items.

The tests section is a list of these items:

tests:\n  - script:\n      - echo \"hello world\"\n    requirements:\n      run:\n        - pytest\n    files:\n      source:\n        - test-data.txt\n\n  - python:\n      imports:\n        - bsdiff4\n      pip_check: true  # this is the default\n  - downstream: numpy\n
"},{"location":"reference/recipe_file/#script-test","title":"Script test","text":"

The script test has 3 top-level keys: script, files and requirements. Only the script key is required.

"},{"location":"reference/recipe_file/#test-commands","title":"Test commands","text":"

Commands that are run as part of the test.

tests:\n  - script:\n      - echo \"hello world\"\n      - bsdiff4 -h\n      - bspatch4 -h\n
"},{"location":"reference/recipe_file/#extra-test-files","title":"Extra test files","text":"

Test files that are copied from the source work directory into the temporary test directory and are needed during testing (note that the source work directory is otherwise not available at all during testing).

You can also include files that come from the recipe folder. They are copied into the test directory as well.

At test execution time, the test directory is the current working directory.

tests:\n  - script:\n      - ls\n    files:\n      source:\n        - myfile.txt\n        - tests/\n        - some/directory/pattern*.sh\n      recipe:\n        - extra-file.txt\n
"},{"location":"reference/recipe_file/#test-requirements","title":"Test requirements","text":"

In addition to the runtime requirements, you can specify requirements needed during testing. The runtime requirements that you specified in the \"run\" section described above are automatically included during testing (because the built package is installed as it regularly would be).

In the build section you can specify additional requirements that are only needed on the build system for cross-compilation (e.g. emulators or compilers).

tests:\n  - script:\n      - echo \"hello world\"\n    requirements:\n      build:\n        - myemulator\n      run:\n        - nose\n
"},{"location":"reference/recipe_file/#python-tests","title":"Python tests","text":"

For this test type you can list a set of Python modules that need to be importable. The test will fail if any of the modules cannot be imported.

The test will also automatically run pip check to check for any broken dependencies. This can be disabled by setting pip_check: false in the YAML.

tests:\n  - python:\n      imports:\n        - bsdiff4\n        - bspatch4\n      pip_check: true  # can be left out because this is the default\n

Internally this will write a small Python script that imports the modules:

import bsdiff4\nimport bspatch4\n
"},{"location":"reference/recipe_file/#check-for-package-contents","title":"Check for package contents","text":"

Checks if the built package contains the mentioned items. These checks are executed directly at the end of the build process to make sure that all expected files are present in the package.

tests:\n  - package_contents:\n      # checks for the existence of files inside $PREFIX or %PREFIX%\n      # or, checks that there is at least one file matching the specified `glob`\n      # pattern inside the prefix\n      files:\n        - etc/libmamba/test.txt\n        - etc/libmamba\n        - etc/libmamba/*.mamba.txt\n\n      # checks for the existence of `mamba/api/__init__.py` inside of the\n      # Python site-packages directory (note: also see Python import checks)\n      site_packages:\n        - mamba.api\n\n\n      # looks in $PREFIX/bin/mamba for unix and %PREFIX%\\Library\\bin\\mamba.exe on Windows\n      # note: also check the `commands` and execute something like `mamba --help` to make\n      # sure things work fine\n      bin:\n        - mamba\n\n      # searches for `$PREFIX/lib/libmamba.so` or `$PREFIX/lib/libmamba.dylib` on Linux or macOS,\n      # on Windows for %PREFIX%\\Library\\lib\\mamba.dll & %PREFIX%\\Library\\bin\\mamba.bin\n      lib:\n        - mamba\n\n      # searches for `$PREFIX/include/libmamba/mamba.hpp` on unix, and\n      # on Windows for `%PREFIX%\\Library\\include\\libmamba\\mamba.hpp`\n      include:\n        - libmamba/mamba.hpp\n
"},{"location":"reference/recipe_file/#downstream-tests","title":"Downstream tests","text":"

Warning

Downstream tests are not yet implemented in rattler-build.

A downstream test can mention a single package that has a dependency on the package being built. The test will install the package and run the tests of the downstream package with our current package as a dependency.

Sometimes downstream packages do not resolve. In this case, the test is ignored.

tests:\n  - downstream: numpy\n
"},{"location":"reference/recipe_file/#outputs-section","title":"Outputs section","text":"

Explicitly specifies packaging steps. This section supports multiple outputs, as well as different package output types. The format is a list of mappings.

When using multiple outputs, certain top-level keys are \"forbidden\": package and requirements. Instead of package, a top-level recipe key can be defined. The recipe.name is ignored but the recipe.version key is used as default version for each output. Other \"top-level\" keys are merged into each output (e.g. the about section) to avoid repetition. Each output is a complete recipe, and can have its own build, requirements, and test sections.

recipe:\n  # the recipe name is ignored\n  name: some\n  version: 1.0\n\noutputs:\n  - package:\n      # version is taken from recipe.version (1.0)\n      name: some-subpackage\n\n  - package:\n      name: some-other-subpackage\n      version: 2.0\n

Each output acts like an independent recipe and can have its own script, build_number, and so on.

outputs:\n  - package:\n      name: subpackage-name\n    build:\n      script: install-subpackage.sh\n

Each output is built independently. Take care not to package the same files twice; one approach is sketched below.
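You can restrict each output to a subset of the files in the prefix with glob patterns. This is a sketch, assuming the build.files key; the output names are illustrative:

outputs:\n  - package:\n      name: libtest\n    build:\n      files:\n        - lib/**\n  - package:\n      name: test-tools\n    build:\n      files:\n        - bin/**\n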

"},{"location":"reference/recipe_file/#subpackage-requirements","title":"Subpackage requirements","text":"

Like a top-level recipe, a subpackage may have zero or more dependencies listed as build, host or run requirements.

The dependencies listed as subpackage build requirements are available only during the packaging phase of that subpackage.

outputs:\n  - package:\n      name: subpackage-name\n    requirements:\n      build:\n        - some-dep\n      run:\n        - some-dep\n

You can also use the pin_subpackage function to pin another output from the same recipe.

outputs:\n  - package:\n      name: libtest\n  - package:\n      name: test\n    requirements:\n      build:\n        - ${{ pin_subpackage('libtest', max_pin='x.x') }}\n

The outputs are topologically sorted by the dependency graph, which takes the pin_subpackage invocations into account. When using pin_subpackage(name, exact=True), special behavior kicks in: the pinned package is injected as a "variant" and the variant matrix is expanded accordingly. For example, consider the following situation, with a variant_config.yaml file that contains openssl: [1, 3]:

outputs:\n  - package:\n      name: libtest\n    requirements:\n      host:\n        - openssl\n  - package:\n      name: test\n    requirements:\n      build:\n        - ${{ pin_subpackage('libtest', exact=True) }}\n

Due to the variant config file, this will build two versions of libtest. We will also build two versions of test, one that depends on libtest (openssl 1) and one that depends on libtest (openssl 3).
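For reference, the variant_config.yaml driving this expansion contains just the entry mentioned above:

openssl:\n  - 1\n  - 3\n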

"},{"location":"reference/recipe_file/#about-section","title":"About section","text":"

Specifies identifying information about the package. This information is displayed on the package server.

about:\n  homepage: https://example.com/bsdiff4\n  license: BSD-3-Clause # (1)!\n  license_file: LICENSE\n  summary: binary diff and patch using the BSDIFF4-format\n  description: |\n    Long description of bsdiff4 ...\n  repository: https://github.com/ilanschnell/bsdiff4\n  documentation: https://docs.com\n
  1. Only SPDX license identifiers are allowed (more info here: SPDX). If you need another license type, LicenseRef-<YOUR-LICENSE> can be used, e.g. license: LicenseRef-Proprietary.
"},{"location":"reference/recipe_file/#license-file","title":"License file","text":"

Adds a file containing the software license to the package metadata. Many licenses require the license statement to be distributed with the package. The filename is relative to the source or recipe directory. The value can be a single filename or a YAML list for multiple license files. Values can also point to directories with license information. Directory entries must end with a / suffix (this is to lessen unintentional inclusion of non-license files; all the directory's contents will be unconditionally and recursively added).

about:\n  license_file:\n    - LICENSE\n    - vendor-licenses/\n
"},{"location":"reference/recipe_file/#extra-section","title":"Extra section","text":"

A schema-free area for storing non-conda-specific metadata in standard YAML form.

Example: storing recipe maintainer information
extra:\n  maintainers:\n    - name of maintainer\n
"},{"location":"reference/recipe_file/#templating-with-jinja","title":"Templating with Jinja","text":"

rattler-build supports limited Jinja templating in the recipe.yaml file.

You can set up Jinja variables in the context section:

context:\n  name: \"test\"\n  version: \"5.1.2\"\n  # later keys can reference previous keys\n  # and use jinja functions to compute new values\n  major_version: ${{ version.split('.')[0] }}\n

Later in your recipe.yaml you can use these values in string interpolation with Jinja:

source:\n  url: https://github.com/mamba-org/${{ name }}/v${{ version }}.tar.gz\n

Jinja has built-in support for some common string manipulations.
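For example, lower and replace are standard Jinja filters (a small sketch with illustrative values):

context:\n  name: \"MyPackage\"\n  version: \"1.2.3\"\n  # later keys can use filters on previous keys\n  name_lower: ${{ name | lower }}                  # \"mypackage\"\n  version_tag: ${{ version | replace('.', '_') }}  # \"1_2_3\"\n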

In rattler-build, complex Jinja is completely disallowed, as we try to produce YAML that is valid at all times. You should therefore not use {% if ... %} or similar Jinja constructs that produce invalid YAML. Furthermore, instead of plain double curly brackets, Jinja statements need to be prefixed with $, e.g. ${{ ... }}:

package:\n  name: {{ name }}   # WRONG: invalid yaml\n  name: ${{ name }} # correct\n

For more information, see the Jinja template documentation and the list of available environment variables env-vars.

Jinja templates are evaluated during the build process.

"},{"location":"reference/recipe_file/#additional-jinja2-functionality-in-rattler-build","title":"Additional Jinja2 functionality in rattler-build","text":"

Besides the default Jinja2 functionality, additional Jinja functions are available during the rattler-build process: pin_compatible, pin_subpackage, and compiler.

The compiler function takes c, cxx, fortran, and other values as an argument and automatically selects the right (cross-)compiler for the target platform.

requirements:\n  build:\n    - ${{ compiler('c') }}\n

The pin_subpackage function pins another package produced by the recipe with the supplied parameters.

Similarly, the pin_compatible function will pin a package according to the specified rules.

"},{"location":"reference/recipe_file/#pin-expressions","title":"Pin expressions","text":"

rattler-build supports pin expressions. A pin expression can have a min_pin, max_pin, and exact value. A max_pin and min_pin are specified with a string containing only x and . characters, e.g. max_pin=\"x.x.x\" pins the given package to <1.2.3 (if the package version is 1.2.2, for example).

A pin with min_pin=\"x.x\",max_pin=\"x.x\" for a package of version 1.2.2 would evaluate to >=1.2.2,<1.3.

If exact=true, then the hash is included, and the package is pinned exactly, e.g. ==1.2.2 h1234. This is a unique package variant that cannot exist more than once, and thus is \"exactly\" pinned.
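Putting these rules together for a package at version 1.2.2 (the hash h1234 is illustrative):

# max_pin='x.x.x'               ->  <1.2.3\n# min_pin='x.x', max_pin='x.x'  ->  >=1.2.2,<1.3\n# exact=true                    ->  ==1.2.2 h1234\n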

"},{"location":"reference/recipe_file/#pin-subpackage","title":"Pin subpackage","text":"

Pin subpackage refers to another package from the same recipe file. It is commonly used in the requirements/run_exports section to add a run export to the package, or with multiple outputs to refer to an earlier output.

It looks something like:

package:\n  name: mypkg\n  version: \"1.2.3\"\n\nrequirements:\n  run_exports:\n    # this will evaluate to `mypkg <1.3`\n    - ${{ pin_subpackage(name, max_pin='x.x') }}\n
"},{"location":"reference/recipe_file/#pin-compatible","title":"Pin compatible","text":"

Pin compatible lets you pin a package based on the version retrieved from the variant file (useful when the pinning from the variant file needs customization).

For example, if the variant specifies a pin for numpy: 1.11, one can use pin_compatible to relax it:

requirements:\n  host:\n    # this will select numpy 1.11\n    - numpy\n  run:\n    # this will export `numpy >=1.11,<2`, instead of the stricter `1.11` pin\n    - ${{ pin_compatible('numpy', min_pin='x.x', max_pin='x') }}\n
"},{"location":"reference/recipe_file/#the-env-jinja-functions","title":"The env Jinja functions","text":"

You can access the current environment variables using the env object in Jinja.

There are three functions:

  • env.get(\"ENV_VAR\") inserts the value of the environment variable ENV_VAR, and errors if it is not set.
  • env.get(\"ENV_VAR\", default=\"undefined\") inserts the value of the environment variable, or the given default when the variable is not set.
  • env.exists(\"ENV_VAR\") returns true if the environment variable is set, and false otherwise.

This can be used for some light templating, for example:

build:\n  string: ${{ env.get(\"GIT_BUILD_STRING\") }}_${{ PKG_HASH }}\n
"},{"location":"reference/recipe_file/#cmp-function","title":"cmp function","text":"

This function matches the first argument (the package's MatchSpec) against the second argument (the version spec) and returns the resulting boolean.

cmp(python, '>=3.4')\n

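As a sketch, cmp is typically combined with a selector (the tomli dependency and version bound are illustrative):

requirements:\n  host:\n    - if: cmp(python, '<3.11')\n      then: tomli\n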

"},{"location":"reference/recipe_file/#cdt-function","title":"cdt function","text":"

This function helps add Core Dependency Tree packages as dependencies by converting packages as required according to hard-coded logic.

# on x86_64 system\ncdt('package-name') # outputs: package-name-cos6-x86_64\n# on aarch64 system\ncdt('package-name') # outputs: package-name-cos6-aarch64\n

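As a sketch, cdt is typically called from the requirements section (the package name is illustrative):

requirements:\n  build:\n    - ${{ cdt('xorg-x11-proto-devel') }}\n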

"},{"location":"reference/recipe_file/#preprocessing-selectors","title":"Preprocessing selectors","text":"

You can add selectors to any item, and the selector is evaluated in a preprocessing stage. If a selector evaluates to true, the item is flattened into the parent element. If a selector evaluates to false, the item is removed.

Selectors can use if ... then ... else as follows:

source:\n  - if: not win\n    then:\n      - url: http://path/to/unix/source\n    else:\n      - url: http://path/to/windows/source\n\n# or the equivalent with two if conditions:\n\nsource:\n  - if: unix\n    then:\n      - url: http://path/to/unix/source\n  - if: win\n    then:\n      - url: http://path/to/windows/source\n

A selector is a valid Python expression that is evaluated. You can read more about them in the "Selectors in recipes" chapter.

The use of the Python version selectors py27, py34, etc. is discouraged in favor of the more general comparison operators. Additional selectors in this series will not be added.

Because the selector is any valid Python expression, complicated logic is possible:

- if: unix and not win\n  then: ...\n- if: (win or linux) and not py27\n  then: ...\n

Lists are automatically \"merged\" upwards, so it is possible to group multiple items under a single selector:

tests:\n  - script:\n    - if: unix\n      then:\n      - test -d ${PREFIX}/include/xtensor\n      - test -f ${PREFIX}/lib/cmake/xtensor/xtensorConfigVersion.cmake\n    - if: win\n      then:\n      - if not exist %LIBRARY_PREFIX%\\include\\xtensor\\xarray.hpp (exit 1)\n      - if not exist %LIBRARY_PREFIX%\\lib\\cmake\\xtensor\\xtensorConfigVersion.cmake (exit 1)\n\n# On unix this is rendered to:\ntests:\n  - script:\n    - test -d ${PREFIX}/include/xtensor\n    - test -f ${PREFIX}/lib/cmake/xtensor/xtensorConfigVersion.cmake\n
"},{"location":"reference/recipe_file/#experimental-features","title":"Experimental features","text":"

Warning

These are experimental features of rattler-build and may change or go away completely.

"},{"location":"reference/recipe_file/#jinja-functions","title":"Jinja functions","text":""},{"location":"tutorials/cpp/","title":"Packaging a C++ package","text":"

This tutorial will guide you through making a C++ package with rattler-build.

"},{"location":"tutorials/cpp/#building-a-header-only-library","title":"Building a Header-only Library","text":"

To build a package for the header-only library xtensor, you need to manage dependencies and ensure proper installation paths.

"},{"location":"tutorials/cpp/#key-steps","title":"Key Steps","text":"
  1. Dependencies: Ensure cmake, ninja, and a compiler are available as dependencies.

  2. CMake Installation Prefix: Use the CMAKE_INSTALL_PREFIX setting to instruct CMake to install the headers in the correct location.

  3. Unix Systems: Follow the standard Unix prefix:

    $PREFIX/include\n$PREFIX/lib\n

  4. Windows Systems: Use a Unix-like prefix but nested in a Library directory:

    $PREFIX/Library/include\n$PREFIX/Library/lib\n
    Utilize the handy variables %LIBRARY_PREFIX% and %LIBRARY_BIN% to guide CMake to install the headers and libraries correctly.

This approach ensures that the headers and libraries are installed in the correct directories on both Unix and Windows systems.

"},{"location":"tutorials/cpp/#recipe","title":"Recipe","text":"recipe.yaml
context:\n  version: \"0.24.6\"\n\npackage:\n  name: xtensor\n  version: ${{ version }}\n\nsource:\n  url: https://github.com/xtensor-stack/xtensor/archive/${{ version }}.tar.gz\n  sha256: f87259b51aabafdd1183947747edfff4cff75d55375334f2e81cee6dc68ef655\n\nbuild:\n  number: 0\n  script:\n    - if: win # (1)!\n      then: |\n        cmake -GNinja ^\n            -D BUILD_TESTS=OFF -DCMAKE_INSTALL_PREFIX=%LIBRARY_PREFIX% ^\n            %SRC_DIR%\n        ninja install\n      else: |\n        cmake -GNinja \\\n              -DBUILD_TESTS=OFF -DCMAKE_INSTALL_PREFIX=$PREFIX \\\n              $SRC_DIR\n        ninja install\n\nrequirements:\n  build:\n    - ${{ compiler('cxx') }} # (2)!\n    - cmake\n    - ninja\n  host:\n    - xtl >=0.7,<0.8\n  run:\n    - xtl >=0.7,<0.8\n  run_constraints: # (3)!\n    - xsimd >=8.0.3,<10\n\ntests:\n  - package_contents:\n      include: # (4)!\n        - xtensor/xarray.hpp\n      files: # (5)!\n        - ${{ \"Library\" if win }}/share/cmake/xtensor/xtensorConfig.cmake\n        - ${{ \"Library\" if win }}/share/cmake/xtensor/xtensorConfigVersion.cmake\n\nabout:\n  homepage: https://github.com/xtensor-stack/xtensor\n  license: BSD-3-Clause\n  license_file: LICENSE\n  summary: The C++ tensor algebra library\n  description: Multi dimensional arrays with broadcasting and lazy computing\n  documentation: https://xtensor.readthedocs.io\n  repository: https://github.com/xtensor-stack/xtensor\n\nextra:\n  recipe-maintainers:\n    - some-maintainer\n
  1. The if: condition allows you to switch the behavior of the build based on checks such as the operating system.
  2. The compiler function is used to get the C++ compiler for the build system.
  3. The run_constraints section specifies the version range of a package that this package can run "with", but does not itself depend on.
  4. The include section specifies the header file to be tested for existence.
  5. The files section specifies the files to be tested for existence, using a glob pattern.

CMAKE_ARGS

It can be tedious to remember all the different variables one needs to pass to CMake to create the perfect build. The cmake package on conda-forge introduces the CMAKE_ARGS environment variable. This variable contains the necessary flags to make the package build correctly, including when cross-compiling from one machine to another. Therefore, it is often not necessary to pass any additional flags to the cmake command. However, because this is a tutorial, we show how to pass the necessary flags to cmake manually.

For more information please refer to the conda-forge documentation.
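Using it only requires prepending the variable to the cmake invocation; a sketch of what the Unix branch of the script above could look like with CMAKE_ARGS:

build:\n  script:\n    - cmake ${CMAKE_ARGS} -GNinja -DBUILD_TESTS=OFF -DCMAKE_INSTALL_PREFIX=$PREFIX $SRC_DIR\n    - ninja install\n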

"},{"location":"tutorials/cpp/#building-a-c-application","title":"Building A C++ application","text":"

In this example, we'll build poppler, a C++ application for manipulating PDF files from the command line. The final package will install several tools into the bin/ folder. We'll use external build scripts and run actual scripts in the test.

"},{"location":"tutorials/cpp/#key-steps_1","title":"Key Steps","text":"
  1. Dependencies:

    • Build Dependencies: These are necessary for the building process, including cmake, ninja, and pkg-config.
    • Host Dependencies: These are the libraries poppler links against, such as cairo, fontconfig, freetype, glib, and others.
  2. Compiler Setup: We use the compiler function to obtain the appropriate C and C++ compilers.

  3. Build Script: The build.script field points to an external script (poppler-build.sh) which contains the build commands.

  4. Testing: Simple tests are included to verify that the installed tools (pdfinfo, pdfunite, pdftocairo) are working correctly by running them and expecting an exit code of 0.

"},{"location":"tutorials/cpp/#recipe_1","title":"Recipe","text":"recipe.yaml
context:\n  version: \"24.01.0\"\n\npackage:\n  name: poppler\n  version: ${{ version }}\n\nsource:\n  url: https://poppler.freedesktop.org/poppler-${{ version }}.tar.xz\n  sha256: c7def693a7a492830f49d497a80cc6b9c85cb57b15e9be2d2d615153b79cae08\n\nbuild:\n  script: poppler-build.sh\n\nrequirements:\n  build:\n    - ${{ compiler('c') }} # (1)!\n    - ${{ compiler('cxx') }}\n    - pkg-config\n    - cmake\n    - ninja\n  host:\n    - cairo # (2)!\n    - fontconfig\n    - freetype\n    - glib\n    - libboost-headers\n    - libjpeg-turbo\n    - lcms2\n    - libiconv\n    - libpng\n    - libtiff\n    - openjpeg\n    - zlib\n\ntests:\n  - script:\n      - pdfinfo -listenc  # (3)!\n      - pdfunite --help\n      - pdftocairo --help\n
  1. The compiler Jinja function is used to get the correct compilers for C and C++ on the build system.
  2. These are all the dependencies that the library links against.
  3. The script test simply executes some of the installed tools to check that they work. Test scripts can be as complex as you want and run in bash (Unix) or cmd.exe (Windows).
"},{"location":"tutorials/cpp/#external-build-script","title":"External Build Script","text":"

We've defined an external build script in the recipe. The script is looked up next to the recipe under the given file name; if no name is given, the default build.sh on Unix or build.bat on Windows is used.

poppler-build.sh
#! /bin/bash\n\nextra_cmake_args=(\n    -GNinja\n    -DCMAKE_INSTALL_LIBDIR=lib\n    -DENABLE_UNSTABLE_API_ABI_HEADERS=ON\n    -DENABLE_GPGME=OFF\n    -DENABLE_LIBCURL=OFF\n    -DENABLE_LIBOPENJPEG=openjpeg2\n    -DENABLE_QT6=OFF\n    -DENABLE_QT5=OFF\n    -DENABLE_NSS3=OFF\n)\n\nmkdir build && cd build\n\ncmake ${CMAKE_ARGS} \"${extra_cmake_args[@]}\" \\\n    -DCMAKE_PREFIX_PATH=$PREFIX \\\n    -DCMAKE_INSTALL_PREFIX=$PREFIX \\\n    -DTIFF_INCLUDE_DIR=$PREFIX/include \\\n    $SRC_DIR\n\nninja\n\n# The `install` command will take care of copying the files to the right place\nninja install\n
"},{"location":"tutorials/cpp/#parsing-the-rattler-build-build-output","title":"Parsing the rattler-build build Output","text":"

When running the rattler-build command, you might notice some interesting information in the output. Our package will have some run dependencies, even if we didn't specify any.

These come from the run-exports of the packages listed in the host section of the recipe. This is indicated by \"RE of [host: package]\" in the output.

For example, libcurl specifies that if you depend on it in the host section, you should also depend on it during runtime with specific version ranges. This ensures proper linking to shared libraries.
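Conceptually, a library like libcurl accomplishes this by declaring a run export in its own recipe, using the pin_subpackage function described earlier; a sketch (the pin level is illustrative):

requirements:\n  run_exports:\n    - ${{ pin_subpackage('libcurl', max_pin='x') }}\n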

Run dependencies:\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Name                  \u2506 Spec                                         \u2502\n\u255e\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u256a\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2561\n\u2502 libcurl               \u2506 >=8.5.0,<9.0a0 (RE of [host: libcurl])       \u2502\n\u2502 fontconfig            \u2506 >=2.14.2,<3.0a0 (RE of [host: fontconfig])   \u2502\n\u2502 fonts-conda-ecosystem \u2506 (RE of [host: fontconfig])                   \u2502\n\u2502 lcms2                 \u2506 >=2.16,<3.0a0 (RE of [host: lcms2])          \u2502\n\u2502 gettext               \u2506 >=0.21.1,<1.0a0 (RE of [host: gettext])      \u2502\n\u2502 freetype              \u2506 >=2.12.1,<3.0a0 (RE of [host: freetype])     \u2502\n\u2502 openjpeg              \u2506 >=2.5.0,<3.0a0 (RE of [host: openjpeg])      \u2502\n\u2502 libiconv              \u2506 >=1.17,<2.0a0 (RE of [host: libiconv])       \u2502\n\u2502 cairo                 \u2506 >=1.18.0,<2.0a0 (RE of [host: cairo])        \u2502\n\u2502 libpng                \u2506 >=1.6.42,<1.7.0a0 (RE of [host: libpng])     \u2502\n\u2502 libzlib               \u2506 >=1.2.13,<1.3.0a0 (RE of [host: zlib])       \u2502\n\u2502 libtiff               \u2506 >=4.6.0,<4.7.0a0 (RE of [host: libtiff])     \u2502\n\u2502 libjpeg-turbo         \u2506 >=3.0.0,<4.0a0 (RE of [host: libjpeg-turbo]) \u2502\n\u2502 libglib               \u2506 >=2.78.3,<3.0a0 (RE of [host: glib])         \u2502\n\u2502 libcxx                \u2506 >=16 (RE of [build: clangxx_osx-arm64])      \u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n

You can also see \"linking\" information in the output, for example on macOS:

[lib/libpoppler-glib.8.26.0.dylib] links against:\n \u251c\u2500 @rpath/libgio-2.0.0.dylib\n \u251c\u2500 @rpath/libgobject-2.0.0.dylib\n \u251c\u2500 /usr/lib/libSystem.B.dylib\n \u251c\u2500 @rpath/libglib-2.0.0.dylib\n \u251c\u2500 @rpath/libpoppler.133.dylib\n \u251c\u2500 @rpath/libfreetype.6.dylib\n \u251c\u2500 @rpath/libc++.1.dylib\n \u251c\u2500 @rpath/libpoppler-glib.8.dylib\n \u2514\u2500 @rpath/libcairo.2.dylib\n

rattler-build ensures that:

  1. All shared libraries linked against are present in the run dependencies. Missing libraries trigger an overlinking warning.
  2. You don't require any packages in the host that you are not linking against. This triggers an overdepending warning.
"},{"location":"tutorials/python/","title":"Writing a Python package","text":"

Writing a Python package is fairly straightforward, especially for \"Python-only\" packages. In the second example we will build a package for numpy which contains compiled code.

"},{"location":"tutorials/python/#a-python-only-package","title":"A Python-only package","text":"

The following recipe uses the noarch: python setting to build a noarch package that can be installed on any platform without modification. This is very handy for packages that are pure Python and do not contain any compiled extensions.

Additionally, noarch: python packages work with a range of Python versions (contrary to packages with compiled extensions that are tied to a specific Python version).

recipe.yaml
context:\n  version: \"8.1.2\"\n\npackage:\n  name: ipywidgets\n  version: ${{ version }}\n\nsource:\n  url: https://pypi.io/packages/source/i/ipywidgets/ipywidgets-${{ version }}.tar.gz\n  sha256: d0b9b41e49bae926a866e613a39b0f0097745d2b9f1f3dd406641b4a57ec42c9\n\nbuild:\n  noarch: python # (1)!\n  script: pip install . -v\n\nrequirements:\n  # note that there is no build section\n  host:\n    - pip\n    - python >=3.7\n    - setuptools\n    - wheel\n  run:\n    - comm >=0.1.3\n    - ipython >=6.1.0\n    - jupyterlab_widgets >=3.0.10,<3.1.0\n    - python >=3.7\n    - traitlets >=4.3.1\n    - widgetsnbextension >=4.0.10,<4.1.0\n\ntests:\n  - python:\n      imports:\n        - ipywidgets # (2)!\n\nabout:\n  homepage: https://github.com/ipython/ipywidgets\n  license: BSD-3-Clause\n  license_file: LICENSE\n  summary: Jupyter Interactive Widgets\n  description: |\n    ipywidgets are interactive HTML widgets for Jupyter notebooks and the IPython kernel.\n  documentation: https://ipywidgets.readthedocs.io/en/latest/\n
  1. The noarch: python line tells rattler-build that this package is pure Python and can be one-size-fits-all. noarch packages can be installed on any platform without modification which is very handy.
  2. The imports section in the tests is used to check that the package is installed correctly and can be imported.
"},{"location":"tutorials/python/#running-the-recipe","title":"Running the recipe","text":"

To build this recipe, simply run:

rattler-build build --recipe ./ipywidgets\n
"},{"location":"tutorials/python/#a-python-package-with-compiled-extensions","title":"A Python package with compiled extensions","text":"

We will build a package for numpy, which contains compiled code. Since compiled code is Python version-specific, we need to specify the Python version explicitly. The best way to do this is with a "variant_config.yaml" file:

variant_config.yaml
python:\n  - 3.11\n  - 3.12\n

This will replace any python found in the recipe with the versions specified in the variant_config.yaml file.

recipe.yaml
context:\n  version: 1.26.4\n\npackage:\n  name: numpy\n  version: ${{ version }}\n\nsource:\n  - url: https://github.com/numpy/numpy/releases/download/v${{ version }}/numpy-${{ version }}.tar.gz\n    sha256: 2a02aba9ed12e4ac4eb3ea9421c420301a0c6460d9830d74a9df87efa4912010\n\nbuild:\n  python:\n    entry_points:\n      - f2py = numpy.f2py.f2py2e:main  # [win]\n\nrequirements:\n  build:\n    - ${{ compiler('c') }}\n    - ${{ compiler('cxx') }}\n  host:\n    # note: variant is injected here!\n    - python\n    - pip\n    - meson-python\n    - ninja\n    - pkg-config\n    - python-build\n    - cython\n    - libblas\n    - libcblas\n    - liblapack\n  run:\n    - python\n  run_exports:\n    - ${{ pin_subpackage(\"numpy\") }}\n\ntests:\n  - python:\n      imports:\n        - numpy\n        - numpy.array_api\n        - numpy.array_api.linalg\n        - numpy.ctypeslib\n\n  - script:\n    - f2py -h\n\nabout:\n  homepage: http://numpy.org/\n  license: BSD-3-Clause\n  license_file: LICENSE.txt\n  summary: The fundamental package for scientific computing with Python.\n  documentation: https://numpy.org/doc/stable/\n  repository: https://github.com/numpy/numpy\n

The build script for Unix:

build.sh
mkdir builddir\n\n$PYTHON -m build -w -n -x \\\n    -Cbuilddir=builddir \\\n    -Csetup-args=-Dblas=blas \\\n    -Csetup-args=-Dlapack=lapack\n\n$PYTHON -m pip install dist/numpy*.whl\n

The build script for Windows:

build.bat
mkdir builddir\n\n%PYTHON% -m build -w -n -x ^\n    -Cbuilddir=builddir ^\n    -Csetup-args=-Dblas=blas ^\n    -Csetup-args=-Dlapack=lapack\nif %ERRORLEVEL% neq 0 exit 1\n\n:: `pip install dist\\numpy*.whl` does not work on windows,\n:: so use a loop; there's only one wheel in dist/ anyway\nfor /f %%f in ('dir /b /S .\\dist') do (\n    pip install %%f\n    if %ERRORLEVEL% neq 0 exit 1\n)\n
"},{"location":"tutorials/python/#running-the-recipe_1","title":"Running the recipe","text":"

Running this recipe with the variant config file will build a total of 2 numpy packages:

rattler-build build --recipe ./numpy \\\n  --variant-config ./numpy/variant_config.yaml\n

At the beginning of the build process, rattler-build will print the following message to show you the variants it found:

Found variants:\n\nnumpy-1.26.4-py311h5f8ada8_0\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Variant         \u2506 Version   \u2502\n\u255e\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u256a\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2561\n\u2502 python          \u2506 3.11      \u2502\n\u2502 target_platform \u2506 osx-arm64 \u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n\nnumpy-1.26.4-py312h440f24a_0\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Variant         \u2506 Version   \u2502\n\u255e\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u256a\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2561\n\u2502 python          \u2506 3.12      \u2502\n\u2502 target_platform \u2506 osx-arm64 \u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n
"},{"location":"tutorials/rust/","title":"Building a Rust package","text":"

We're using rattler-build to build a Rust package for the cargo-edit utility. This utility manages Cargo dependencies from the command line.

To configure the Rust compiler, we add a variant_config.yaml file to the package:

variant_config.yaml
rust_compiler: rust\n

This will tell rattler-build what to insert for the ${{ compiler('rust') }} Jinja function.

Note

The ${{ compiler(...) }} functions are very useful in the context of cross-compilation. When the function is evaluated, it will insert the correct compiler (as selected with the variant config) as well as the target_platform. The "rendered" compiler will look like rust_linux-64 when you are targeting the linux-64 platform.

You can read more about this in the cross-compilation section.
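For example, with the variant config above and a linux-64 target, the build requirements render roughly to (a sketch):

requirements:\n  build:\n    - rust_linux-64\n    - cargo-bundle-licenses\n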

recipe.yaml
context:\n  version: \"0.11.9\"\n\npackage:\n  name: cargo-edit\n  version: ${{ version }}\n\nsource:\n  url: https://github.com/killercup/cargo-edit/archive/refs/tags/v${{ version }}.tar.gz\n  sha256: 46670295e2323fc2f826750cdcfb2692fbdbea87122fe530a07c50c8dba1d3d7\n\nbuild:\n  script:\n    - cargo-bundle-licenses --format yaml --output ${SRC_DIR}/THIRDPARTY.yml  # !(1)\n    - $BUILD_PREFIX/bin/cargo install --locked --bins --root ${PREFIX} --path .\n\nrequirements:\n  build:\n    - ${{ compiler('rust') }}\n    - cargo-bundle-licenses\n\ntests:\n  - script:\n      - cargo-upgrade --help # !(2)\n\nabout:\n  homepage: https://github.com/killercup/cargo-edit\n  license: MIT\n  license_file:\n    - LICENSE\n    - THIRDPARTY.yml\n  description: \"A utility for managing cargo dependencies from the command line.\"\n  summary: \"A utility for managing cargo dependencies from the command line.\"\n
  1. The cargo-bundle-licenses utility is used to bundle all the licenses of the dependencies into a THIRDPARTY.yml file. This file is then included in the package. You should always include this file in your package when you are redistributing it.
  2. Running scripts in bash or cmd.exe tests that the package was built well; an exit code of 0 is expected for the test to pass.

To build this recipe, simply run:

rattler-build build \\\n    --recipe ./cargo-edit/recipe.yaml \\\n    --variant-config ./cargo-edit/variant_config.yaml\n
"}]} \ No newline at end of file diff --git a/dev/sitemap.xml.gz b/dev/sitemap.xml.gz index 78de9aea9e376f0febfeb70b508af103dae1150d..3a037304cf3cf4a2e9a5e1e74913c89434efc247 100644 GIT binary patch delta 15 WcmZ3)yoi}izMF$1;KfF^Ig9`!Uj&x` delta 15 WcmZ3)yoi}izMF%C@61NFIg9`yM+7+l diff --git a/dev/tutorials/cpp/index.html b/dev/tutorials/cpp/index.html index 9fe8de51e..20adaf740 100644 --- a/dev/tutorials/cpp/index.html +++ b/dev/tutorials/cpp/index.html @@ -1153,13 +1153,13 @@

Recipe script: - if: win # (1)! then: | - cmake -GNinja \ - -D BUILD_TESTS=OFF -DCMAKE_INSTALL_PREFIX=%LIBRARY_PREFIX% \ + cmake -GNinja ^ + -D BUILD_TESTS=OFF -DCMAKE_INSTALL_PREFIX=%LIBRARY_PREFIX% ^ %SRC_DIR% ninja install else: | - cmake ${CMAKE_ARGS} -GNinja -DBUILD_TESTS=OFF \ - -DCMAKE_INSTALL_PREFIX=$PREFIX \ + cmake -GNinja \ + -DBUILD_TESTS=OFF -DCMAKE_INSTALL_PREFIX=$PREFIX \ $SRC_DIR ninja install @@ -1173,15 +1173,15 @@

Recipe run: - xtl >=0.7,<0.8 run_constraints: # (3)! - - xsimd >=8.0.3,<10 + - xsimd >=8.0.3,<10 tests: - package_contents: include: # (4)! - xtensor/xarray.hpp files: # (5)! - - share/cmake/xtensor/xtensorConfig.cmake - - share/cmake/xtensor/xtensorConfigVersion.cmake + - ${{ "Library" if win }}/share/cmake/xtensor/xtensorConfig.cmake + - ${{ "Library" if win }}/share/cmake/xtensor/xtensorConfigVersion.cmake about: homepage: https://github.com/xtensor-stack/xtensor @@ -1204,6 +1204,15 @@

Recipe +

CMAKE_ARGS

+

It can be tedious to remember all the different variables one needs to pass to CMake to create the perfect build. +The cmake package on conda-forge introduces the CMAKE_ARGS environment variable. +This variable contains the necessary flags to make the package build correctly, including when cross-compiling from one machine to another. +Therefore, it is often not necessary to pass any additional flags to the cmake command. +However, because this is a tutorial, we show how to pass the necessary flags to cmake manually.

+

For more information please refer to the conda-forge documentation.

+

Building A C++ application#

In this example, we'll build poppler, a C++ application for manipulating PDF files from the command line. The final package will install several tools into the bin/ folder.