From bb41ace445e661736168b1fa9be03010fe34fb29 Mon Sep 17 00:00:00 2001
From: dumol
Date: Thu, 10 Oct 2024 12:18:28 +0300
Subject: [PATCH] Python 3.12 update. (#58)

* Zlib 1.3.1 update.
* Pip 24.0 update.
* Debug outdated macOS Homebrew hacks.
* No more Homebrew hacking on GitHub's macOS setups.
* Make it possible to ignore preset Safety IDs.
* Make it possible to ignore preset Safety IDs, take two.
* Unified debugging for testing on GitHub's bare OSes.
* Update macOS setup for latest GitHub labels.
* Shellcheck fixes.
* Shellcheck fixes and more.
* Put back Homebrew hacks for x86 macOS.
* Do not install binary wheels.
* Do not install binary wheels, take two.
* Also ignore the jinja2 disputed safety issue.
* Properly generate SAFETY_IGNORED_OPTS for multiple Safety IDs to ignore.
* Properly generate SAFETY_IGNORED_OPTS for multiple Safety IDs to ignore, take two.
* Try debugging safety opts.
* Process Safety IDs the Bash way.
* Make it possible to skip tests during build.
* Switched to latest OpenSSL 3.0.x on non-Windows platforms.
* More updated libs.
* Latest versions for remaining Python modules.
* Try building Python 3.13.0b2.
* Try building Python 3.13.0b2, take two.
* Python 3.13 deprecates `crypt` and `spwd`.
* Try building Python 3.13.0b2, take three.
* Get more Homebrew stuff out of the way.
* Try building Python 3.13.0b2, take four.
* Get more Homebrew stuff out of the way, take two.
* Python builds with OS-provided libedit on macOS lately.
* Python builds with OS-provided libedit on macOS lately, take two.
* Python builds with OS-provided libedit on macOS lately, take three.
* Python builds with OS-provided libedit on macOS lately, take four.
* Python 3.13.0b3 update.
* Python 3.13.0b3 update, take two.
* Removed debugging spillover from package checks.
* Removed safety, added requirements.txt.
* Removed safety, added requirements.txt, take two.
* Get generic Linux deps back to normal.
* Get generic Linux deps back to normal, take two.
* Install gdb on Amazon Linux 2 for improved Python tests.
* Python 3.13.0b4 update.
* Python modules update.
* Amazon 2 has a gdb unusable for Python tests (built with Python 2).
* Automatically regenerate requirements.txt when done building.
* Include pip version in requirements.txt.
* Fixed Windows package filename.
* Try actions/checkout@v4.
* Remove download cache instead of just purging.
* Synced pythia.sh with server repo version.
* Synced pythia.sh from server repo.
* Allow building Python release candidate versions.
* Also updated pythia.conf from server repo.
* Fixed sha512.sum file for embedded Windows package.
* Fixed sha512.sum file for embedded Windows package, take two.
* Try latest 3.12 Python release.
* Shellcheck-related fixes.
* Fixed Amazon Linux 2 build.
* Try building musl package on Alpine 3.15.
* Updated OpenSSL to version 3.0.15 on non-Windows platforms.
* Updated pip and setuptools to latest versions.
* Try latest 3.12 Python release, take two.
* Updated OpenSSL version check.
* Try auto-updating requirements.txt through the Windows build.
* Try auto-updating requirements.txt through the Windows build, take 2.
* Try auto-updating requirements.txt through the Windows build, take 3.
* Try auto-updating requirements.txt through the Windows build, take 4.
* Automated update of requirements.txt from Windows build.
* Changes after a quick own review.
* Do not copy xz binaries in dist packages.
* Compress saved Makefiles.
* Compress saved Makefiles, take 2.
* Put back tests for crypt and spwd for now.
* Also uninstall setuptools before packing the dist.
* Automated update of requirements.txt from Windows build.
* Use same setuptools version as server repo.
* The unzip ersatz in Alpine Linux busybox is good enough.
* Try gcc10 for building on Amazon Linux 2 through GHA.
* Try gcc10 for building on Amazon Linux 2 through GHA, take 2.
* Back to default gcc for building on Amazon Linux 2 through GHA.
* Leave setuptools as-is.
* Automated update of requirements.txt from Windows build.
* Updated Python to version 3.12.7.
* One more hack for not building a default Python module.
* One more hack for not building a default Python module, take two.
* Added a comment for psutil.
* Changes after own review.

---------

Co-authored-by: dumol
---
 .github/workflows/bare.yaml    |  24 +-
 .github/workflows/docker.yaml  |  12 +-
 .gitignore                     |   2 +-
 build.conf                     |  29 +-
 build.sh                       |  25 +-
 functions_build.sh             |   4 +-
 os_quirks.sh                   |   8 +-
 pkg_checks.sh                  |  16 +-
 pythia.conf                    |  15 +-
 pythia.sh                      | 418 +++++++++---------
 requirements.txt               |   5 +
 src/Python-Windows/chevahbs    |  12 +-
 src/Python-Windows/sha512.sum  |   2 +-
 src/Python/chevahbs            |  13 +-
 src/Python/sha512.sum          |   2 +-
 .../test_python_binary_dist.py |  12 +-
 src/openssl/sha512.sum         |   2 +-
 src/xz/chevahbs                |   4 +-
 18 files changed, 323 insertions(+), 282 deletions(-)
 create mode 100644 requirements.txt

diff --git a/.github/workflows/bare.yaml b/.github/workflows/bare.yaml
index 25843066..cc238d46 100644
--- a/.github/workflows/bare.yaml
+++ b/.github/workflows/bare.yaml
@@ -20,10 +20,14 @@ jobs:
     runs-on: windows-latest
     timeout-minutes: 45
 
+    permissions:
+      # Give the default GITHUB_TOKEN write permission to commit requirements.txt
+      contents: write
+
     # Steps represent a sequence of tasks that will be executed as part of the job
     steps:
       # Checks-out the repository under $GITHUB_WORKSPACE, so the job can access it
-      - uses: actions/checkout@v2
+      - uses: actions/checkout@v4
        with:
          fetch-depth: 2
 
@@ -36,6 +40,12 @@ jobs:
        timeout-minutes: 5
        run: bash ./build.sh test
 
+      # Commit changed requirements.txt back to the repository
+      - uses: chevah/git-auto-commit-action@HEAD
+        with:
+          commit_message: Automated update of requirements.txt from Windows build.
+          file_pattern: 'requirements.txt'
+
      # To use an RSA key with SFTPPlus, install upstream OpenSSH package,
      # which is more finicky in regards to file permissions.
      # Beware the commands in this step run under PowerShell.
@@ -84,7 +94,7 @@ jobs:
     runs-on: macos-13
     timeout-minutes: 90
     steps:
-      - uses: actions/checkout@v2
+      - uses: actions/checkout@v4
        with:
          fetch-depth: 2
 
@@ -94,7 +104,10 @@ jobs:
        run: |
          sudo mv -v /usr/local/bin/git{,.saved}
          sudo chmod -v a-r /usr/local/include/libintl.h
+          sudo chmod -v a-r /usr/local/include/gdbm.h
          sudo chmod -v a-r /usr/local/opt/gettext/lib/libintl.*
+          sudo chmod -v a-r /usr/local/opt/gdbm/lib/libgdbm*
+          sudo chmod -v a-r /usr/local/opt/mpdecimal/lib/libmpdec*
 
      - name: Build Pythia
        timeout-minutes: 30
@@ -103,8 +116,11 @@ jobs:
      # Fix back Homebrew, for working Shellcheck tests and tmate debugging.
      - name: Unhack Homebrew
        run: |
-          sudo chmod -v a+r /usr/local/opt/gettext/lib/libintl*
          sudo chmod -v a+r /usr/local/include/libintl.h
+          sudo chmod -v a-r /usr/local/include/gdbm.h
+          sudo chmod -v a+r /usr/local/opt/gettext/lib/libintl*
+          sudo chmod -v a+r /usr/local/opt/gdbm/lib/libgdbm*
+          sudo chmod -v a+r /usr/local/opt/mpdecimal/lib/libmpdec*
          sudo mv -v /usr/local/bin/git{.saved,}
 
      - name: Test Pythia
@@ -133,7 +149,7 @@ jobs:
     runs-on: macos-latest
     timeout-minutes: 60
     steps:
-      - uses: actions/checkout@v2
+      - uses: actions/checkout@v4
        with:
          fetch-depth: 2
 
diff --git a/.github/workflows/docker.yaml b/.github/workflows/docker.yaml
index 22fc7bef..dd10ccf8 100644
--- a/.github/workflows/docker.yaml
+++ b/.github/workflows/docker.yaml
@@ -25,8 +25,8 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        # Alpine 3.12 has musl 1.1.24, Amazon 2 has glibc 2.26.
-        container: [ 'alpine:3.12', 'amazonlinux:2' ]
+        # Alpine Linux 3.15 has musl 1.2.2, Amazon Linux 2 has glibc 2.26.
+        container: [ 'alpine:3.15', 'amazonlinux:2' ]
     timeout-minutes: 60
 
     steps:
@@ -44,7 +44,13 @@ jobs:
        if: startsWith(matrix.container, 'amazonlinux')
        run: |
          yum -y upgrade
-          yum -y install git-core gcc make m4 patch tar unzip perl perl-Test-Simple perl-IPC-Cmd xz gcc-c++ dejagnu
+          yum -y install git-core gcc make m4 patch tar unzip perl perl-Test-Simple perl-IPC-Cmd xz gcc-c++ dejagnu bzip2
+          # To avoid linking against libdb and gdbm libraries on Amazon Linux 2.
+          # Can't simply uninstall libdb-devel and gdbm-devel, they are required by perl-IPC-Cmd.
+          rm -v /usr/include/libdb/db.h
+          rm -v /usr/include/gdbm.h
+          # This is for avoiding building the NIS module with Python 3.12.
+          rm -v /usr/include/rpc/rpc.h
 
      - name: Ubuntu setup
        if: startsWith(matrix.container, 'ubuntu')
diff --git a/.gitignore b/.gitignore
index 5909e34e..1667ec95 100644
--- a/.gitignore
+++ b/.gitignore
@@ -4,5 +4,5 @@
 /src/*/*.tar.gz
 /src/*/*.tgz
 /src/*/*.zip
-/build-pythia/
+/build-py3/
 /cache/
diff --git a/build.conf b/build.conf
index 966da428..9ec6ac45 100644
--- a/build.conf
+++ b/build.conf
@@ -9,8 +9,10 @@ DIST_DIR="dist"
 # Setting this as 0 disables tests during building (skips chevahbs_test phase).
 PYTHIA_BUILD_TESTS="${PYTHIA_BUILD_TESTS:-1}"
 
-# Python and lib versions.
-PYTHON_BUILD_VERSION="3.11.9"
+# Set versions for Python and its libraries.
+# For major Python updates, e.g. 3.11->3.12, also update pythia.conf and
+# src/Python-Windows/chevahbs (e.g. "python311._pth"->"python312._pth").
+PYTHON_BUILD_VERSION="3.12.7"
 LIBFFI_VERSION="3.4.6"
 ZLIB_VERSION="1.3.1"
 BZIP2_VERSION="1.0.8"
@@ -18,13 +20,13 @@
 # "gpg --keyserver-options auto-key-retrieve --verify xz-*.sig xz-*.gz".
 XZ_VERSION="5.6.2"
 # Statically build the BSD libedit on selected platforms to get the
-# readline module available without linking to the GPL-only readline libs.
+# readline module available without linking to the GPL-only readline libraries.
 # If there's a need to reenable this, our libedit patch for Python 3.9 was
 # https://github.com/chevah/pythia/pull/5/commits/09c128154d23feb6b1a7cb5a8d79.
 # A newer patch is available at https://github.com/python/cpython/issues/57710.
 LIBEDIT_VERSION="20170329-3.1"
-# Our OpenSSL libs are only used for Python's "ssl" module lately.
-OPENSSL_VERSION="3.0.14"
+# Our OpenSSL libraries are only used for Python's "ssl" module lately.
+OPENSSL_VERSION="3.0.15"
 # Use the version of the "sqlite-autoconf-VERSION.tar.gz" upstream download.
 # To get its SHA3-256 signature: "openssl dgst -sha3-256 sqlite-autoconf-*".
 # When updating this, also update the year in src/sqlite/chevahbs, if needed.
@@ -36,19 +38,18 @@ SQLITE_VERSION="3460000"
 
 BOOTSTRAP_GET_PIP="https://bootstrap.pypa.io/get-pip.py"
 # Python modules installed after bootstraping pip.
-PIP_VERSION="24.0"
-SETUPTOOLS_VERSION="70.0.0"
+PIP_VERSION="24.2"
+SETUPTOOLS_VERSION="70.3.0"
 # pycparser is explicitly installed to work around setuptools auto dependencies.
 PYCPARSER_VERSION="2.22"
 
 # Python modules that have to be built and/or installed in Pythia.
-PSUTIL_VERSION="5.9.8"
 PYWIN32_VERSION="306"
+# To be removed when upstream builds a musl wheel for psutil.
+# More at https://github.com/giampaolo/psutil/pull/2126.
+PSUTIL_VERSION="6.0.0"
 
-# Pin safety to keep the deps of the test environment under control.
-SAFETY_VERSION="3.2.0"
-
-# Global flags for building required libs.
+# Global flags for building required libraries.
 BUILD_LIBFFI="no"
 BUILD_ZLIB="no"
 BUILD_BZIP2="yes"
@@ -72,7 +73,3 @@ PIP_ARGS=(\
     --index-url="$PIP_INDEX_URL" \
     --no-warn-script-location \
 )
-# Array of safety IDs to ignore.
-# 67599: pip 24.0, https://data.safetycli.com/v/67599/f17/ (disputed).
-# 70612: jinja2 3.1.4, https://data.safetycli.com/v/70612/97c (disputed).
-SAFETY_IGNORED_IDS=(67599 70612)
diff --git a/build.sh b/build.sh
index 69646515..864b31bc 100755
--- a/build.sh
+++ b/build.sh
@@ -8,6 +8,7 @@ set -o nounset    # always check if variables exist
 set -o errexit    # always exit on error
 set -o errtrace   # trap errors in functions as well
 set -o pipefail   # don't ignore exit codes when piping output
+set -o functrace  # inherit DEBUG and RETURN traps
 
 # Default PyPI server to use. Can be overwritten in build.conf.
 PIP_INDEX_URL="https://pypi.org/simple"
@@ -210,9 +211,12 @@ command_install_python_modules() {
         execute "$PYTHON_BIN" -m pip install "${PIP_ARGS[@]}" "$library"
     done
 
-    # When done, uninstall wheel.
+    echo "# Uninstalling wheel... #"
     execute "$PYTHON_BIN" -m pip uninstall --yes wheel
 
+    echo "# Regenerating requirements.txt file... #"
+    execute "$PYTHON_BIN" -m pip freeze --all > requirements.txt
+
     echo "::endgroup::"
 }
 
@@ -222,8 +226,6 @@ help_text_test="Run own tests for the newly-build Python distribution."
 command_test() {
     local test_file="test_python_binary_dist.py"
     local python_binary="$PYTHON_BIN"
-    local safety_id_to_ignore
-    declare -a safety_ignore_opts
 
     echo "::group::Chevah tests"
     if [ ! -d "$BUILD_DIR" ]; then
@@ -240,25 +242,12 @@ command_test() {
     execute cp src/chevah-python-tests/get_binaries_deps.sh "$BUILD_DIR"
     execute pushd "$BUILD_DIR"
     execute "$python_binary" "$test_file"
+    execute popd
     echo "::endgroup::"
 
     echo "::group::Security tests"
-    echo "## Testing for outdated packages and security issues... ##"
+    echo "## Testing for outdated packages... ##"
##" execute "$python_binary" -m pip list --outdated --format=columns - execute "$python_binary" -m pip install "${PIP_ARGS[@]}" \ - safety=="$SAFETY_VERSION" - - if (( ${#SAFETY_IGNORED_IDS[@]} != 0 )); then - (>&2 echo "Following Safety DB IDs are excepted from checks:") - (>&2 echo -e "\t${SAFETY_IGNORED_IDS[*]}") - for safety_id_to_ignore in "${SAFETY_IGNORED_IDS[@]}"; do - safety_ignore_opts+=("-i $safety_id_to_ignore") - done - fi - - execute "$python_binary" -m safety check --full-report \ - "${safety_ignore_opts[@]}" - execute popd echo "::endgroup::" echo "::group::Shell tests" diff --git a/functions_build.sh b/functions_build.sh index 091e40ef..f3c34fb8 100644 --- a/functions_build.sh +++ b/functions_build.sh @@ -170,7 +170,7 @@ build() { # Put stuff where it's expected and remove some of the cruft. # cleanup_install_dir() { - local python_lib_file="lib$PYTHON_VERSION.a" + local python_lib_file="lib${PYTHON_VERSION}.a" echo "::group::Clean up Python install dir" execute pushd "$BUILD_DIR/$PYTHON_BUILD_DIR" @@ -243,6 +243,8 @@ cleanup_install_dir() { execute mv pkgconfig/* lib/pkgconfig/ execute rmdir pkgconfig fi + # Compress packaged Makefiles to save some space when unpacked. + execute bzip2 lib/config/Makefile* ;; esac # Test that only bin/ and lib/ sub-dirs are left. diff --git a/os_quirks.sh b/os_quirks.sh index bba19557..2e2fe6fe 100644 --- a/os_quirks.sh +++ b/os_quirks.sh @@ -101,9 +101,11 @@ case "$OS" in esac # Use PIC (Position Independent Code) with GCC on 64-bit arches (currently all). -if [ "$CC" = "gcc" ]; then - export CFLAGS="${CFLAGS:-} -fPIC" -fi +case "$CC" in + gcc*) + export CFLAGS="${CFLAGS:-} -fPIC" + ;; +esac # Get number of useful CPUs, to enable parallel builds where applicable. case "$OS" in diff --git a/pkg_checks.sh b/pkg_checks.sh index 80d21685..38026f4a 100644 --- a/pkg_checks.sh +++ b/pkg_checks.sh @@ -15,15 +15,15 @@ # On platforms with multiple C compilers, choose by setting CC in os_quirks.sh. # List of OS packages required for building Python/pyOpenSSL/cryptography etc. -BASE_PKGS="gcc make m4 patch unzip perl" +BASE_PKGS="gcc make m4 patch perl" if [ "$BUILD_LIBEDIT" = "yes" ]; then BASE_PKGS="$BASE_PKGS automake libtool" fi -APK_PKGS="$BASE_PKGS git curl bash musl-dev linux-headers lddtree shadow \ - openssh-client file unzip g++ musl-locales dejagnu" -DEB_PKGS="$BASE_PKGS tar diffutils git curl \ +APK_PKGS="$BASE_PKGS git curl bash musl-dev linux-headers lddtree \ + openssh-client file g++ musl-locales dejagnu" +DEB_PKGS="$BASE_PKGS unzip tar diffutils git curl \ openssh-client libtest-simple-perl xz-utils g++ dejagnu" -RPM_PKGS="$BASE_PKGS tar diffutils git-core curl \ +RPM_PKGS="$BASE_PKGS bzip2 unzip tar diffutils git-core curl \ openssh-clients perl-Test-Simple perl-IPC-Cmd xz gcc-c++ dejagnu" # Check for OS packages required for the build. @@ -33,7 +33,7 @@ PACKAGES="$CC make m4 git patch curl sha512sum tar unzip" # This is defined as an array of commands and opts, to allow it to be quoted. CHECK_CMD=(command -v) -# $CHECK_CMD should exit with 0 only when checked packages is installed. +# $CHECK_CMD should exit with 0 only when checked package is installed. case "$OS" in windows) # Nothing to actually build on Windows. @@ -54,7 +54,7 @@ case "$OS" in ;; linux*) if [ -x /sbin/apk ]; then - # Assumes Alpine Linux 3.12. + # Assumes Alpine Linux 3.15. CHECK_CMD=(apk info -q -e) PACKAGES="$APK_PKGS" elif [ -x /usr/bin/dpkg ]; then @@ -74,7 +74,7 @@ esac # External checks with various exit codes are checked below. 
diff --git a/pkg_checks.sh b/pkg_checks.sh
index 80d21685..38026f4a 100644
--- a/pkg_checks.sh
+++ b/pkg_checks.sh
@@ -15,15 +15,15 @@
 # On platforms with multiple C compilers, choose by setting CC in os_quirks.sh.
 
 # List of OS packages required for building Python/pyOpenSSL/cryptography etc.
-BASE_PKGS="gcc make m4 patch unzip perl"
+BASE_PKGS="gcc make m4 patch perl"
 if [ "$BUILD_LIBEDIT" = "yes" ]; then
     BASE_PKGS="$BASE_PKGS automake libtool"
 fi
-APK_PKGS="$BASE_PKGS git curl bash musl-dev linux-headers lddtree shadow \
-    openssh-client file unzip g++ musl-locales dejagnu"
-DEB_PKGS="$BASE_PKGS tar diffutils git curl \
+APK_PKGS="$BASE_PKGS git curl bash musl-dev linux-headers lddtree \
+    openssh-client file g++ musl-locales dejagnu"
+DEB_PKGS="$BASE_PKGS unzip tar diffutils git curl \
     openssh-client libtest-simple-perl xz-utils g++ dejagnu"
-RPM_PKGS="$BASE_PKGS tar diffutils git-core curl \
+RPM_PKGS="$BASE_PKGS bzip2 unzip tar diffutils git-core curl \
     openssh-clients perl-Test-Simple perl-IPC-Cmd xz gcc-c++ dejagnu"
 
 # Check for OS packages required for the build.
@@ -33,7 +33,7 @@ PACKAGES="$CC make m4 git patch curl sha512sum tar unzip"
 # This is defined as an array of commands and opts, to allow it to be quoted.
 CHECK_CMD=(command -v)
-# $CHECK_CMD should exit with 0 only when checked packages is installed.
+# $CHECK_CMD should exit with 0 only when checked package is installed.
 
 case "$OS" in
     windows)
         # Nothing to actually build on Windows.
@@ -54,7 +54,7 @@ case "$OS" in
         ;;
     linux*)
         if [ -x /sbin/apk ]; then
-            # Assumes Alpine Linux 3.12.
+            # Assumes Alpine Linux 3.15.
             CHECK_CMD=(apk info -q -e)
             PACKAGES="$APK_PKGS"
         elif [ -x /usr/bin/dpkg ]; then
@@ -74,7 +74,7 @@ esac
 
 # External checks with various exit codes are checked below.
 set +o errexit
-# If $CHECK_CMD is still (command -v), it's only a check for needed commands.
+# If $CHECK_CMD is still "(command -v)", it's only a check for needed commands.
 if [ -n "$PACKAGES" ]; then
     for package in $PACKAGES ; do
         echo "Checking if $package is available..."
diff --git a/pythia.conf b/pythia.conf
index 2446c430..e78a1d03 100644
--- a/pythia.conf
+++ b/pythia.conf
@@ -1,11 +1,14 @@
-PYTHON_CONFIGURATION="default@3.11.3.f9d9434"
+# When building a new major Python version, e.g. 3.11->3.12,
+# update this in advance (e.g. use "default@3.12.0.deadbeef"),
+# and remove BUILD_ENV_* files (e.g. with `./build.sh clean -a`).
+PYTHON_CONFIGURATION="default@3.12.7.ac6595f"
 # This is defined as a Bash array of options to be passed to commands.
-BASE_REQUIREMENTS=("chevah-brink==1.0.13" "paver==1.3.4")
+BASE_REQUIREMENTS=("chevah-brink==1.0.15" "paver==1.3.4")
+# Use our private PyPi server instead of the default one set in pythia.sh.
+PIP_INDEX_URL="https://bin.chevah.com:20443/pypi/simple"
 # Use our production server instead of the GitHub releases set by default.
 BINARY_DIST_URI="https://bin.chevah.com:20443/production"
 # For testing packages, make sure this one is the last uncommented instance:
-#BINARY_DIST_URI="https://bin.chevah.com:20443/testing"
-# Also overwrite the default pypi.org site set by default in pythia.sh.
-PIP_INDEX_URL="https://bin.chevah.com:20443/pypi/simple"
-# This is used by the Python runtime.
+BINARY_DIST_URI="https://bin.chevah.com:20443/testing"
+# This directory is used by the Python runtime.
 CHEVAH_BUILD_DIR="build-py3"
diff --git a/pythia.sh b/pythia.sh
index c833a9ba..cfca7860 100755
--- a/pythia.sh
+++ b/pythia.sh
@@ -40,6 +40,7 @@ set -o nounset    # always check if variables exist
 set -o errexit    # always exit on error
 set -o errtrace   # trap errors in functions as well
 set -o pipefail   # don't ignore exit codes when piping output
+set -o functrace  # inherit DEBUG and RETURN traps
 
 # Initialize default values.
 COMMAND="${1-''}"
@@ -61,6 +62,7 @@ export PATH="$PATH:/sbin:/usr/sbin:/usr/local/bin"
 #
 WAS_PYTHON_JUST_INSTALLED=0
 DIST_FOLDER="dist"
+PYTHIA_VERSION_FILE="lib/PYTHIA_VERSION"
 
 # Path global variables.
 
@@ -76,7 +78,6 @@ CACHE_FOLDER=""
 PYTHON_BIN=""
 PYTHON_LIB=""
-LOCAL_PYTHON_BINARY_DIST=""
 
 # Put default values and create them as global variables.
 OS="not-detected-yet"
@@ -107,7 +108,7 @@ check_source_folder() {
 # Called to trigger the entry point in the virtual environment.
 # Can be overwritten in pythia.conf
 execute_venv() {
-    "$PYTHON_BIN" -c "from paver.tasks import main; main()" "$@"
+    "$PYTHON_BIN" -X utf8 -c "from paver.tasks import main; main()" "$@"
 }
 
 
@@ -143,21 +144,24 @@ source pythia.conf
 
 clean_build() {
     # Shortcut for clear since otherwise it will depend on python
-    echo "Removing $BUILD_FOLDER..."
-    delete_folder "$BUILD_FOLDER"
-    echo "Removing $DIST_FOLDER..."
-    delete_folder "$DIST_FOLDER"
+    echo "Removing $BUILD_FOLDER ..."
+    quick_rm "$BUILD_FOLDER"
+    echo "Removing $DIST_FOLDER ..."
+    quick_rm "$DIST_FOLDER"
     echo "Removing publish/..."
-    delete_folder publish/
+    quick_rm publish/
+    echo "Removing node_modules/ ..."
+    quick_rm node_modules/
 
     # In some case pip hangs with a build folder in temp and
     # will not continue until it is manually removed.
     # On the OSX build server tmp is in $TMPDIR
+    echo "Removing pip* sub-dirs in temp dir..."
     if [ -n "${TMPDIR-}" ]; then
         # check if TMPDIR is set before trying to clean it.
- rm -rf "$TMPDIR"/pip* + quick_rm "$TMPDIR"/pip* else - rm -rf /tmp/pip* + quick_rm /tmp/pip* fi } @@ -171,31 +175,32 @@ _clean_pyc() { # -# Removes the download/pip cache entries. Must be called before +# Removes the download/pip cache folder. Must be called before # building/generating the distribution. # purge_cache() { clean_build - echo "Cleaning download cache ..." - rm -rf "${CACHE_FOLDER:?}"/* + echo "Removing the cache folder..." + quick_rm "$CACHE_FOLDER" } # -# Delete the folder as quickly as possible. +# Deletes a list of files and folders as quick as possible. # -delete_folder() { - local target="$1" +quick_rm() { # On Windows, we use internal command prompt for maximum speed. # See: https://stackoverflow.com/a/6208144/539264 if [ "$OS" = "windows" ]; then - if [ -d "$target" ]; then + for target in "$@"; do cmd //c "del /f/s/q $target > nul" - cmd //c "rmdir /s/q $target" - fi + if [ -d "$target" ]; then + cmd //c "rmdir /s/q $target" + fi + done else - rm -rf "$target" + execute rm -rf "$@" fi } @@ -219,6 +224,24 @@ execute() { fi } + +# +# Checks that the final CACHE_FOLDER is an absolute path. +# Creates it if not existing. +# Works with "C:\" style paths on Windows. +# Assumes current dir is not the top one. +# +check_cache_path() { + execute mkdir -p "$CACHE_FOLDER" + + cd .. + if [ ! -d "$CACHE_FOLDER" ]; then + (>&2 echo "CACHE_PATH \"$CACHE_FOLDER\" is not an absolute path!") + exit 9 + fi + cd - +} + # # Update global variables with current paths. # @@ -234,10 +257,9 @@ update_path_variables() { fi # Read first from env var. - set +o nounset - BUILD_FOLDER="$CHEVAH_BUILD" - CACHE_FOLDER="$CHEVAH_CACHE" - set -o nounset + BUILD_FOLDER="${CHEVAH_BUILD:-}" + # Use CHEVAH_CACHE by default. + CACHE_FOLDER="${CHEVAH_CACHE:-}" if [ -z "$BUILD_FOLDER" ] ; then # Use value from configuration file. @@ -250,20 +272,21 @@ update_path_variables() { fi if [ -z "$CACHE_FOLDER" ] ; then - # Use default if not yet defined. + # Use CHEVAH_CACHE_DIR as a second option. CACHE_FOLDER="$CHEVAH_CACHE_DIR" fi if [ -z "$CACHE_FOLDER" ] ; then - # Use default if not yet defined. - CACHE_FOLDER="cache" + # Use "cache/" in current dir if still not defined. + # Full path to it is needed when unpacking in build folder. + CACHE_FOLDER="$(pwd)/cache" fi + check_cache_path + PYTHON_BIN="$BUILD_FOLDER/$PYTHON_BIN" PYTHON_LIB="$BUILD_FOLDER/$PYTHON_LIB" - LOCAL_PYTHON_BINARY_DIST="$PYTHON_NAME-$OS-$ARCH" - export PYTHONPATH="$BUILD_FOLDER" export CHEVAH_PYTHON="$PYTHON_NAME" export CHEVAH_OS="$OS" @@ -311,12 +334,12 @@ resolve_python_version() { # Install base package. # install_base_deps() { - echo "::groups::Installing base requirements:" "${BASE_REQUIREMENTS[@]}" + echo "::group::Installing base requirements:" "${BASE_REQUIREMENTS[@]}" set +e # There is a bug in pip/setuptools when using custom build folders. # See https://github.com/pypa/pip/issues/3564 - rm -rf "${BUILD_FOLDER:?}"/pip-build + quick_rm "$BUILD_FOLDER"/pip-build "$PYTHON_BIN" -m \ pip install \ --index-url="$PIP_INDEX_URL" \ @@ -338,182 +361,156 @@ install_base_deps() { # set_download_commands() { set +o errexit - if command -v curl > /dev/null; then - # Options not used because of no support in older curl versions: - # --retry-connrefused (since curl 7.52.0) - # --retry-all-errors (since curl 7.71.0) - # Retry 2 times, allocating 10s for the connection phase, - # at most 300s for an attempt, sleeping for 5s between retries. - # Strings wouldn't work when quoted, using Bash arrays instead. 
-        CURL_RETRY_OPTS=(\
-            --retry 2 \
-            --connect-timeout 10 \
-            --max-time 300 \
-            --retry-delay 5 \
-            )
-        DOWNLOAD_CMD=(curl --remote-name --location "${CURL_RETRY_OPTS[@]}")
-        ONLINETEST_CMD=(curl --fail --silent --head "${CURL_RETRY_OPTS[@]}" \
-            --output /dev/null)
-        set -o errexit
-        return
+    if ! command -v curl > /dev/null; then
+        (>&2 echo "Missing curl! It's needed for downloading the Python package.")
+        exit 3
     fi
-    (>&2 echo "Missing curl! It is needed for downloading the Python package.")
-    exit 3
+    set -o errexit
+
+    # Options not used because of no support in older curl versions:
+    #     --retry-connrefused (since curl 7.52.0)
+    #     --retry-all-errors (since curl 7.71.0)
+    # Retry 2 times, allocating 10s for the connection phase,
+    # at most 300s for an attempt, sleeping for 5s between retries.
+    CURL_RETRY_OPTS=(\
+        --retry 2 \
+        --connect-timeout 10 \
+        --max-time 300 \
+        --retry-delay 5 \
+        )
+    CURL_CMD=(curl --location "${CURL_RETRY_OPTS[@]}")
+    ONLINETEST_CMD=(curl --fail --silent --head "${CURL_RETRY_OPTS[@]}" \
+        --output /dev/null)
 }
 
 #
-# Download and extract a binary distribution.
+# Put the Pythia package in the cache folder if missing.
+# Requires as parameter the filename of the package to download.
 #
-get_binary_dist() {
-    local dist_name="$1"
-    local remote_base_url="$2"
-
-    echo "Getting $dist_name from $remote_base_url..."
+get_pythia_package() {
+    local python_pkg_file="$1"
+    local package_url="$BINARY_DIST_URI/$PYTHON_VERSION/$python_pkg_file"
+    local cached_pkg_file="$CACHE_FOLDER/$python_pkg_file"
+    local random_id="${HOSTNAME:=nohostname}-${BUILD_ID:=nobuild}-$RANDOM"
+    local curl_tmpdir="/tmp"
+    if [ -n "${TMPDIR-}" ]; then
+        local curl_tmpdir="$TMPDIR"
+    fi
+    local curl_tmpfile="$curl_tmpdir"/"$python_pkg_file"."$random_id".tmp
 
-    tar_gz_file="$dist_name".tar.gz
-    tar_file="$dist_name".tar
+    if [ -e "$cached_pkg_file" ]; then
+        # To optimize output, only the package filename is shown here.
+        # Full path of the cached package file is shown when unpacking.
+        echo "Found \"$python_pkg_file\" in cache. Updating its timestamp ..."
+        execute date +%s > "$cached_pkg_file".last-access
+        return
+    fi
 
-    mkdir -p "$CACHE_FOLDER"
-    pushd "$CACHE_FOLDER"
+    echo "File \"$python_pkg_file\" not found in cache, getting it from:"
+    echo "$package_url"
 
-    # Get and extract archive.
-    rm -rf "$dist_name"
-    rm -f "$tar_gz_file"
-    rm -f "$tar_file"
-    execute "${DOWNLOAD_CMD[@]}" "$remote_base_url"/"$tar_gz_file"
-    execute gunzip -f "$tar_gz_file"
-    execute tar -xf "$tar_file"
-    rm -f "$tar_gz_file"
-    rm -f "$tar_file"
+    set +o errexit
+    if ! "${ONLINETEST_CMD[@]}" "$package_url"; then
+        (>&2 echo "Couldn't find package on remote server!")
+        exit 4
+    fi
+    set -o errexit
 
-    popd
+    # Requested Pythia package is available online, get it.
+    execute "${CURL_CMD[@]}" "$package_url" --output "$curl_tmpfile"
+    execute mv "$curl_tmpfile" "$cached_pkg_file"
+    execute date +%s > "$cached_pkg_file".last-access
 }
 
 #
-# Check if we have a versioned Python distribution.
+# Unpack TAR.GZ file given as parameter 2 in sub-dir given as parameter 1.
+# Remaining parameters must be long opts to append to the tar command.
+# If failing to unpack it, remove the cached TAR.GZ file.
 #
-test_version_exists() {
-    local remote_base_url="$1"
-    local target_file="python-$PYTHON_VERSION-$OS-$ARCH.tar.gz"
+unpack_in_dir(){
+    local dir="$1"; shift
+    local pkg="$1"; shift
+    local long_tar_opts=( "$@" )
+
+    if [ "$OS" = "windows" ]; then
+        # GNU tar considers Windows paths containing ":" as network paths.
+        long_tar_opts+=( --force-local )
+    elif [ "$OS" = "macos" ]; then
+        # The old Bash on macOS requires at least one element.
+        long_tar_opts+=( --no-mac-metadata )
+    fi
 
-    echo "Checking $remote_base_url/$PYTHON_VERSION/$target_file..."
-    "${ONLINETEST_CMD[@]}" "$remote_base_url"/"$PYTHON_VERSION"/"$target_file"
-    return $?
+    echo "Unpacking in \"$dir\" the archive \"$pkg\" ..."
+    execute mkdir -p "$dir"
+    pushd "$dir"
+    set +o errexit
+    if ! tar xfz "$pkg" "${long_tar_opts[@]}"; then
+        (>&2 echo "Couldn't unpack! Removing ${pkg}* ...")
+        quick_rm "$pkg"*
+        # Unpacking an incomplete archive may leave files on disk.
+        if [ -n "${TARGET_OS-}" ]; then
+            quick_rm "$PYTHON_NAME-$TARGET_OS-$TARGET_ARCH"
+        else
+            quick_rm "$PYTHON_NAME-$OS-$ARCH"
+        fi
+        exit 6
+    fi
+    set -o errexit
+    popd
 }
 
 #
-# Download and extract in cache the python distributable.
+# Check if a Python distribution with desired version is present in a folder.
+# If yes, returns successfully.
+# If not, removes the folder to check.
+# Requires as parameter the target folder to check.
 #
-get_python_dist() {
-    local remote_base_url="$1"
-    local python_distributable="python-$PYTHON_VERSION-$OS-$ARCH"
-    local onlinetest_errorcode
+check_python_version() {
+    local python_dist_dir="$1"
+    local python_installed_version
+    local version_file="$python_dist_dir"/"$PYTHIA_VERSION_FILE"
 
-    set +o errexit
-    test_version_exists "$remote_base_url"
-    onlinetest_errorcode="$?"
-    set -o errexit
+    if [ ! -d "$python_dist_dir" ]; then
+        echo "Missing \"$python_dist_dir\" folder. Rebuilding ..."
+        # Make sure there's no file with the same name.
+        quick_rm "$python_dist_dir"
+        return 1
+    fi
 
-    if [ $onlinetest_errorcode -eq 0 ]; then
-        # We have the requested python version.
-        get_binary_dist "$python_distributable" \
-            "$remote_base_url"/"$PYTHON_VERSION"
-    else
-        (>&2 echo "Couldn't find package on remote server. Full link:")
-        echo "$remote_base_url/$PYTHON_VERSION/$python_distributable.tar.gz"
-        exit 4
+    # This fails gracefully (but with a visible error) if the file is missing.
+    python_installed_version="$(cut -d- -f1 < "$version_file")"
+    if [ "$python_installed_version" = "$PYTHON_VERSION" ]; then
+        return 0
     fi
-}
 
+    # These messages are aligned on purpose.
+    echo "Found previous Python version: $python_installed_version"
+    echo "New Python version to install: $PYTHON_VERSION"
+    echo "Removing $python_dist_dir ..."
+    quick_rm "$python_dist_dir"
+    return 2
+}
 
-# copy_python can be called in a recursive way, and this is here to prevent
-# accidental infinite loops.
-COPY_PYTHON_RECURSIONS=0
 #
-# Copy python to build folder from binary distribution.
+# Initiate build folder with extracted Pythia if required version not found.
 #
-copy_python() {
-    local python_distributable="$CACHE_FOLDER/$LOCAL_PYTHON_BINARY_DIST"
-    local python_installed_version
-
-    COPY_PYTHON_RECURSIONS="$((COPY_PYTHON_RECURSIONS + 1))"
-
-    if [ "$COPY_PYTHON_RECURSIONS" -gt 2 ]; then
-        (>&2 echo "Too many calls to copy_python: $COPY_PYTHON_RECURSIONS")
-        exit 5
-    fi
-
-    # Check that python dist was installed
-    if [ ! -s "$PYTHON_BIN" ]; then
-        # We don't have a Python binary, so we install it since everything
-        # else depends on it.
-        echo "::group::Get Python"
-        echo "Bootstrapping $LOCAL_PYTHON_BINARY_DIST environment" \
-            "to $BUILD_FOLDER..."
-        mkdir -p "$BUILD_FOLDER"
-
-        if [ -d "$python_distributable" ]; then
-            # We have a cached distributable.
-            # Check if is at the right version.
-            local cache_ver_file
-            cache_ver_file="$python_distributable"/lib/PYTHIA_VERSION
-            cache_version="UNVERSIONED"
-            if [ -f "$cache_ver_file" ]; then
-                cache_version="$(cut -d"-" -f1 < "$cache_ver_file")"
-            fi
-            if [ "$PYTHON_VERSION" != "$cache_version" ]; then
-                # We have a different version in the cache.
-                # Just remove it and hope that the next step will download
-                # the right one.
-                rm -rf "$python_distributable"
-            fi
-        fi
-
-        if [ ! -d "$python_distributable" ]; then
-            # We don't have a cached python distributable.
-            echo "No $LOCAL_PYTHON_BINARY_DIST environment." \
-                "Start downloading it..."
-            get_python_dist "$BINARY_DIST_URI"
-        fi
-
-        echo "Copying Python distribution files... "
-        cp -R "$python_distributable"/* "$BUILD_FOLDER"
+bootstrap_build_folder() {
+    local extra_tar_opts=( --strip-components 1 )
 
-        echo "::endgroup::"
+    # Return if set Python version is already present.
+    check_python_version "$BUILD_FOLDER" && return
 
-        install_base_deps
-        WAS_PYTHON_JUST_INSTALLED=1
-    else
-        # We have a Python, but we are not sure if is the right version.
-        local version_file="$BUILD_FOLDER"/lib/PYTHIA_VERSION
-
-        # If we are upgrading the cache from Python 2,
-        # This file is required, so we create it if non-existing.
-        touch "$version_file"
-        python_installed_version="$(cut -d"-" -f1 < "$version_file")"
-        if [ "$PYTHON_VERSION" != "$python_installed_version" ]; then
-            # We have a different python installed.
-            # Check if we have the to-be-updated version and fail if
-            # it does not exists.
-            set +o errexit
-            test_version_exists "$BINARY_DIST_URI"
-            local test_version="$?"
-            set -o errexit
-            if [ $test_version -ne 0 ]; then
-                (>&2 echo "The build is now at $python_installed_version.")
-                (>&2 echo "Failed to find the required $PYTHON_VERSION.")
-                (>&2 echo "Check your configuration or the remote server.")
-                exit 6
-            fi
+    echo "::group::Get Python"
+    echo "Bootstrapping \"$PYTHON_NAME-$OS-$ARCH\" in \"$BUILD_FOLDER\" ..."
+    execute mkdir -p "$BUILD_FOLDER"
+    get_pythia_package "python-$PYTHON_VERSION-$OS-$ARCH.tar.gz"
+    unpack_in_dir "$BUILD_FOLDER" \
+        "$CACHE_FOLDER"/"python-$PYTHON_VERSION-$OS-$ARCH.tar.gz" \
+        "${extra_tar_opts[@]}"
+    echo "::endgroup::"
 
-            # Remove it and try to install it again.
-            echo "Updating Python from" \
-                "$python_installed_version to $PYTHON_VERSION"
-            rm -rf "${BUILD_FOLDER:?}"/*
-            rm -rf "$python_distributable"
-            copy_python
-        fi
-    fi
+    install_base_deps
+    WAS_PYTHON_JUST_INSTALLED=1
 }
 
 
@@ -648,9 +645,6 @@ check_glibc_version(){
     # beware we haven't normalized arch names yet.
     supported_glibc2_version=26
 
-    echo "No specific runtime for the current distribution / version / arch."
-    echo "Minimum glibc version for this arch: 2.$supported_glibc2_version."
-
     # Tested with glibc 2.5/2.11.3/2.12/2.23/2.28-35 and eglibc 2.13/2.19.
     glibc_version="$(head -n 1 "$ldd_output_file" | rev | cut -d" " -f1 | rev)"
     rm "$ldd_output_file"
@@ -670,10 +664,9 @@ check_glibc_version(){
 
     # Decrement supported_glibc2_version above if building against older glibc.
     if [ "${glibc_version_array[1]}" -lt "$supported_glibc2_version" ]; then
+        echo "Minimum glibc version for this arch: 2.$supported_glibc2_version."
        (>&2 echo "NOT good. Detected version is older: $glibc_version!")
        exit 22
-    else
-        echo "All is good. Detected glibc version: $glibc_version."
     fi
 
     # Supported glibc version detected, set $OS for a generic glibc Linux build.
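The glibc check above and the musl check below both split the version string
reported by ldd on dots and compare the fields numerically. A standalone sketch
of that parsing, using "1.2.4_git20230717" as an assumed sample string rather
than output captured from this patch:

    #!/usr/bin/env bash
    # Sketch only: clean a musl-style version string and split it into an array.
    musl_version="1.2.4_git20230717"
    musl_version_cleaned="${musl_version//_git/.}"

    IFS=. read -r -a musl_version_array <<< "$musl_version_cleaned"

    # Fields: major=1, minor=2, patch=4; the git suffix becomes a 4th field.
    echo "major=${musl_version_array[0]} minor=${musl_version_array[1]}" \
        "patch=${musl_version_array[2]}"

    if [ "${musl_version_array[1]}" -ge 2 ]; then
        echo "musl 1.2 or newer detected"
    fi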
@@ -682,37 +675,37 @@ check_glibc_version(){ check_musl_version(){ local musl_version + local musl_version_cleaned local musl_version_array local musl_version_unsupported="false" - local supported_musl11_version=24 + local supported_musl12_version=2 - echo "No specific runtime for the current distribution / version / arch." - echo "Minimum musl version for this arch: 1.1.$supported_musl11_version." - - # Tested with musl 1.1.24/1.2.2/1.2.5/1.2.5_git20240512. + # Tested with musl 1.1.24/1.2.2/1.2.4_git20230717/1.2.5. musl_version="$(grep -E ^"Version" "$ldd_output_file" | cut -d" " -f2)" rm "$ldd_output_file" - # Bleeding edge distributions might use git versions. - musl_version="${musl_version//_git/.}" + # Some Alpine Linux releases (e.g. Alpine 3.19) use git-versioned musl. + musl_version_cleaned="${musl_version//_git/.}" - if [[ "$musl_version" =~ [^[:digit:]\.] ]]; then - (>&2 echo "Musl version should only have digits and dots, but:") - (>&2 echo " \$musl_version=$musl_version") + if [[ "$musl_version_cleaned" =~ [^[:digit:]\.] ]]; then + (>&2 echo "Cleaned musl version should only have digits and dots, but") + (>&2 echo " \$musl_version_cleaned=$musl_version_cleaned") exit 25 fi - IFS=. read -r -a musl_version_array <<< "$musl_version" + IFS=. read -r -a musl_version_array <<< "$musl_version_cleaned" - # Decrement supported_musl11_version above if building against older musl. + # Decrement supported_musl12_version above if building against older musl. if [ "${musl_version_array[0]}" -lt 1 ]; then musl_version_unsupported="true" - elif [ "${musl_version_array[0]}" -eq 1 ]; then + elif [ "${musl_version_array[0]}" -eq 1 ]; then if [ "${musl_version_array[1]}" -lt 1 ];then musl_version_unsupported="true" elif [ "${musl_version_array[1]}" -eq 1 ];then - if [ "${musl_version_array[2]}" -lt "$supported_musl11_version" ] + if [ "${musl_version_array[2]}" -lt "$supported_musl12_version" ] then + echo -n "Minimum musl version for this arch: 1.2." + echo "$supported_musl12_version." (>&2 echo "NOT good. Detected version is older: $musl_version!") exit 27 fi @@ -720,12 +713,10 @@ check_musl_version(){ fi if [ "$musl_version_unsupported" = "true" ]; then - (>&2 echo "Only musl 1.1 or greater supported! Detected: $musl_version") + (>&2 echo "Only musl 1.2 or greater supported! Detected: $musl_version") exit 26 fi - echo "All is good. Detected musl version: $musl_version." - # Supported musl version detected, set $OS for a generic musl Linux build. OS="linux_musl" } @@ -826,16 +817,33 @@ if [ "$COMMAND" = "detect_os" ]; then exit 0 fi +# +# This is here to help generate distributables for other platforms. +# It creates a fresh Pythia Python virtual environment for the +# required platform in a specified destination path. if [ "$COMMAND" = "get_python" ] ; then - OS="$2" - ARCH="$3" + TARGET_OS="$2" + TARGET_ARCH="$3" + TARGET_PATH="$4" resolve_python_version - get_python_dist "$BINARY_DIST_URI" - exit 0 + + # Make sure the target path is clean even when executed directly. + quick_rm "$TARGET_PATH" + + echo -n "Initializing a \"$PYTHON_NAME-$TARGET_OS-$TARGET_ARCH\" " + echo "distribution in \"$TARGET_PATH\" ..." + # Get the needed tar.gz file if not found in cache. + get_pythia_package "python-$PYTHON_VERSION-$TARGET_OS-$TARGET_ARCH.tar.gz" + + # Unpack the Python distribution in the cache folder. 
+ unpack_in_dir "$TARGET_PATH" \ + "$CACHE_FOLDER"/"python-$PYTHON_VERSION-$TARGET_OS-$TARGET_ARCH.tar.gz"\ + --strip-components 1 + exit fi check_source_folder -copy_python +bootstrap_build_folder install_dependencies # Update pythia.conf dependencies when running deps. diff --git a/requirements.txt b/requirements.txt new file mode 100644 index 00000000..98f3ce6c --- /dev/null +++ b/requirements.txt @@ -0,0 +1,5 @@ +pip==24.2 +psutil==6.0.0 +pycparser==2.22 +pywin32==306 +setuptools==70.3.0 diff --git a/src/Python-Windows/chevahbs b/src/Python-Windows/chevahbs index fae717c5..6cbc7609 100755 --- a/src/Python-Windows/chevahbs +++ b/src/Python-Windows/chevahbs @@ -20,7 +20,13 @@ INSTALL_DIR="$3"/lib # Construct the relevant download link for the Windows version. EXT="amd64.zip" ARCHIVE="python-$VER.$EXT" -LINK="https://www.python.org/ftp/python/$VER/python-$VER-embed-$EXT" + +# For beta versions, remove everything after the b for the dir name. +DIR=${VER%%b*} +# For rc versions, remove everything after rc for the dir name +DIR=${VER%%rc*} + +LINK="https://www.python.org/ftp/python/$DIR/python-$VER-embed-$EXT" # Download without unpacking (amd64.zip extension is treated specially). download_sources "$NAME" "$VER" "$LINK" "$EXT" @@ -30,5 +36,5 @@ echo "## Extracting ZIP archive to $INSTALL_DIR... ##" execute unzip -q "$ARCHIVE" -d "$INSTALL_DIR" echo "## Adding site-packages to python*._pth file... ##" -echo "import site" >> "$INSTALL_DIR"/python311._pth -echo "site.main()" >> "$INSTALL_DIR"/python311._pth +echo "import site" >> "$INSTALL_DIR"/python312._pth +echo "site.main()" >> "$INSTALL_DIR"/python312._pth diff --git a/src/Python-Windows/sha512.sum b/src/Python-Windows/sha512.sum index 387f8a07..aa3aa085 100644 --- a/src/Python-Windows/sha512.sum +++ b/src/Python-Windows/sha512.sum @@ -1 +1 @@ -24106fc0e2a798c34840222083f2643c95a0e14c1eb8bb5489dfbcc3eb56d631cc0649c406d75eae173e2d03cd1cf88d246e5903abf62d414fab88e94afa176e python-3.11.9.amd64.zip +2f67a8487a9edece26b73aab27e75249e538938ad976d371a9411b54dbae20afeac82b406ad4eeee38b1cf6f407e7620679d30c0fff82ec8e8ae62268c322d59 python-3.12.7.amd64.zip diff --git a/src/Python/chevahbs b/src/Python/chevahbs index 4c1e7dd8..b73c7551 100755 --- a/src/Python/chevahbs +++ b/src/Python/chevahbs @@ -15,17 +15,20 @@ chevahbs_getsources() { local name="$1" local ver="$2" local ext="tgz" - local link=https://www.python.org/ftp/python/"$ver"/"$name"-"$ver"."$ext" + local dir + local link + + # For beta versions, remove everything after the b for the dir name. + dir=${ver%%b*} + # For rc versions, remove everything after rc for the dir name. + dir=${ver%%rc*} + link=https://www.python.org/ftp/python/"$dir"/"$name"-"$ver"."$ext" download_sources "$name" "$ver" "$link" "$ext" } chevahbs_patch() { - # Our own patch to avoid compiling certain modules. - echo "# Applying disabled_modules.patch... #" - execute patch < disabled_modules.patch - if [ "$BUILD_LIBEDIT" = "yes" ]; then # Adapted from https://bugs.python.org/issue13501. echo "# Applying readline_libedit.patch... 
#" diff --git a/src/Python/sha512.sum b/src/Python/sha512.sum index 398fd287..ce610699 100644 --- a/src/Python/sha512.sum +++ b/src/Python/sha512.sum @@ -1 +1 @@ -33651f373fcc6da9aac895da25300de0f807e3643daff99e8c81e6205d75c67a060d7fd7bbee5ce8b5e142aa2f407284d90b182b5fffd9befc54e6b55b7c08f7 Python-3.11.9.tgz +794128ef8a8129915a9a004e87d747d2610df0c24d201c17f43d72ce5a71fde6953df1985d3d883d6e751db2dd994be5e4ba8148eb2ce1fb5c12de78b6b48842 Python-3.12.7.tgz diff --git a/src/chevah-python-tests/test_python_binary_dist.py b/src/chevah-python-tests/test_python_binary_dist.py index f72e82ec..aa41128e 100644 --- a/src/chevah-python-tests/test_python_binary_dist.py +++ b/src/chevah-python-tests/test_python_binary_dist.py @@ -34,7 +34,7 @@ def get_allowed_deps(): elif 'linux' in CHEVAH_OS: # Deps without paths for generic glibc Linux builds. # Only glibc 2.x libs are allowed. - # Tested on Amazon 2 & Ubuntu 16.04/18.04 with glibc 2.26/2.23/2.27. + # Tested on Amazon 2 & Ubuntu 20.04/22.04 with glibc 2.26/2.31/2.35. allowed_deps=[ 'libc.so.6', 'libcrypt.so.1', @@ -139,9 +139,11 @@ def get_allowed_deps(): '/System/Library/Frameworks/Security.framework/Versions/A/Security', '/System/Library/Frameworks/SystemConfiguration.framework/Versions/A/SystemConfiguration', '/usr/lib/libbz2.1.0.dylib', + '/usr/lib/libedit.3.dylib', '/usr/lib/libffi.dylib', '/usr/lib/libiconv.2.dylib', '/usr/lib/libncurses.5.4.dylib', + '/usr/lib/libpanel.5.4.dylib', '/usr/lib/libresolv.9.dylib', '/usr/lib/libSystem.B.dylib', '/usr/lib/libz.1.dylib', @@ -336,15 +338,15 @@ def main(): else: print('zlib %s' % (zlib.ZLIB_VERSION,)) + # Check OpenSSL version to prevent linking to OS libs. + # On Windows, this version is what upstream embedded with Python. try: from ssl import OPENSSL_VERSION as current_openssl_version import _hashlib exit_code = egg_check(_hashlib) | exit_code - # Check OpenSSL version to prevent linking to OS libs. - expecting_openssl_version = u'OpenSSL 3.0.14 4 Jun 2024' + expecting_openssl_version = u'OpenSSL 3.0.15 3 Sep 2024' if CHEVAH_OS == "windows": - # The upstream Windows packages embed their own OpenSSL libs. - expecting_openssl_version = u'OpenSSL 3.0.13 30 Jan 2024' + expecting_openssl_version = u'OpenSSL 3.0.15 3 Sep 2024' if current_openssl_version != expecting_openssl_version: sys.stderr.write('Expecting %s, got %s.\n' % ( expecting_openssl_version, current_openssl_version)) diff --git a/src/openssl/sha512.sum b/src/openssl/sha512.sum index 6334693b..15b780fa 100644 --- a/src/openssl/sha512.sum +++ b/src/openssl/sha512.sum @@ -1 +1 @@ -1c59c01e60da902a20780d71f1fa5055d4037f38c4bc3fb27ed5b91f211b36a6018055409441ad4df58b5e9232b2528240d02067272c3c9ccb8c221449ca9ac0 openssl-3.0.14.tar.gz +acd80f2f7924d90c1416946a5c61eff461926ad60f4821bb6b08845ea18f8452fd5e88a2c2c5bd0d7590a792cb8341a3f3be042fd0a5b6c9c1b84a497c347bbf openssl-3.0.15.tar.gz diff --git a/src/xz/chevahbs b/src/xz/chevahbs index 8001eae1..4748d0db 100755 --- a/src/xz/chevahbs +++ b/src/xz/chevahbs @@ -37,7 +37,9 @@ chevahbs_try() { chevahbs_cp() { - execute "${MAKE_CMD[@]}" install DESTDIR="$INSTALL_DIR" + # xz's installation copies binaries too, but there's no need for them. + execute cp -R src/liblzma/api/lzma* "$INSTALL_DIR"/include/ + execute cp src/liblzma/.libs/liblzma.a "$INSTALL_DIR"/lib/ }