apacheGH-30622: [GAC-11] Put Timestream ODBC driver into Arrow repository without code style fixes #7

Merged: alinaliBQ merged 14 commits into GAC-odbc-driver from alinaliBQ/GAC-11/timestream-odbc-no-code-lint-fixes on Nov 28, 2023. (Repository is a fork of apache/arrow.)
Commits (14):

- 0a9ea48 Initial commit (amazon-auto)
- c7745ed code uploading (RoyZhang2022)
- 0d96e36 fix cppcheck failure (#1) (RoyZhang2022)
- 75fd00f Fix wrong linux package name and remove unneeded files (#2) (RoyZhang2022)
- 63b510a Fix floating intergration test issue, document update and github auto… (RoyZhang2022)
- c87c4c8 fix bracket missing (#5) (RoyZhang2022)
- 38b4684 Pass driver info to Timestream using user agent (#6) (RoyZhang2022)
- 33d3cb0 Fixing awslabs links. (#8) (forestmvey)
- e93f8f8 Move AWS TS ODBC Driver under flight sql subdirectory (alinaliBQ)
- 79efb7f Enable Windows builds, workflow files are based on Amazon Timestream … (alinaliBQ)
- ad079e6 Fix CMAKE code formats (alinaliBQ)
- c57f7b0 Fix python code doc errors (alinaliBQ)
- 1e351d3 Enable linux build, the file is copied from Amazon Timestream ODBC seed (alinaliBQ)
- 4ac9ff1 Enable macOS builds, workflow files are copied from Amazon TS ODBC seed (alinaliBQ)
New file: linux-build.yml (+280 lines)
```yaml
name: ODBC Driver for Linux

on:
  push:
    paths:
      - '.github/workflows/win-build.yml'
      - 'cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver/**'
  pull_request:
    paths:
      - '.github/workflows/win-build.yml'
      - 'cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver/**'

env:
  CI_OUTPUT_PATH: "ci-output"
  ODBC_LIB_PATH: "${{github.workspace}}/cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver/build/odbc/lib"
  ODBC_BIN_PATH: "${{github.workspace}}/cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver/build/odbc/bin"
  ODBC_BUILD_PATH: "${{github.workspace}}/cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver/build/odbc/build"
  TIMESTREAM_LOG_PATH: "${{github.workspace}}/cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver/build/odbc/logs"
  TIMESTREAM_LOG_LEVEL: "4"
  BIG_TABLE_PAGINATION_TEST_ENABLE: "TRUE"

  # AAD test environment variables. Uncomment to use GitHub secrets and enable AAD integration tests.
  # ENABLE_AAD_TEST: "TRUE"
  # AAD_APP_ID: ${{secrets.AAD_APP_ID}}
  # AAD_ROLE_ARN: ${{secrets.AAD_ROLE_ARN}}
  # AAD_IDP_ARN: ${{secrets.AAD_IDP_ARN}}
  # AAD_TENANT: ${{secrets.AAD_TENANT}}
  # AAD_USER: ${{secrets.AAD_USER}}
  # AAD_USER_PWD: ${{secrets.AAD_USER_PWD}}
  # AAD_CLIENT_SECRET: ${{secrets.AAD_CLIENT_SECRET}}

  # Okta test environment variables. Uncomment to use GitHub secrets and enable Okta integration tests.
  # ENABLE_OKTA_TEST: "TRUE"
  # OKTA_HOST: ${{secrets.OKTA_HOST}}
  # OKTA_USER: ${{secrets.OKTA_USER}}
  # OKTA_USER_PWD: ${{secrets.OKTA_USER_PWD}}
  # OKTA_APP_ID: ${{secrets.OKTA_APP_ID}}
  # OKTA_ROLE_ARN: ${{secrets.OKTA_ROLE_ARN}}
  # OKTA_IDP_ARN: ${{secrets.OKTA_IDP_ARN}}
```
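The commented-out blocks are opt-in: removing the leading `#` wires the listed GitHub secrets into the job environment, and the integration tests are expected to read the `ENABLE_*` toggles at runtime. The diff does not show how the test binary consumes them; a hypothetical sketch, with both the guard and the suite filter assumed rather than taken from the driver sources:

```sh
# Hypothetical: run the IdP-specific suites only when explicitly enabled.
# ENABLE_AAD_TEST comes from the (uncommented) env block above; the
# --run_test filter is illustrative, not a suite name from this PR.
if [ "${ENABLE_AAD_TEST:-}" = "TRUE" ]; then
  ./build/odbc/bin/timestream-odbc-integration-tests \
    --catch_system_errors=false --run_test='*AAD*'
fi
```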
```yaml
jobs:
  build-linux64:
    runs-on: ubuntu-20.04
    steps:
      - uses: actions/checkout@v2

      - name: Get latest version of CMake
        uses: lukka/get-cmake@latest

      - name: run-cppcheck
        run: |
          sudo apt install cppcheck
          cd cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver
          sh run_cppcheck.sh

      - name: upload-cppcheck-results
        if: failure()
        uses: actions/upload-artifact@v2
        with:
          name: cppcheck-results
          path: cppcheck-results.log
```
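`run_cppcheck.sh` is invoked here but not included in this diff. A plausible minimal wrapper, consistent with the `cppcheck-results.log` file the failure-only upload step expects (the cppcheck flags are assumptions):

```sh
#!/bin/sh
# Hypothetical wrapper: lint the driver sources, capture findings in the
# log file the artifact step uploads, and exit non-zero on any finding
# so the workflow step fails and triggers the if: failure() upload.
cppcheck --enable=warning,style --error-exitcode=1 src/ 2> cppcheck-results.log
```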
```yaml
      - name: get-dependencies
        if: success()
        run: |
          sudo apt update
          sudo apt install libcurl4-openssl-dev libssl-dev uuid-dev zlib1g-dev libpulse-dev linux-headers-$(uname -r) gcc gcc-multilib g++ g++-multilib build-essential valgrind libboost-all-dev libsasl2-dev lcov

      - name: configure-and-build-driver
        run: |
          cd cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver
          chmod +x build_linux_release64_deb.sh
          chmod +x scripts/build_linux.sh
          ./build_linux_release64_deb.sh

      - name: update-environment-with-ODBC_DRIVER_VERSION
        run: |
          cd cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver
          read -r ODBC_DRIVER_VERSION < ./src/ODBC_DRIVER_VERSION.txt
          echo "ODBC_DRIVER_VERSION=$ODBC_DRIVER_VERSION" >> $GITHUB_ENV
```
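The `update-environment-with-ODBC_DRIVER_VERSION` step relies on the standard `$GITHUB_ENV` mechanism: every `KEY=VALUE` line appended to that file becomes an environment variable in all subsequent steps of the same job, which is how `${{env.ODBC_DRIVER_VERSION}}` can parameterize the artifact name in the next step. The two lines in isolation:

```sh
# read -r consumes exactly one line (the first) of the version file,
# without backslash-escape processing.
read -r ODBC_DRIVER_VERSION < ./src/ODBC_DRIVER_VERSION.txt
# Appending to the file behind $GITHUB_ENV exports the variable to every
# later step in this job; it does not cross job boundaries.
echo "ODBC_DRIVER_VERSION=$ODBC_DRIVER_VERSION" >> "$GITHUB_ENV"
```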
```yaml
      - name: upload-package
        uses: actions/upload-artifact@v3
        with:
          name: AmazonTimestreamODBC_${{env.ODBC_DRIVER_VERSION}}_RELEASE_amd64
          path: cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver/cmake-build64/AmazonTimestreamODBC_${{env.ODBC_DRIVER_VERSION}}_RELEASE_amd64.deb

      - name: register-odbc-driver
        run: |
          cd cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver
          chmod +r -R ${{env.ODBC_LIB_PATH}}
          chmod +x scripts/register_driver_unix.sh
          sudo bash scripts/register_driver_unix.sh
```
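`scripts/register_driver_unix.sh` is likewise referenced but not part of this diff. On Linux, ODBC drivers are typically registered through unixODBC's `odbcinst` tool; a hypothetical sketch of what such a script might do (the template contents and the shared-library name are assumptions):

```sh
#!/bin/bash
# Hypothetical registration: describe the driver in an .ini template ...
cat > /tmp/timestream-odbc-install.ini <<EOF
[Amazon Timestream ODBC Driver]
Description=Amazon Timestream ODBC Driver
Driver=${ODBC_LIB_PATH}/libtimestream-odbc.so
EOF
# ... then have odbcinst merge it into the system-wide odbcinst.ini.
odbcinst -i -d -f /tmp/timestream-odbc-install.ini
```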
```yaml
      - name: run-unit-tests
        id: rununittests
        run: |
          mkdir -p "${{env.TIMESTREAM_LOG_PATH}}"
          ./cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver/build/odbc/bin/timestream-odbc-unit-tests --catch_system_errors=false

      - name: upload-unit-test-report
        if: always()
        uses: EnricoMi/publish-unit-test-result-action/[email protected]
        with:
          check_name: "Ubuntu 20.04 Build Unit Test Results Check"
          comment_title: "Ubuntu 20.04 Build Unit Test Results"
          files: ./odbc_unit_test_result.xml

      - name: upload-odbc-unit-test-file
        if: always() && (steps.rununittests.outcome == 'failure')
        uses: actions/upload-artifact@v2
        with:
          name: odbc-unit-test-results
          path: |
            ./odbc_unit_test_result.xml
            ./cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver/build/odbc/logs/timestream_odbc_*.log
            ./aws_sdk_*.log
```
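Two conditions recur in the upload steps and do different jobs: `always()` forces a step to run even after an earlier step in the job has failed (by default, a failure skips everything that follows), while `steps.rununittests.outcome == 'failure'` narrows the artifact upload to runs where the unit-test step itself failed, so logs are only collected when they are actually useful.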
```yaml
      # TODO: re-enable integration tests
      # - name: run-odbc-tests
      #   id: runodbctests
      #   run: |
      #     ./cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver/build/odbc/bin/timestream-odbc-integration-tests --catch_system_errors=false

      - name: upload-odbc-test-report
        if: always()
        uses: EnricoMi/publish-unit-test-result-action/[email protected]
        with:
          check_name: "Ubuntu 20.04 Build Integration Test Results Check"
          comment_title: "Ubuntu 20.04 Build Integration Test Results"
          files: ./odbc_test_result.xml

      - name: upload-odbc-test-file
        if: always() && (steps.runodbctests.outcome == 'failure')
        uses: actions/upload-artifact@v2
        with:
          name: odbc-test-results
          path: |
            ./odbc_test_result.xml
            ./build/odbc/logs/timestream_odbc_*.log
            ./aws_sdk_*.log

  build-linux64-debug:
    runs-on: ubuntu-20.04
    steps:
      - uses: actions/checkout@v2

      - name: Get gcovr
        run: |
          pip install gcovr

      - name: Get latest version of CMake
        uses: lukka/get-cmake@latest

      - name: run-cppcheck
        run: |
          sudo apt install cppcheck
          cd cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver
          sh run_cppcheck.sh

      - name: upload-cppcheck-results
        if: failure()
        uses: actions/upload-artifact@v2
        with:
          name: cppcheck-results
          path: cppcheck-results.log

      - name: get-dependencies
        if: success()
        run: |
          sudo apt update
          sudo apt install libcurl4-openssl-dev libssl-dev uuid-dev zlib1g-dev libpulse-dev linux-headers-$(uname -r) gcc gcc-multilib g++ g++-multilib build-essential valgrind libboost-all-dev libsasl2-dev lcov

      - name: configure-and-build-driver
        run: |
          cd cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver
          chmod +x build_linux_debug64_deb.sh
          chmod +x scripts/build_linux.sh
          ./build_linux_debug64_deb.sh

      - name: update-environment-with-ODBC_DRIVER_VERSION
        run: |
          cd cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver
          read -r ODBC_DRIVER_VERSION < ./src/ODBC_DRIVER_VERSION.txt
          echo "ODBC_DRIVER_VERSION=$ODBC_DRIVER_VERSION" >> $GITHUB_ENV

      - name: upload-package
        uses: actions/upload-artifact@v3
        with:
          name: AmazonTimestreamODBC_${{env.ODBC_DRIVER_VERSION}}_DEBUG_amd64
          path: cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver/cmake-build64/AmazonTimestreamODBC_${{env.ODBC_DRIVER_VERSION}}_DEBUG_amd64.deb

      - name: register-odbc-driver
        run: |
          cd cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver
          chmod +r -R ${{env.ODBC_LIB_PATH}}
          chmod +x scripts/register_driver_unix.sh
          sudo bash scripts/register_driver_unix.sh

      # run unit tests
      - name: run-unit-tests
        id: rununittests
        run: |
          mkdir -p "${{env.TIMESTREAM_LOG_PATH}}"
          ./cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver/build/odbc/bin/timestream-odbc-unit-tests --catch_system_errors=false
          gcovr --exclude-directories=cmake-build64/tests/integration-test/CMakeFiles/timestream-odbc-integration-tests.dir$ --exclude-directories=cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver/cmake-build64/tests/unit-test/CMakeFiles/timestream-odbc-unit-tests.dir$ --cobertura --output coverage.cobertura.xml
```
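The `gcovr` call above aggregates the `.gcda`/`.gcno` counters produced by the coverage-instrumented debug build and, via `--cobertura --output`, writes the Cobertura XML consumed by the Code Coverage Summary step further down. `--exclude-directories` takes a regular expression, which is why the patterns end in `$`; they keep the test binaries' own object directories out of the report. A minimal local equivalent, with the path pattern assumed:

```sh
# Run from the directory tree holding the instrumented objects; gcovr
# discovers .gcda/.gcno pairs and emits a Cobertura-format XML report.
gcovr --exclude-directories 'tests/.*-tests\.dir$' \
      --cobertura --output coverage.cobertura.xml
```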
```yaml
      - name: upload-test-report
        if: always()
        uses: EnricoMi/publish-unit-test-result-action/[email protected]
        with:
          check_name: "Ubuntu 20.04 Debug Build Unit Test Results Check"
          comment_title: "Ubuntu 20.04 Debug Build Unit Test Results"
          files: ./odbc_unit_test_result.xml

      - name: upload-unit-test-file
        if: always() && (steps.rununittests.outcome == 'failure')
        uses: actions/upload-artifact@v3
        with:
          name: odbc-unit-test-results-linux-debug
          path: |
            ./odbc_unit_test_result.xml
            ./cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver/build/odbc/logs/timestream_odbc_*.log
            ./aws_sdk_*.log

      # TODO: re-enable integration tests and code coverage from integration tests
      # - name: run-odbc-tests
      #   id: runodbctests
      #   run: |
      #     ./cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver/build/odbc/bin/timestream-odbc-integration-tests --catch_system_errors=false
      #     gcovr --exclude-directories=cmake-build64/tests/integration-test/CMakeFiles/timestream-odbc-integration-tests.dir$ --exclude-directories=cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver/cmake-build64/tests/unit-test/CMakeFiles/timestream-odbc-unit-tests.dir$ --cobertura --output coverage.cobertura.xml

      - name: Code Coverage Summary Report
        uses: irongut/[email protected]
        with:
          filename: coverage.cobertura.xml
          badge: true
          format: markdown
          indicators: true
          output: both

      - name: Add Header for Code Coverage Summary Report
        run: |
          echo "## Ubuntu 20.04 Debug Build Test Results Check" > coverage-cobertura.md
          cat code-coverage-results.md >> coverage-cobertura.md
```
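Two settings tie the coverage steps together: `output: both` makes irongut/CodeCoverageSummary write the summary to the job log and to `code-coverage-results.md`, which is the file the header step concatenates into `coverage-cobertura.md`; and in the next step, marocchino/sticky-pull-request-comment keys the PR comment on `header: linux` with `recreate: true`, so re-runs replace the single coverage comment rather than stacking new ones.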
```yaml
      - name: Add Coverage PR Comment
        uses: marocchino/sticky-pull-request-comment@v2
        if: github.event_name == 'pull_request'
        with:
          header: linux
          recreate: true
          path: coverage-cobertura.md

      - name: upload-odbc-test-report
        if: always()
        uses: EnricoMi/publish-unit-test-result-action/[email protected]
        with:
          check_name: "Ubuntu 20.04 Debug Build Integration Test Results Check"
          comment_title: "Ubuntu 20.04 Debug Build Integration Test Results"
          files: ./odbc_test_result.xml

      - name: upload-odbc-test-file
        if: always() && (steps.runodbctests.outcome == 'failure')
        uses: actions/upload-artifact@v3
        with:
          name: odbc-test-results-linux-debug
          path: |
            ./odbc_test_result.xml
            ./cpp/src/arrow/flight/sql/amazon-timestream-odbc-driver/build/odbc/logs/timestream_odbc_*.log
            ./aws_sdk_*.log

      - name: upload-coverage
        if: always()
        uses: actions/upload-artifact@v3
        with:
          name: code-coverage-Linux
          path: cmake-build64/ccov/all-merged
```
Review comment on linux-build.yml: "linux-build.yml?"

Reply: Addressed in #7 (comment)