build: Separate and speedup TPC-DS benchmark #130
@@ -0,0 +1,158 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
name: Run TPC-DS Benchmark

concurrency:
  group: ${{ github.repository }}-${{ github.head_ref || github.sha }}-${{ github.workflow }}
  cancel-in-progress: true
on:
  push:
    paths-ignore:
      - "doc/**"
      - "**.md"
  pull_request:
    paths-ignore:
      - "doc/**"
      - "**.md"
  # manual trigger
  # https://docs.github.com/en/actions/managing-workflow-runs/manually-running-a-workflow
  workflow_dispatch:
env:
  RUST_VERSION: nightly
jobs:
  prepare:
    name: Build native lib and prepare TPC-DS data
    runs-on: ubuntu-latest
    container:
      image: amd64/rust
    env:
      JAVA_VERSION: 11
    steps:
      - uses: actions/checkout@v4
      - name: Setup Rust & Java toolchain
        uses: ./.github/actions/setup-builder
        with:
          rust-version: ${{env.RUST_VERSION}}
          jdk-version: 11
      - name: Cache Maven dependencies
Review comment: reuse Maven cache.
        uses: actions/cache@v4
        with:
          path: |
            ~/.m2/repository
            /root/.m2/repository
          key: ${{ runner.os }}-java-maven-${{ hashFiles('**/pom.xml') }}
Review comment: Does the Java version affect this Maven cache? We have the OS in the key, but it seems the Java version isn't there.
Reply: I think it doesn't matter that much, as most jars are Java-version agnostic? And this cache will be shared with the PR build's Maven dependencies. Another thing to add: GitHub's cache has a quota limit of 10 GB, and the cached Maven dependencies are already quite big (~1 GB). If we added the Java version to the key, it could create too many Maven cache entries.
          restore-keys: |
            ${{ runner.os }}-java-maven-
Review comment: I wonder why we have a […truncated…]
Reply: The key is […truncated…]
Review comment: I wonder why we use this prefix restore key? If […truncated…]
Reply: For Maven dependencies, even if […truncated…]
Reply: Okay.
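(For context on the cache-key discussion above: `restore-keys` is a prefix fallback, so when the exact `key` misses, the most recently saved cache whose key starts with that prefix is restored and the result is re-saved under the new key. If the Java version ever did need to be part of the key, a minimal sketch could look like the following; the key layout is illustrative only and is not what this PR uses.)

```yaml
      - name: Cache Maven dependencies
        uses: actions/cache@v4
        with:
          path: |
            ~/.m2/repository
            /root/.m2/repository
          # Illustrative only: folding the Java version into the key keeps caches for
          # different JDKs separate, at the cost of more entries against the 10 GB quota.
          key: ${{ runner.os }}-java${{ env.JAVA_VERSION }}-maven-${{ hashFiles('**/pom.xml') }}
          restore-keys: |
            ${{ runner.os }}-java${{ env.JAVA_VERSION }}-maven-
```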
      - name: Cache TPC-DS generated data
        id: cache-tpcds-sf-1
        uses: actions/cache@v4
        with:
          path: ./tpcds-sf-1
          key: tpcds-${{ hashFiles('.github/workflows/benchmark.yml') }}
      - name: Checkout tpcds-kit repository
        if: steps.cache-tpcds-sf-1.outputs.cache-hit != 'true'
        uses: actions/checkout@v4
        with:
          repository: databricks/tpcds-kit
          path: ./tpcds-kit
      - name: Build Comet
        run: make release
Review comment: It's quite slow to build under the release profile. We may want to find a way to speed up the Rust build, but it's sufficient for now; we can do that later.
Reply: I think it is mostly because we use […truncated…]
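(One possible follow-up, sketched here rather than implemented in this PR, is to cache the cargo registry and the `core/target` build output between runs so the release build does not start from scratch each time. The `Cargo.lock` hash in the key is an assumption about the repository layout; `core/target` matches the path the upload step below already uses.)

```yaml
      - name: Cache Rust build artifacts
        uses: actions/cache@v4
        with:
          path: |
            ~/.cargo/registry
            ~/.cargo/git
            core/target
          key: ${{ runner.os }}-cargo-release-${{ hashFiles('**/Cargo.lock') }}
          restore-keys: |
            ${{ runner.os }}-cargo-release-
```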
      - name: Upload Comet native lib
        uses: actions/upload-artifact@v4
        with:
          name: libcomet-${{ github.run_id }}
          path: |
            core/target/release/libcomet.so
            core/target/release/libcomet.dylib
          retention-days: 1 # remove the artifact after 1 day, only valid for this workflow
          overwrite: true
      - name: Build tpcds-kit
        if: steps.cache-tpcds-sf-1.outputs.cache-hit != 'true'
        run: |
          apt-get install -y yacc bison flex
          cd tpcds-kit/tools && make OS=LINUX
      - name: Generate TPC-DS (SF=1) table data
        if: steps.cache-tpcds-sf-1.outputs.cache-hit != 'true'
        run: |
          cd spark && MAVEN_OPTS='-Xmx20g' ../mvnw exec:java -Dexec.mainClass="org.apache.spark.sql.GenTPCDSData" -Dexec.classpathScope="test" -Dexec.cleanupDaemonThreads="false" -Dexec.args="--dsdgenDir `pwd`/../tpcds-kit/tools --location `pwd`/../tpcds-sf-1 --scaleFactor 1 --numPartitions 1"
          cd ..
  benchmark:
    name: Run TPC-DS benchmark
    runs-on: ubuntu-latest
    needs: [prepare]
    container:
      image: amd64/rust
    strategy:
      matrix:
        join: [sort_merge, broadcast, hash]
    steps:
      - uses: actions/checkout@v4
      - name: Setup Rust & Java toolchain
        uses: ./.github/actions/setup-builder
        with:
          rust-version: ${{env.RUST_VERSION}}
          jdk-version: 11
      - name: Cache Maven dependencies
        uses: actions/cache@v4
        with:
          path: |
            ~/.m2/repository
            /root/.m2/repository
          key: ${{ runner.os }}-java-maven-${{ hashFiles('**/pom.xml') }}
          restore-keys: |
            ${{ runner.os }}-java-maven-
      - name: Restore TPC-DS generated data
        id: cache-tpcds-sf-1
        uses: actions/cache@v4
Review comment (on lines +124 to +126): Can we just restore it using the restore action?
Reply: Could you point me to the link for the restore action? I searched GitHub but didn't find one.
Reply: […truncated…]
Reply: I see, thanks. Will fix it.
        with:
          path: ./tpcds-sf-1
          key: tpcds-${{ hashFiles('.github/workflows/benchmark.yml') }}
          fail-on-cache-miss: true # always cached: the prepare job generates it if it does not exist
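(The restore action referred to in the thread above is presumably `actions/cache/restore`, the restore-only companion to `actions/cache` that skips the save step at the end of the job. A minimal sketch of this step using it, with the same path and key, could be:)

```yaml
      - name: Restore TPC-DS generated data
        id: cache-tpcds-sf-1
        uses: actions/cache/restore@v4
        with:
          path: ./tpcds-sf-1
          key: tpcds-${{ hashFiles('.github/workflows/benchmark.yml') }}
          fail-on-cache-miss: true
```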
      - name: Download Comet native lib
        uses: actions/download-artifact@v4
        with:
          name: libcomet-${{ github.run_id }}
          path: core/target/release
      - name: Run TPC-DS queries (Sort merge join)
        if: matrix.join == 'sort_merge'
        run: |
          SPARK_HOME=`pwd` SPARK_TPCDS_DATA=`pwd`/tpcds-sf-1 ./mvnw -B -Prelease -Dsuites=org.apache.spark.sql.CometTPCDSQuerySuite test
Review comment: […truncated…] And I noticed that the Java tests are also executed, but they run quickly.
        env:
          SPARK_TPCDS_JOIN_CONF: |
            spark.sql.autoBroadcastJoinThreshold=-1
            spark.sql.join.preferSortMergeJoin=true
      - name: Run TPC-DS queries (Broadcast hash join)
        if: matrix.join == 'broadcast'
        run: |
          SPARK_HOME=`pwd` SPARK_TPCDS_DATA=`pwd`/tpcds-sf-1 ./mvnw -B -Prelease -Dsuites=org.apache.spark.sql.CometTPCDSQuerySuite test
        env:
          SPARK_TPCDS_JOIN_CONF: |
            spark.sql.autoBroadcastJoinThreshold=10485760
      - name: Run TPC-DS queries (Shuffled hash join)
        if: matrix.join == 'hash'
        run: |
          SPARK_HOME=`pwd` SPARK_TPCDS_DATA=`pwd`/tpcds-sf-1 ./mvnw -B -Prelease -Dsuites=org.apache.spark.sql.CometTPCDSQuerySuite test
        env:
          SPARK_TPCDS_JOIN_CONF: |
            spark.sql.autoBroadcastJoinThreshold=-1
            spark.sql.join.forceApplyShuffledHashJoin=true
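(As an aside, the three steps above differ only in their `SPARK_TPCDS_JOIN_CONF` value. A hedged sketch of collapsing them by carrying the configuration in the matrix is shown below; the `include`/`conf` keys are illustrative and not part of this PR.)

```yaml
    strategy:
      matrix:
        join: [sort_merge, broadcast, hash]
        include:
          - join: sort_merge
            conf: |
              spark.sql.autoBroadcastJoinThreshold=-1
              spark.sql.join.preferSortMergeJoin=true
          - join: broadcast
            conf: |
              spark.sql.autoBroadcastJoinThreshold=10485760
          - join: hash
            conf: |
              spark.sql.autoBroadcastJoinThreshold=-1
              spark.sql.join.forceApplyShuffledHashJoin=true
    steps:
      # ... checkout, toolchain, cache, and download steps as above ...
      - name: Run TPC-DS queries (${{ matrix.join }})
        run: |
          SPARK_HOME=`pwd` SPARK_TPCDS_DATA=`pwd`/tpcds-sf-1 ./mvnw -B -Prelease -Dsuites=org.apache.spark.sql.CometTPCDSQuerySuite test
        env:
          SPARK_TPCDS_JOIN_CONF: ${{ matrix.conf }}
```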
Review comment: This is not actually a benchmark, but a correctness check (against Spark results).
Reply: Hmm, let me submit a follow-up PR to fix it then.
Reply: I thought about this too when reviewing this PR, but decided it is fine, since we could re-purpose this workflow in the future for both benchmarking and correctness checks. It's fine with me whether we decide to change it or not.
Reply: Okay.