```
├── common <- common Java/Scala code
├── conf   <- configuration files
├── core   <- core native code, in Rust
├── spark  <- Spark integration
```
- Make sure `JAVA_HOME` is set and points to a JDK 8/11/17 installation.
- Install the Rust toolchain. The easiest way is to use rustup.
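The two prerequisites above can be sanity-checked with a small shell sketch. The checks and messages are illustrative; the only hard requirements from this guide are a JDK 8/11/17 installation and a Rust toolchain:

```shell
# Sanity-check the build prerequisites (illustrative sketch).
java_status="missing"
# JAVA_HOME should point at a JDK install whose bin/java is executable.
if [ -n "$JAVA_HOME" ] && [ -x "$JAVA_HOME/bin/java" ]; then
  java_status="ok"
fi
echo "JAVA_HOME check: $java_status"

rust_status="missing"
# rustup installs rustc and cargo; finding rustc indicates a working toolchain.
if command -v rustc >/dev/null 2>&1; then
  rust_status="ok"
fi
echo "Rust toolchain check: $rust_status"
```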
A few common commands are specified in the project's `Makefile`:

- `make`: compile the entire project, but don't run tests
- `make test-rust`: compile the project and run tests on the Rust side
- `make test-java`: compile the project and run tests on the Java side
- `make test`: compile the project and run tests on both the Rust and Java sides
- `make release`: compile the project and create a release build. This is useful when you want to test a local Comet installation in another project such as Spark.
- `make clean`: clean up the workspace
- `bin/comet-spark-shell -d . -o spark/target/`: run the Comet spark shell for V1 datasources
- `bin/comet-spark-shell -d . -o spark/target/ --conf spark.sql.sources.useV1SourceList=""`: run the Comet spark shell for V2 datasources
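The Makefile targets above compose into a typical edit-build-test loop. In the sketch below, a stub `run` helper only echoes each command so the sketch stays self-contained; drop the stub to execute the targets for real:

```shell
# Typical Comet dev loop using the Makefile targets listed above.
# `run` is a stub that echoes instead of executing, so this sketch
# can be run anywhere without a checkout of the repo.
run() { echo "+ $*"; }

run make            # compile everything, skip tests
run make test-rust  # iterate on the native (Rust) side
run make test-java  # iterate on the JVM side
run make test       # full test pass across both sides
run make release    # release build for testing a local install in Spark
```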
Comet is a multi-language project with native code written in Rust and JVM code written in Java and Scala. For Rust code, the CLion IDE is recommended. For JVM code, IntelliJ IDEA is recommended.
Before opening the project in an IDE, make sure to run `make` first to generate the necessary files for the IDEs. Currently, this mostly means generating protobuf message classes for the JVM side. It's only required to run `make` once after cloning the repo.
First make sure to install the Scala plugin in IntelliJ IDEA. After that, you can open the project in IntelliJ IDEA. The IDE should automatically detect the project structure and import it as a Maven project.
First make sure to install the Rust plugin in CLion, or use the dedicated Rust IDE, RustRover. After that, you can open the project in CLion. The IDE should automatically detect the project structure and import it as a Cargo project.
Like other Maven projects, you can run tests in IntelliJ IDEA by right-clicking on the test class or test method and selecting "Run" or "Debug".
However, if a test touches the native side, make sure to run `make core` or `cd core && cargo build` before running it in IDEA.
There's a `make` command to run micro benchmarks in the repo. For instance:

```
make benchmark-org.apache.spark.sql.benchmark.CometReadBenchmark
```
To run TPC-H or TPC-DS micro benchmarks, please follow the instructions in the respective source code, e.g., `CometTPCHQueryBenchmark`.
Comet is a multi-language project with native code written in Rust and JVM code written in Java and Scala. It is possible to debug both native and JVM code concurrently, as described in the DEBUGGING guide.
Comet uses `cargo fmt`, Scalafix, and Spotless to automatically format the code. Before submitting a pull request, you can simply run `make format` to format the code.
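As a usage example, a pre-pull-request pass can be sketched as below. The `run` helper only echoes commands so the sketch is self-contained; pairing `make format` with `make test` before pushing is a suggested workflow, not a project requirement:

```shell
# Sketch of a pre-pull-request pass: format, then run the full test suite.
# `run` only echoes the commands so this sketch is self-contained.
run() { echo "+ $*"; }

run make format  # cargo fmt + Scalafix + Spotless across the codebase
run make test    # Rust- and JVM-side tests
```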