From 8e5b7c510b528ecc0a0b0bd4c33b0ed86432e25c Mon Sep 17 00:00:00 2001
From: Trent Hauck
Date: Tue, 14 May 2024 08:19:46 -0700
Subject: [PATCH] docs: fix various sphinx warnings

---
 docs/README.md                                     | 10 +++---
 docs/source/contributor-guide/debugging.md         | 32 +++++++++----------
 docs/source/index.rst                              |  4 +--
 docs/source/user-guide/installation.md             | 10 +++---
 .../compatibility-template.md                      |  0
 .../configs-template.md                            |  0
 .../scala/org/apache/comet/GenerateDocs.scala      |  4 +--
 7 files changed, 30 insertions(+), 30 deletions(-)
 rename docs/{source/user-guide => templates}/compatibility-template.md (100%)
 rename docs/{source/user-guide => templates}/configs-template.md (100%)

diff --git a/docs/README.md b/docs/README.md
index ca33315ee..031037b30 100644
--- a/docs/README.md
+++ b/docs/README.md
@@ -20,7 +20,7 @@
 # Apache DataFusion Comet Documentation

 This folder contains the source content for the Apache DataFusion Comet documentation site. This content is published
-to https://datafusion.apache.org/comet when any changes are merged into the main branch.
+to <https://datafusion.apache.org/comet> when any changes are merged into the main branch.

 ## Dependencies

@@ -54,15 +54,15 @@ automatically updated.

 ## Release Process

-This documentation is hosted at https://datafusion.apache.org/comet/
+This documentation is hosted at <https://datafusion.apache.org/comet/>

 When the PR is merged to the `main` branch of the `datafusion-comet`
-repository, a [github workflow](https://github.com/apache/datafusion-comet/blob/main/.github/workflows/docs.yaml) which:
+repository, a [GitHub workflow](https://github.com/apache/datafusion-comet/blob/main/.github/workflows/docs.yaml) which:

 1. Builds the html content
 2. Pushes the html content to the [`asf-site`](https://github.com/apache/datafusion-comet/tree/asf-site) branch
    in this repository.

-The Apache Software Foundation provides https://datafusion.apache.org/,
+The Apache Software Foundation provides <https://datafusion.apache.org/>,
 which serves content based on the configuration in
 [.asf.yaml](https://github.com/apache/datafusion-comet/blob/main/.asf.yaml),
-which specifies the target as https://datafusion.apache.org/comet/.
+which specifies the target as <https://datafusion.apache.org/comet/>.

diff --git a/docs/source/contributor-guide/debugging.md b/docs/source/contributor-guide/debugging.md
index 38c396c15..d1f62a5db 100644
--- a/docs/source/contributor-guide/debugging.md
+++ b/docs/source/contributor-guide/debugging.md
@@ -21,9 +21,9 @@ under the License.

 This HOWTO describes how to debug JVM code and Native code concurrently. The guide assumes you have:

-1. Intellij as the Java IDE
+1. IntelliJ as the Java IDE
 2. CLion as the Native IDE. For Rust code, the CLion Rust language plugin is required. Note that the
-   Intellij Rust plugin is not sufficient.
+   IntelliJ Rust plugin is not sufficient.
 3. CLion/LLDB as the native debugger. CLion ships with a bundled LLDB and the Rust community has its own
    packaging of LLDB (`lldb-rust`). Both provide a better display of Rust symbols than plain LLDB or the
    LLDB that is bundled with XCode. We will use the LLDB packaged with CLion for this guide.
@@ -36,7 +36,7 @@ _Caveat: The steps here have only been tested with JDK 11_ on Mac (M1)
 Add a `.lldbinit` to comet/core. This is not strictly necessary but will be useful if you want to
 use advanced `lldb` debugging.

-### In Intellij
+### In IntelliJ

 1. Set a breakpoint in `NativeBase.load()`, at a point _after_ the Comet library has been loaded.

@@ -48,7 +48,7 @@
 1. Add a println to the unit test to print the PID of the JVM process.
    (jps can also be used but this is less error prone if you have multiple jvm processes running)

-   ```JDK8
+   ```scala
    println("Waiting for Debugger: PID - ", ManagementFactory.getRuntimeMXBean().getName())
    ```

@@ -56,31 +56,31 @@ use advanced `lldb` debugging.

   For JDK9 and newer

-   ```JDK9
+   ```scala
   println("Waiting for Debugger: PID - ", ProcessHandle.current.pid)
   ```

   ==> Note the PID

-1. Debug-run the test in Intellij and wait for the breakpoint to be hit
+1. Debug-run the test in IntelliJ and wait for the breakpoint to be hit
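[Editor's note: the two `println` snippets above print the PID but do not pause the test. A minimal sketch of combining the JDK 9+ variant with a blocking read, so the test waits while you attach the native debugger; the `waitForDebugger` helper is our own illustration, not part of the Comet sources.]

```scala
import scala.io.StdIn

// Hypothetical helper (assumes JDK 9+): print the PID of the current JVM,
// then block on stdin so there is time to attach CLion/LLDB before the
// native code under test runs.
def waitForDebugger(): Unit = {
  println(s"Waiting for Debugger: PID - ${ProcessHandle.current.pid}")
  StdIn.readLine("Attach the native debugger, then press Enter to continue...")
}
```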
 ### In CLion

-1. After the breakpoint is hit in Intellij, in Clion (or LLDB from terminal or editor) -
+1. After the breakpoint is hit in IntelliJ, in Clion (or LLDB from terminal or editor) -

 1. Attach to the jvm process (make sure the PID matches). In CLion, this is `Run -> Atttach to process`

 1. Put your breakpoint in the native code

-1. Go back to intellij and resume the process.
+1. Go back to IntelliJ and resume the process.

-1. Most debugging in CLion is similar to Intellij. For advanced LLDB based debugging the LLDB command line can be accessed from the LLDB tab in the Debugger view. Refer to the [LLDB manual](https://lldb.llvm.org/use/tutorial.html) for LLDB commands.
+1. Most debugging in CLion is similar to IntelliJ. For advanced LLDB based debugging the LLDB command line can be accessed from the LLDB tab in the Debugger view. Refer to the [LLDB manual](https://lldb.llvm.org/use/tutorial.html) for LLDB commands.

-### After your debugging is done,
+### After your debugging is done

 1. In CLion, detach from the process if not already detached

-2. In Intellij, the debugger might have lost track of the process. If so, the debugger tab
+2. In IntelliJ, the debugger might have lost track of the process. If so, the debugger tab
    will show the process as running (even if the test/job is shown as completed).

 3. Close the debugger tab, and if the IDS asks whether it should terminate the process,

@@ -94,10 +94,10 @@ use advanced `lldb` debugging.

 ### Additional Info

 OpenJDK mailing list on debugging the JDK on MacOS
-https://mail.openjdk.org/pipermail/hotspot-dev/2019-September/039429.html
+<https://mail.openjdk.org/pipermail/hotspot-dev/2019-September/039429.html>

 Detecting the debugger
-https://stackoverflow.com/questions/5393403/can-a-java-application-detect-that-a-debugger-is-attached#:~:text=No.,to%20let%20your%20app%20continue.&text=I%20know%20that%20those%20are,meant%20with%20my%20first%20phrase).
+<https://stackoverflow.com/questions/5393403/can-a-java-application-detect-that-a-debugger-is-attached#:~:text=No.,to%20let%20your%20app%20continue.&text=I%20know%20that%20those%20are,meant%20with%20my%20first%20phrase>).
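[Editor's note: the Stack Overflow thread linked above amounts to scanning the JVM's input arguments for the JDWP agent. A minimal sketch of that heuristic, our own illustration rather than part of the Comet sources; note it only detects a Java debugger, not a natively attached LLDB.]

```scala
import java.lang.management.ManagementFactory

// A JVM launched under a Java debugger carries -agentlib:jdwp among its
// input arguments; an LLDB attached at the native level is invisible here.
val debuggerAttached: Boolean =
  ManagementFactory.getRuntimeMXBean.getInputArguments
    .stream()
    .anyMatch(_.contains("-agentlib:jdwp"))
```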
 ## Verbose debug

@@ -117,10 +117,10 @@
 This was likely caused by a bug in DataFusion's code and we would welcome that you file an bug report in our issue tracker
 ```

 There is a verbose exception option by leveraging DataFusion [backtraces](https://arrow.apache.org/datafusion/user-guide/example-usage.html#enable-backtraces)
-This option allows to append native DataFusion stacktrace to the original error message.
+This option allows to append native DataFusion stack trace to the original error message.
 To enable this option with Comet it is needed to include `backtrace` feature in [Cargo.toml](https://github.com/apache/arrow-datafusion-comet/blob/main/core/Cargo.toml) for DataFusion dependencies

-```
+```toml
 datafusion-common = { version = "36.0.0", features = ["backtrace"] }
 datafusion = { default-features = false, version = "36.0.0", features = ["unicode_expressions", "backtrace"] }
 ```

 Then build the Comet as [described](https://github.com/apache/arrow-datafusion-comet/blob/main/README.md#getting-started)

 Start Comet with `RUST_BACKTRACE=1`

-```commandline
+```console
 RUST_BACKTRACE=1 $SPARK_HOME/spark-shell --jars spark/target/comet-spark-spark3.4_2.12-0.1.0-SNAPSHOT.jar --conf spark.sql.extensions=org.apache.comet.CometSparkSessionExtensions --conf spark.comet.enabled=true --conf spark.comet.exec.enabled=true --conf spark.comet.exec.all.enabled=true
 ```

diff --git a/docs/source/index.rst b/docs/source/index.rst
index 5759fcf40..0db282ea8 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -35,7 +35,7 @@ Apache DataFusion Comet
 Apache DataFusion Comet is an Apache Spark plugin that uses Apache DataFusion
 as a native runtime to achieve improvement in terms of query efficiency and query runtime.

-.. _toc.links:
+.. _toc.user-guide-links:
 .. toctree::
    :maxdepth: 1
    :caption: User Guide

@@ -48,7 +48,7 @@ as a native runtime to achieve improvement in terms of query efficiency and quer
    Configuration Settings
    Compatibility Guide

-.. _toc.links:
+.. _toc.contributor-guide-links:
 .. toctree::
    :maxdepth: 1
    :caption: Contributor Guide

diff --git a/docs/source/user-guide/installation.md b/docs/source/user-guide/installation.md
index e9149019e..efc58f1e4 100644
--- a/docs/source/user-guide/installation.md
+++ b/docs/source/user-guide/installation.md
@@ -40,20 +40,20 @@ There are no public releases available yet, so it is necessary to build from source

 Clone the repository:

-```commandline
+```console
 git clone https://github.com/apache/datafusion-comet.git
 ```

 Build Comet for a specific Spark version:

-```commandline
+```console
 cd datafusion-comet
 make release PROFILES="-Pspark-3.4"
 ```

 Note that the project builds for Scala 2.12 by default but can be built for Scala 2.13 using an additional profile:

-```commandline
+```console
 make release PROFILES="-Pspark-3.4 -Pscala-2.13"
 ```

@@ -61,7 +61,7 @@ make release PROFILES="-Pspark-3.4 -Pscala-2.13"

 Make sure `SPARK_HOME` points to the same Spark version as Comet was built for.

-```commandline
+```console
 $SPARK_HOME/bin/spark-shell \
     --jars spark/target/comet-spark-spark3.4_2.12-0.1.0-SNAPSHOT.jar \
     --conf spark.sql.extensions=org.apache.comet.CometSparkSessionExtensions \
     --conf spark.comet.enabled=true \
     --conf spark.comet.exec.enabled=true \
     --conf spark.comet.exec.all.enabled=true
 ```

@@ -128,7 +128,7 @@ components which will then fail at runtime.
 For example:
     --driver-class-path spark/target/comet-spark-spark3.4_2.12-0.1.0-SNAPSHOT.jar
 ```

-Some cluster managers may require additional configuration, see https://spark.apache.org/docs/latest/cluster-overview.html
+Some cluster managers may require additional configuration, see <https://spark.apache.org/docs/latest/cluster-overview.html>

 To enable columnar shuffle which supports all partitioning and basic complex types, one more config is required:
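[Editor's note: as a quick smoke test that the flags above actually enabled Comet — our own suggestion, not from the installation guide, and the operator naming is an assumption about Comet's plan output — inspect a physical plan in `spark-shell`.]

```scala
// Write and re-read a small Parquet file, then check the physical plan.
spark.range(10).write.mode("overwrite").parquet("/tmp/comet-smoke-test")
val df = spark.read.parquet("/tmp/comet-smoke-test")
df.explain()
// With Comet active, the plan should mention Comet operators (e.g. a Comet
// scan) rather than only the stock Spark FileScan.
```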
diff --git a/docs/source/user-guide/compatibility-template.md b/docs/templates/compatibility-template.md
similarity index 100%
rename from docs/source/user-guide/compatibility-template.md
rename to docs/templates/compatibility-template.md
diff --git a/docs/source/user-guide/configs-template.md b/docs/templates/configs-template.md
similarity index 100%
rename from docs/source/user-guide/configs-template.md
rename to docs/templates/configs-template.md
diff --git a/spark/src/main/scala/org/apache/comet/GenerateDocs.scala b/spark/src/main/scala/org/apache/comet/GenerateDocs.scala
index 1e28efd52..a2d5e2515 100644
--- a/spark/src/main/scala/org/apache/comet/GenerateDocs.scala
+++ b/spark/src/main/scala/org/apache/comet/GenerateDocs.scala
@@ -40,7 +40,7 @@ object GenerateDocs {
   }

   private def generateConfigReference(): Unit = {
-    val templateFilename = "docs/source/user-guide/configs-template.md"
+    val templateFilename = "docs/templates/configs-template.md"
     val outputFilename = "docs/source/user-guide/configs.md"
     val w = new BufferedOutputStream(new FileOutputStream(outputFilename))
     for (line <- Source.fromFile(templateFilename).getLines()) {
@@ -60,7 +60,7 @@ object GenerateDocs {
   }

   private def generateCompatibilityGuide(): Unit = {
-    val templateFilename = "docs/source/user-guide/compatibility-template.md"
+    val templateFilename = "docs/templates/compatibility-template.md"
     val outputFilename = "docs/source/user-guide/compatibility.md"
     val w = new BufferedOutputStream(new FileOutputStream(outputFilename))
     for (line <- Source.fromFile(templateFilename).getLines()) {
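[Editor's note: for context on the path changes above, `GenerateDocs` streams each template line by line and writes the expanded lines into the user guide. A stripped-down sketch of that shape — our simplification of the pattern visible in the hunks, under the assumption that the real generator's only extra behaviour is expanding placeholder lines into config and compatibility tables.]

```scala
import java.io.{BufferedOutputStream, FileOutputStream}
import scala.io.Source

// Read the template (now under docs/templates) line by line and write each
// expanded line into the generated guide under docs/source/user-guide.
def render(templateFilename: String, outputFilename: String)(
    expand: String => Seq[String]): Unit = {
  val w = new BufferedOutputStream(new FileOutputStream(outputFilename))
  try {
    for (line <- Source.fromFile(templateFilename).getLines()) {
      expand(line).foreach(out => w.write((out + "\n").getBytes))
    }
  } finally {
    w.close()
  }
}

// Identity expansion just copies the template through unchanged.
render("docs/templates/configs-template.md", "docs/source/user-guide/configs.md")(Seq(_))
```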