doc: Clean up supported JDKs in README #366

Merged (4 commits) on May 1, 2024. Changes shown are from 2 commits.
File filter

Filter by extension

Filter by extension

Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
README.md: 2 changes (1 addition & 1 deletion)

@@ -63,7 +63,7 @@ Linux, Apple OSX (Intel and M1)
## Requirements

- Apache Spark 3.2, 3.3, or 3.4
-- JDK 8 and up
+- JDK 11 and up (JDK 8 should be supported, but development is on JDK 11)
Member:
I think we should say that we only support JDK 8, 11, and 17 rather than saying "JDK 11 and up". We do not support JDK 21, for example (neither does Spark 3.x).

I think we should recommend 11 for now because Spark 3.2 does not support JDK 17.
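
For illustration only, a minimal, hypothetical check of the kind a launcher or test could run to enforce the documented set; `JdkCheck` is not code from this project. It reads `java.specification.version`, which reports `1.8` on JDK 8 and the plain major version (`11`, `17`, ...) on later releases:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

// Hypothetical helper, not part of this project: fail fast when the
// running JDK is outside the documented set (8, 11, 17).
public class JdkCheck {
    public static void main(String[] args) {
        // "1.8" on JDK 8; "11", "17", ... on later releases.
        String spec = System.getProperty("java.specification.version");
        Set<String> supported = new HashSet<>(Arrays.asList("1.8", "11", "17"));
        if (!supported.contains(spec)) {
            System.err.println("Unsupported JDK " + spec
                    + "; use 8, 11, or 17 (11 recommended for Spark 3.2).");
            System.exit(1);
        }
        System.out.println("JDK " + spec + " is supported.");
    }
}
```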

Member:

I think the safest option is to use the JDKs supported by the Spark version you are working with. It is good to have this stated clearly here to reduce confusion.
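
For reference, a rough support matrix implied by this thread (worth verifying against the Spark documentation for each release):

| Spark version | Supported JDKs |
| --- | --- |
| 3.2 | 8, 11 |
| 3.3 | 8, 11, 17 |
| 3.4 | 8, 11, 17 |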

- GLIBC 2.17 (Centos 7) and up

## Getting started