doc: Clean up supported JDKs in README #366
Conversation
Update docs until JDK 8 support confirmed
README.md (outdated)

```diff
@@ -63,7 +63,7 @@ Linux, Apple OSX (Intel and M1)
 ## Requirements

 - Apache Spark 3.2, 3.3, or 3.4
-- JDK 8 and up
+- JDK 11 and up (JDK 8 should be supported, but development is on JDK 11)
```
I think we should say that we only support JDK 8, 11, and 17, rather than saying "JDK 11 and up". We do not support JDK 21, for example (neither does Spark 3.x).
I think we should recommend JDK 11 for now because Spark 3.2 does not support JDK 17.
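As a sanity check for the versions discussed above (JDK 8, 11, and 17 supported; JDK 21 not), a small POSIX-sh helper can map a `java -version` output line to its major version. This is a sketch, not project tooling; the `check_jdk` function is hypothetical, and the sample lines are hardcoded so the check runs without any JDK installed:

```shell
# Sketch: extract the major version from a `java -version` style line and
# compare it against the JDKs discussed in this thread (8, 11, 17).
check_jdk() {
  line="$1"
  # JDK 8 reports itself as "1.8.0_x"; later JDKs report the major directly.
  pair=$(printf '%s\n' "$line" | sed -E 's/.*"([0-9]+)\.([0-9]+).*/\1 \2/')
  set -- $pair
  if [ "$1" = "1" ]; then major="$2"; else major="$1"; fi
  case "$major" in
    8|11|17) echo "JDK $major: supported" ;;
    *)       echo "JDK $major: unsupported" ;;
  esac
}

check_jdk 'openjdk version "1.8.0_392"'
check_jdk 'openjdk version "11.0.21" 2023-10-17'
check_jdk 'openjdk version "21.0.1" 2023-10-17'
```

In practice one would feed it the real output, e.g. `check_jdk "$(java -version 2>&1 | head -n 1)"`, but that obviously requires a JDK on the PATH.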
I think the safe option is to use the JDKs supported by the Spark version you are working on. It is good to have this stated clearly here to reduce confusion.
Thanks @edmondop
Which issue does this PR close?
Closes #.
Rationale for this change
What changes are included in this PR?
How are these changes tested?