[SPARK-44112][BUILD][INFRA][DOCS] Drop support for Java 8 and Java 11 #43005
Conversation
Thank you, @LuciferYang.
@dongjoon-hyun I'd like to discuss something with you: https://github.com/LuciferYang/spark/actions/runs/6243680485/job/16949769483 tested Java 17 + Scala 2.12.18, and a lot of tests failed after the change. I can reproduce these errors locally, but when we switch the Scala version to 2.13 and test with Java 17 again, all tests pass. If that is the case, can we drop Scala 2.12 support first and then upgrade the Java version? That seems a bit easier.
+1 for switching to Scala 2.13 first. Please lead the activity. I trust your domain expertise. :)
OK ~
Wait for #43008.
#43008 is merged. Shall we resume this?
I'll update this PR later.
```diff
@@ -27,7 +27,7 @@ license: |
 ## Apache Maven

 The Maven-based build is the build of reference for Apache Spark.
-Building Spark using Maven requires Maven 3.9.4 and Java 8/11/17.
+Building Spark using Maven requires Maven 3.9.4 and Java 17.
```
Should we add `Support for Java 8/11 was removed in Spark 4.0.0.`?
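For context, the Maven Enforcer Plugin can make such a requirement fail fast at build time instead of only documenting it. This is an illustrative configuration, not taken from Spark's actual `pom.xml`:

```xml
<!-- Illustrative maven-enforcer-plugin snippet (not from Spark's pom.xml):
     fail the build early when the JDK or Maven is older than required. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>enforce-build-environment</id>
      <goals><goal>enforce</goal></goals>
      <configuration>
        <rules>
          <requireJavaVersion>
            <version>[17,)</version>
          </requireJavaVersion>
          <requireMavenVersion>
            <version>[3.9.4,)</version>
          </requireMavenVersion>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
```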
.github/workflows/build_and_test.yml
```diff
@@ -649,11 +649,11 @@ jobs:
       if [ -f ./dev/free_disk_space_container ]; then
         ./dev/free_disk_space_container
       fi
-    - name: Install Java 8
+    - name: Install Java 17
```
I need to think about this part a bit more. I just recalled that branch-3.x will also use this yml, so we may need to add some if conditions.
```diff
@@ -649,11 +649,11 @@ jobs:
       if [ -f ./dev/free_disk_space_container ]; then
         ./dev/free_disk_space_container
       fi
-    - name: Install Java 8
+    - name: Install Java ${{ inputs.java }}
```
It seems easier to install Java this way; the daily tests of branch-3.x all run with `java: 8`.
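A parameterized install step along these lines would let master default to Java 17 while branch-3.x keeps `java: 8`. This is an illustrative sketch using `actions/setup-java`, not the exact step from `build_and_test.yml`; the input wiring is assumed:

```yaml
# Sketch only: assumes a reusable-workflow input named `java`
# (defaulting to 17 on master and 8 on branch-3.x).
- name: Install Java ${{ inputs.java }}
  uses: actions/setup-java@v3
  with:
    distribution: temurin
    java-version: ${{ inputs.java }}
```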
Work not included in this PR:
Since I can't successfully run …, I am planning to create two additional Jira tickets to track these two issues. Is that okay?
Could you review this, @dongjoon-hyun @HyukjinKwon @srowen? Thanks
```diff
@@ -30,7 +30,7 @@ RUN apt-get update && apt-get install -y \
     pkg-config \
     curl \
     wget \
-    openjdk-8-jdk \
+    openjdk-17-jdk-headless \
```
@dongjoon-hyun @HyukjinKwon should we change this?
cc @Yikun
@dongjoon-hyun So in this PR, should we revert this change and consider it a follow-up?
Hmm, the JDK install here seems unnecessary, because we use a GitHub Action to install Java. But if CI passes, it is OK.
Thanks @Yikun ~ Let me continue to monitor the GitHub Actions runs.
+1, the AS-IS PR looks good to me, @LuciferYang . We can do the follow-ups.
- The Spark release docker files
- Infra docker files
- AppVeyor
- Other documentation, including `README.md`
IIUC, the main change already passed in c0a330d.
Merged to master for Apache Spark 4.0.0. Thank you, @LuciferYang and all.
Thanks @dongjoon-hyun @HyukjinKwon @bjornjorgensen and @cfmcgrady ~
### What changes were proposed in this pull request?
This PR changes the Maven daily test to use Java 17 for testing.

### Why are the changes needed?
#43005 dropped support for Java 8 and Java 11, so the Maven daily test should also test with Java 17 by default.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
Monitor the Maven daily test GA task.

### Was this patch authored or co-authored using generative AI tooling?
No

Closes #43057 from LuciferYang/SPARK-45280.

Authored-by: yangjie01 <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
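For reference, CI scripts that gate on a JDK version often parse the `java -version` banner before building. This is a minimal sketch; the banner below is a hardcoded sample rather than output captured from a live JVM (note that pre-Java 9 banners start with `1.`, e.g. `1.8.0`, so this simple parse only suits modern JDKs):

```shell
# Extract the major version from a sample `java -version` banner line.
# The sample string is hardcoded for illustration only.
banner='openjdk version "17.0.9" 2023-10-17'
ver=${banner#*\"}      # drop everything up to and including the first quote
ver=${ver%%[.\"]*}     # keep only the leading major-version digits
echo "$ver"            # prints 17
```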
### What changes were proposed in this pull request?
This PR aims to fix `docker-image-tool.sh` to be up-to-date.

### Why are the changes needed?
Apache Spark 4 dropped Java 11 support, so we should fix the following (#43005):
```
- - Build and push Java11-based image with tag "v3.4.0" to docker.io/myrepo
+ - Build and push Java17-based image with tag "v4.0.0" to docker.io/myrepo
```
Apache Spark 4 requires a JDK instead of a JRE, so we should fix the following (#45761):
```
- $0 -r docker.io/myrepo -t v3.4.0 -b java_image_tag=11-jre build
+ $0 -r docker.io/myrepo -t v4.0.0 -b java_image_tag=17 build
```
Lastly, `3.4.0` is too old because it was released on April 13, 2023; we had better use v4.0.0:
```
- $0 -r docker.io/myrepo -t v3.4.0 -p kubernetes/dockerfiles/spark/bindings/python/Dockerfile build
+ $0 -r docker.io/myrepo -t v4.0.0 -p kubernetes/dockerfiles/spark/bindings/python/Dockerfile build
```

### Does this PR introduce _any_ user-facing change?
No functional change because this is a usage message.

### How was this patch tested?
Manual review.

### Was this patch authored or co-authored using generative AI tooling?
No.

Closes #47618 from dongjoon-hyun/SPARK-49117.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
### What changes were proposed in this pull request?
The main purpose of this PR is to remove support for Java 8 and Java 11 in Apache Spark 4.0. The specific work includes:
1. `pom.xml`: change `java.version` from 1.8 to 17, and change `target:jvm-1.8` to `target:17`
2. `SparkBuild.scala`: change `-target:jvm-${javaVersion.value}` to `-target:${javaVersion.value}`
3. Workflow files: change the default Java version from 8 to 17, ensure that branch-3.x still uses Java 8, and remove the daily jobs for Java 11 and Java 17
4. Docs: replace references to Java 8 and 11 with Java 17

### Why are the changes needed?
The minimum supported Java version for Apache Spark 4.0 is Java 17.

### Does this PR introduce _any_ user-facing change?
Yes, Apache Spark will no longer support Java 8 and Java 11.

### How was this patch tested?
- Pass GitHub Actions

### Was this patch authored or co-authored using generative AI tooling?
No

Closes apache#43005 from LuciferYang/SPARK-44112.

Authored-by: yangjie01 <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
What changes were proposed in this pull request?
The main purpose of this PR is to remove support for Java 8 and Java 11 in Apache Spark 4.0. The specific work includes:
- `pom.xml`: change `java.version` from 1.8 to 17, and change `target:jvm-1.8` to `target:17`
- `SparkBuild.scala`: change `-target:jvm-${javaVersion.value}` to `-target:${javaVersion.value}`
- `infra/Dockerfile`: change to install `openjdk-17-jdk-headless`
Why are the changes needed?
The minimum supported Java version for Apache Spark 4.0 is Java 17.
Does this PR introduce any user-facing change?
Yes, Apache Spark will no longer support Java 8 and Java 11.
How was this patch tested?
Pass GitHub Actions.
Was this patch authored or co-authored using generative AI tooling?
No
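The Java 17 minimum described above can also be enforced at runtime. This is a hypothetical standalone check, not part of the Spark codebase; it only illustrates how a process could fail fast on an unsupported JVM:

```java
// Hypothetical check (not from the Spark codebase): verify that the running
// JVM satisfies the Java 17 minimum required by Apache Spark 4.0.
public class JavaVersionCheck {
    static int featureVersion() {
        // Runtime.version().feature() returns the major ("feature") version,
        // e.g. 17 on a Java 17 JVM. Available on Java 10+.
        return Runtime.version().feature();
    }

    public static void main(String[] args) {
        int v = featureVersion();
        if (v < 17) {
            throw new IllegalStateException(
                "Apache Spark 4.0 requires Java 17 or newer, but found Java " + v);
        }
        System.out.println("Java " + v + ": OK");
    }
}
```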