[SPARK-44112][BUILD][INFRA][DOCS] Drop support for Java 8 and Java 11 #43005

Closed · wants to merge 16 commits
Changes from 9 commits
4 changes: 2 additions & 2 deletions .github/workflows/benchmark.yml
@@ -27,9 +27,9 @@ on:
required: true
default: '*'
jdk:
description: 'JDK version: 8, 11, 17 or 21-ea'
description: 'JDK version: 17 or 21-ea'
required: true
default: '8'
default: '17'
scala:
description: 'Scala version: 2.12 or 2.13'
required: true
27 changes: 16 additions & 11 deletions .github/workflows/build_and_test.yml
@@ -25,7 +25,7 @@ on:
java:
required: false
type: string
default: 8
default: 17
branch:
description: Branch to run the build against
required: false
@@ -207,6 +207,7 @@ jobs:
GITHUB_PREV_SHA: ${{ github.event.before }}
SPARK_LOCAL_IP: localhost
SKIP_PACKAGING: true
SCALA_PROFILE: scala2.13
steps:
- name: Checkout Spark repository
uses: actions/checkout@v3
@@ -385,6 +386,7 @@ jobs:
SKIP_PACKAGING: true
METASPACE_SIZE: 1g
BRANCH: ${{ inputs.branch }}
SCALA_PROFILE: scala2.13
steps:
- name: Checkout Spark repository
uses: actions/checkout@v3
@@ -494,6 +496,7 @@ jobs:
SPARK_LOCAL_IP: localhost
SKIP_MIMA: true
SKIP_PACKAGING: true
SCALA_PROFILE: scala2.13
steps:
- name: Checkout Spark repository
uses: actions/checkout@v3
@@ -650,11 +653,11 @@ jobs:
if [ -f ./dev/free_disk_space_container ]; then
./dev/free_disk_space_container
fi
- name: Install Java 8
- name: Install Java 17
Contributor Author commented:
I need to think about this part a bit more. I just recalled that branch-3.x will also use this yml, so we may need to add some if conditions.
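The if-condition idea from the comment above could look roughly like this; a minimal sketch, assuming the reusable workflow exposes a `branch` input (the step name and expression are illustrative, not taken from this PR):

```yaml
# Hypothetical: keep Java 8 for branch-3.x while master moves to 17.
- name: Install Java
  uses: actions/setup-java@v3
  with:
    distribution: temurin
    # GitHub Actions "ternary" idiom: condition && value-if-true || value-if-false
    java-version: ${{ startsWith(inputs.branch, 'branch-3') && '8' || '17' }}
```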

uses: actions/setup-java@v3
with:
distribution: temurin
java-version: 8
java-version: 17
- name: License test
run: ./dev/check-license
- name: Dependencies test
@@ -778,7 +781,6 @@ jobs:
fail-fast: false
matrix:
java:
- 11
- 17
- 21-ea
runs-on: ubuntu-22.04
@@ -865,11 +867,11 @@ jobs:
key: scala-213-coursier-${{ hashFiles('**/pom.xml', '**/plugins.sbt') }}
restore-keys: |
scala-213-coursier-
- name: Install Java 8
- name: Install Java 17
uses: actions/setup-java@v3
with:
distribution: temurin
java-version: 8
java-version: 17
- name: Build with SBT
run: |
./dev/change-scala-version.sh 2.13
@@ -885,6 +887,7 @@ jobs:
timeout-minutes: 300
env:
SPARK_LOCAL_IP: localhost
SCALA_PROFILE: scala2.13
steps:
- name: Checkout Spark repository
uses: actions/checkout@v3
@@ -916,11 +919,11 @@ jobs:
key: tpcds-coursier-${{ hashFiles('**/pom.xml', '**/plugins.sbt') }}
restore-keys: |
tpcds-coursier-
- name: Install Java 8
- name: Install Java 17
uses: actions/setup-java@v3
with:
distribution: temurin
java-version: 8
java-version: 17
- name: Cache TPC-DS generated data
id: cache-tpcds-sf-1
uses: actions/cache@v3
@@ -990,6 +993,7 @@ jobs:
ORACLE_DOCKER_IMAGE_NAME: gvenzl/oracle-xe:21.3.0
SKIP_MIMA: true
SKIP_PACKAGING: true
SCALA_PROFILE: scala2.13
steps:
- name: Checkout Spark repository
uses: actions/checkout@v3
@@ -1022,11 +1026,11 @@ jobs:
key: docker-integration-coursier-${{ hashFiles('**/pom.xml', '**/plugins.sbt') }}
restore-keys: |
docker-integration-coursier-
- name: Install Java 8
- name: Install Java 17
uses: actions/setup-java@v3
with:
distribution: temurin
java-version: 8
java-version: 17
- name: Run tests
run: |
./dev/run-tests --parallelism 1 --modules docker-integration-tests --included-tags org.apache.spark.tags.DockerTest
@@ -1110,7 +1114,8 @@ jobs:
kubectl create clusterrolebinding serviceaccounts-cluster-admin --clusterrole=cluster-admin --group=system:serviceaccounts || true
kubectl apply -f https://raw.githubusercontent.com/volcano-sh/volcano/v1.7.0/installer/volcano-development.yaml || true
eval $(minikube docker-env)
build/sbt -Psparkr -Pkubernetes -Pvolcano -Pkubernetes-integration-tests -Dspark.kubernetes.test.driverRequestCores=0.5 -Dspark.kubernetes.test.executorRequestCores=0.2 -Dspark.kubernetes.test.volcanoMaxConcurrencyJobNum=1 -Dtest.exclude.tags=local "kubernetes-integration-tests/test"
dev/change-scala-version.sh 2.13
build/sbt -Pscala-2.13 -Psparkr -Pkubernetes -Pvolcano -Pkubernetes-integration-tests -Dspark.kubernetes.test.driverRequestCores=0.5 -Dspark.kubernetes.test.executorRequestCores=0.2 -Dspark.kubernetes.test.volcanoMaxConcurrencyJobNum=1 -Dtest.exclude.tags=local "kubernetes-integration-tests/test"
- name: Upload Spark on K8S integration tests log files
if: ${{ !success() }}
uses: actions/upload-artifact@v3
4 changes: 2 additions & 2 deletions .github/workflows/build_ansi.yml
@@ -17,7 +17,7 @@
# under the License.
#

name: "Build / ANSI (master, Hadoop 3, JDK 8, Scala 2.12)"
name: "Build / ANSI (master, Hadoop 3, JDK 17, Scala 2.12)"

on:
schedule:
@@ -31,7 +31,7 @@ jobs:
uses: ./.github/workflows/build_and_test.yml
if: github.repository == 'apache/spark'
with:
java: 8
java: 17
branch: master
hadoop: hadoop3
envs: >-
4 changes: 2 additions & 2 deletions .github/workflows/build_coverage.yml
@@ -17,7 +17,7 @@
# under the License.
#

name: "Build / Coverage (master, Scala 2.12, Hadoop 3, JDK 8)"
name: "Build / Coverage (master, Scala 2.12, Hadoop 3, JDK 17)"

on:
schedule:
@@ -31,7 +31,7 @@ jobs:
uses: ./.github/workflows/build_and_test.yml
if: github.repository == 'apache/spark'
with:
java: 8
java: 17
branch: master
hadoop: hadoop3
envs: >-
49 changes: 0 additions & 49 deletions .github/workflows/build_java11.yml

This file was deleted.

4 changes: 2 additions & 2 deletions .github/workflows/build_rockdb_as_ui_backend.yml
@@ -17,7 +17,7 @@
# under the License.
#

name: "Build / RocksDB as UI Backend (master, Hadoop 3, JDK 8, Scala 2.12)"
name: "Build / RocksDB as UI Backend (master, Hadoop 3, JDK 17, Scala 2.12)"

on:
schedule:
@@ -31,7 +31,7 @@ jobs:
uses: ./.github/workflows/build_and_test.yml
if: github.repository == 'apache/spark'
with:
java: 8
java: 17
branch: master
hadoop: hadoop3
envs: >-
4 changes: 2 additions & 2 deletions .github/workflows/build_scala213.yml
@@ -17,7 +17,7 @@
# under the License.
#

name: "Build (master, Scala 2.13, Hadoop 3, JDK 8)"
name: "Build (master, Scala 2.13, Hadoop 3, JDK 17)"

on:
schedule:
@@ -31,7 +31,7 @@ jobs:
uses: ./.github/workflows/build_and_test.yml
if: github.repository == 'apache/spark'
with:
java: 8
java: 17
branch: master
hadoop: hadoop3
envs: >-
7 changes: 2 additions & 5 deletions .github/workflows/publish_snapshot.yml
@@ -32,9 +32,6 @@ jobs:
matrix:
branch:
- master
- branch-3.5
- branch-3.4
- branch-3.3
steps:
- name: Checkout Spark repository
uses: actions/checkout@v3
@@ -47,11 +44,11 @@
key: snapshot-maven-${{ hashFiles('**/pom.xml') }}
restore-keys: |
snapshot-maven-
- name: Install Java 8
- name: Install Java 17
uses: actions/setup-java@v3
with:
distribution: temurin
java-version: 8
java-version: 17
- name: Publish snapshot
env:
ASF_USERNAME: ${{ secrets.NEXUS_USER }}
61 changes: 61 additions & 0 deletions .github/workflows/publish_snapshot_java8.yml
@@ -0,0 +1,61 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#

name: Publish Snapshot using Java 8

on:
schedule:
- cron: '0 0 * * *'

jobs:
publish-snapshot:
if: github.repository == 'apache/spark'
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
branch:
- branch-3.5
- branch-3.4
- branch-3.3
steps:
- name: Checkout Spark repository
uses: actions/checkout@v3
with:
ref: ${{ matrix.branch }}
- name: Cache Maven local repository
uses: actions/cache@v3
with:
path: ~/.m2/repository
key: snapshot-maven-${{ hashFiles('**/pom.xml') }}
restore-keys: |
snapshot-maven-
- name: Install Java 8
uses: actions/setup-java@v3
with:
distribution: temurin
java-version: 8
- name: Publish snapshot
env:
ASF_USERNAME: ${{ secrets.NEXUS_USER }}
ASF_PASSWORD: ${{ secrets.NEXUS_PW }}
GPG_KEY: "not_used"
GPG_PASSPHRASE: "not_used"
GIT_REF: ${{ matrix.branch }}
run: ./dev/create-release/release-build.sh publish-snapshot
2 changes: 1 addition & 1 deletion docs/building-spark.md
@@ -27,7 +27,7 @@ license: |
## Apache Maven

The Maven-based build is the build of reference for Apache Spark.
Building Spark using Maven requires Maven 3.9.4 and Java 8/11/17.
Building Spark using Maven requires Maven 3.9.4 and Java 17.
Contributor Author commented:
Should we add "Support for Java 8/11 was removed in Spark 4.0.0."?

Spark requires Scala 2.12/2.13; support for Scala 2.11 was removed in Spark 3.0.0.

### Setting up Maven's Memory Usage
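Given the new Java 17 floor described above, a build wrapper might guard the JDK version up front. A minimal sketch, assuming a POSIX shell; the sample banner string stands in for real `java -version` output, and the commented-out mvn call is illustrative:

```shell
# Parse the major release out of a `java -version` banner and enforce Java 17+.
# In a real script you would capture it: banner=$(java -version 2>&1 | head -n 1)
banner='openjdk version "17.0.8" 2023-07-18'
feature=$(printf '%s\n' "$banner" | sed -n 's/.*version "\([0-9][0-9]*\)[._].*/\1/p')
# Legacy banners read: java version "1.8.0_371" -> major release 8
[ "$feature" = "1" ] && feature=8
if [ "$feature" -lt 17 ]; then
  echo "Java 17+ required, found $feature"
else
  echo "OK: Java $feature"
fi
# ./build/mvn -DskipTests clean package   # the actual build step from the docs
```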
5 changes: 2 additions & 3 deletions docs/index.md
@@ -34,12 +34,11 @@ source, visit [Building Spark](building-spark.html).

Spark runs on both Windows and UNIX-like systems (e.g. Linux, Mac OS), and it should run on any platform that runs a supported version of Java. This should include JVMs on x86_64 and ARM64. It's easy to run locally on one machine --- all you need is to have `java` installed on your system `PATH`, or the `JAVA_HOME` environment variable pointing to a Java installation.

Spark runs on Java 8/11/17, Scala 2.12/2.13, Python 3.8+, and R 3.5+.
Java 8 prior to version 8u371 support is deprecated as of Spark 3.5.0.
Spark runs on Java 17, Scala 2.12/2.13, Python 3.8+, and R 3.5+.
When using the Scala API, it is necessary for applications to use the same version of Scala that Spark was compiled for.
For example, when using Scala 2.13, use Spark compiled for 2.13, and compile code/applications for Scala 2.13 as well.

For Java 11, setting `-Dio.netty.tryReflectionSetAccessible=true` is required for the Apache Arrow library. This prevents the `java.lang.UnsupportedOperationException: sun.misc.Unsafe or java.nio.DirectByteBuffer.(long, int) not available` error when Apache Arrow uses Netty internally.
For Java 17, setting `-Dio.netty.tryReflectionSetAccessible=true` is required for the Apache Arrow library. This prevents the `java.lang.UnsupportedOperationException: sun.misc.Unsafe or java.nio.DirectByteBuffer.(long, int) not available` error when Apache Arrow uses Netty internally.

# Running the Examples and Shell

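The `-Dio.netty.tryReflectionSetAccessible=true` setting above has to reach both the driver and executor JVMs. One way to pass it, sketched here with a hypothetical application jar and class name:

```shell
./bin/spark-submit \
  --driver-java-options "-Dio.netty.tryReflectionSetAccessible=true" \
  --conf "spark.executor.extraJavaOptions=-Dio.netty.tryReflectionSetAccessible=true" \
  --class org.example.App \
  app.jar
```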
6 changes: 3 additions & 3 deletions pom.xml
@@ -112,7 +112,7 @@
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<java.version>1.8</java.version>
<java.version>17</java.version>
<maven.compiler.source>${java.version}</maven.compiler.source>
<maven.compiler.target>${java.version}</maven.compiler.target>
<maven.version>3.9.4</maven.version>
@@ -2933,7 +2933,7 @@
<arg>-deprecation</arg>
<arg>-feature</arg>
<arg>-explaintypes</arg>
<arg>-target:jvm-1.8</arg>
<arg>-target:17</arg>
<arg>-Xfatal-warnings</arg>
<arg>-Ywarn-unused:imports</arg>
<arg>-P:silencer:globalFilters=.*deprecated.*</arg>
@@ -3645,7 +3645,7 @@
<arg>-deprecation</arg>
<arg>-feature</arg>
<arg>-explaintypes</arg>
<arg>-target:jvm-1.8</arg>
<arg>-target:17</arg>
<arg>-Wconf:cat=deprecation:wv,any:e</arg>
<arg>-Wunused:imports</arg>
<!--