[SPARK-44113][BUILD][INFRA][DOCS] Drop support for Scala 2.12
### What changes were proposed in this pull request?

The main purpose of this PR is to remove support for Scala 2.12 in Apache Spark 4.0. The specific work includes:

1. Ran `dev/change-scala-version.sh 2.13` to change the Scala version in all `pom.xml` files to 2.13.
2. Cleaned up the Scala 2.12-related configuration in the parent `pom.xml` and made the Scala 2.13 configuration the default.
3. Cleaned up the Scala 2.12 configuration in `SparkBuild.scala`.
4. Cleaned up the support for Scala 2.12 in `python/run-tests.py` and `dev/run-tests.py`.
5. Updated the Scala 2.12-related sections in `docs/building-spark.md`, `docs/index.md`, `docs/spark-connect-overview.md`, `docs/storage-openstack-swift.md`, and `docs/_config.yml` to Scala 2.13.
6. Updated the Scala 2.12-related parts of `dev/test-dependencies.sh`, `dev/scalafmt`, `dev/mima`, and `dev/lint-scala` to Scala 2.13.
7. Removed the support for Scala 2.12 from `dev/change-scala-version.sh` and cleaned up its invocations in the Spark code.
8. Updated `dev/deps/spark-deps-hadoop-3-hive-2.3` and `LICENSE-binary`.
9. Removed the `scala-213` task from `build_and_test.yml`, because the daily tests of other branches do not run this task and the master branch already uses Scala 2.13 by default.
10. Replaced Scala 2.12 with Scala 2.13 in the `name` of the following `.yml` files: `build_ansi.yml`, `build_coverage.yml`, `build_java11.yml`, `build_java17.yml`, `build_java21.yml`, `build_maven.yml`, and `build_rockdb_as_ui_backend.yml`.
11. Removed the support for Scala 2.12 in `benchmark.yml`.
12. Moved files from the `src/scala-2.13/` directory to `src/main/scala` and deleted all files in the `src/scala-2.12/` directory (a sketch of the kind of divergence these directories handled follows this description).
13. Commented out the code related to multiple Scala versions in `load-spark-env.cmd` and `load-spark-env.sh`.
14. Cleaned up redundant `build-helper-maven-plugin` configurations from `core/pom.xml`, `repl/pom.xml`, `sql/api/pom.xml`, `sql/catalyst/pom.xml`, `sql/core/pom.xml`, and `connector/connect/common/pom.xml`.

### Why are the changes needed?

The minimum supported Scala version for Apache Spark 4.0 is Scala 2.13.

### Does this PR introduce _any_ user-facing change?

Yes, Apache Spark will no longer support Scala 2.12.

### How was this patch tested?

Pass GitHub Actions.

### Was this patch authored or co-authored using generative AI tooling?

No

Closes #43008 from LuciferYang/SPARK-44113.

Lead-authored-by: yangjie01 <[email protected]>
Co-authored-by: YangJie <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
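Item 12 removes the last version-specific source directories. As a minimal, hypothetical Scala sketch (not code from the Spark repository), this is the kind of 2.12 vs 2.13 source incompatibility that such parallel directories are typically used to isolate; with 2.12 gone, only the 2.13 form needs to live under `src/main/scala`:

```scala
// Hypothetical example only; names are illustrative and not from Spark.
object CollectionCompat {

  // In Scala 2.13, scala.Seq is scala.collection.immutable.Seq, so an Array
  // has to be converted explicitly; in 2.12, scala.Seq was the general
  // scala.collection.Seq and the implicit ArrayOps wrapper was accepted.
  def firstOrZero(xs: Seq[Int]): Int = xs.headOption.getOrElse(0)

  def firstOfArray(arr: Array[Int]): Int = firstOrZero(arr.toSeq)

  // Scala 2.13 replaces scala.collection.JavaConverters (the 2.12 import)
  // with scala.jdk.CollectionConverters.
  import scala.jdk.CollectionConverters._

  def toJavaList(xs: Seq[String]): java.util.List[String] = xs.asJava
}
```

Files whose 2.12 and 2.13 forms differed in ways like the above were kept in separate source trees; this commit keeps only the 2.13 variants.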
1 parent db02469 · commit 3429202
Showing 96 changed files with 326 additions and 1,849 deletions.
Changes to `.github/workflows/build_and_test.yml`:
@@ -86,7 +86,7 @@ jobs:
 sparkr=`./dev/is-changed.py -m sparkr`
 tpcds=`./dev/is-changed.py -m sql`
 docker=`./dev/is-changed.py -m docker-integration-tests`
-# 'build', 'scala-213', and 'java-other-versions' are always true for now.
+# 'build' and 'java-other-versions' are always true for now.
 # It does not save significant time and most of PRs trigger the build.
 precondition="
 {
@@ -95,7 +95,6 @@ jobs:
 \"sparkr\": \"$sparkr\",
 \"tpcds-1g\": \"$tpcds\",
 \"docker-integration-tests\": \"$docker\",
-\"scala-213\": \"true\",
 \"java-other-versions\": \"true\",
 \"lint\" : \"true\",
 \"k8s-integration-tests\" : \"true\",
@@ -828,53 +827,6 @@ jobs:
 ./build/mvn $MAVEN_CLI_OPTS -DskipTests -Pyarn -Pmesos -Pkubernetes -Pvolcano -Phive -Phive-thriftserver -Phadoop-cloud -Djava.version=${JAVA_VERSION/-ea} install
 rm -rf ~/.m2/repository/org/apache/spark
-scala-213:
-needs: precondition
-if: fromJson(needs.precondition.outputs.required).scala-213 == 'true'
-name: Scala 2.13 build with SBT
-runs-on: ubuntu-22.04
-timeout-minutes: 300
-steps:
-- name: Checkout Spark repository
-uses: actions/checkout@v3
-with:
-fetch-depth: 0
-repository: apache/spark
-ref: ${{ inputs.branch }}
-- name: Sync the current branch with the latest in Apache Spark
-if: github.repository != 'apache/spark'
-run: |
-git fetch https://github.com/$GITHUB_REPOSITORY.git ${GITHUB_REF#refs/heads/}
-git -c user.name='Apache Spark Test Account' -c user.email='[email protected]' merge --no-commit --progress --squash FETCH_HEAD
-git -c user.name='Apache Spark Test Account' -c user.email='[email protected]' commit -m "Merged commit" --allow-empty
-- name: Cache Scala, SBT and Maven
-uses: actions/cache@v3
-with:
-path: |
-build/apache-maven-*
-build/scala-*
-build/*.jar
-~/.sbt
-key: build-${{ hashFiles('**/pom.xml', 'project/build.properties', 'build/mvn', 'build/sbt', 'build/sbt-launch-lib.bash', 'build/spark-build-info') }}
-restore-keys: |
-build-
-- name: Cache Coursier local repository
-uses: actions/cache@v3
-with:
-path: ~/.cache/coursier
-key: scala-213-coursier-${{ hashFiles('**/pom.xml', '**/plugins.sbt') }}
-restore-keys: |
-scala-213-coursier-
-- name: Install Java 8
-uses: actions/setup-java@v3
-with:
-distribution: zulu
-java-version: 8
-- name: Build with SBT
-run: |
-./dev/change-scala-version.sh 2.13
-./build/sbt -Pyarn -Pmesos -Pkubernetes -Pvolcano -Phive -Phive-thriftserver -Phadoop-cloud -Pkinesis-asl -Pdocker-integration-tests -Pkubernetes-integration-tests -Pspark-ganglia-lgpl -Pscala-2.13 compile Test/compile
 # Any TPC-DS related updates on this job need to be applied to tpcds-1g-gen job of benchmark.yml as well
 tpcds-1g:
 needs: precondition
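With the `scala-213` job removed and Scala 2.13 as the only supported version, a downstream project consuming Spark 4.x artifacts must itself build against Scala 2.13. A minimal, hypothetical `build.sbt` sketch (the Spark version string below is illustrative and not fixed by this commit):

```scala
// Hypothetical downstream build: Spark 4.x ships only _2.13 artifacts.
ThisBuild / scalaVersion := "2.13.12"

libraryDependencies ++= Seq(
  // %% appends the Scala binary version, so these resolve to the _2.13 jars.
  "org.apache.spark" %% "spark-core" % "4.0.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "4.0.0" % "provided"
)
```

The `scalaVersion` value is also illustrative; any 2.13.x release supported by the chosen Spark build will do.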