GH-38060: [Python][CI] Upgrade Spark versions (#38082)
### Rationale for this change

Spark released 3.5.0, and its main branch dropped support for Java 8 and 11.

### What changes are included in this PR?

* Upgrade the Spark integration tests from v3.4.1 to v3.5.0
* Use JDK 17 for the Spark master (latest) job

### Are these changes tested?

These changes will be verified in CI.

### Are there any user-facing changes?

No
* Closes: #38060

Authored-by: Dane Pitkin <[email protected]>
Signed-off-by: Sutou Kouhei <[email protected]>
danepitkin authored Oct 6, 2023
1 parent f525b99 commit 3697bcd
Showing 1 changed file with 4 additions and 3 deletions.
7 changes: 4 additions & 3 deletions in dev/tasks/tasks.yml

```diff
@@ -1611,9 +1611,9 @@ tasks:
       image: conda-python-hdfs
 {% endfor %}

-{% for python_version, spark_version, test_pyarrow_only, numpy_version in [("3.8", "v3.4.1", "false", "latest"),
-                                                                           ("3.10", "v3.4.1", "false", "1.23"),
-                                                                           ("3.11", "master", "false", "latest")] %}
+{% for python_version, spark_version, test_pyarrow_only, numpy_version, jdk_version in [("3.8", "v3.5.0", "false", "latest", "8"),
+                                                                                       ("3.10", "v3.5.0", "false", "1.23", "8"),
+                                                                                       ("3.11", "master", "false", "latest", "17")] %}
   test-conda-python-{{ python_version }}-spark-{{ spark_version }}:
     ci: github
     template: docker-tests/github.linux.yml
@@ -1623,6 +1623,7 @@ tasks:
       SPARK: "{{ spark_version }}"
       TEST_PYARROW_ONLY: "{{ test_pyarrow_only }}"
       NUMPY: "{{ numpy_version }}"
+      JDK: "{{ jdk_version }}"
     # use the branch-3.0 of spark, so prevent reusing any layers
     flags: --no-leaf-cache
     image: conda-python-spark
```
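The Jinja `{% for %}` loop above expands each configuration tuple into one CI task with its own environment variables. A minimal Python sketch of that expansion (illustrative only; the variable names mirror the template, but this is not the actual crossbow rendering code):

```python
# The new test matrix from dev/tasks/tasks.yml: each tuple is
# (python_version, spark_version, test_pyarrow_only, numpy_version, jdk_version).
matrix = [
    ("3.8", "v3.5.0", "false", "latest", "8"),
    ("3.10", "v3.5.0", "false", "1.23", "8"),
    ("3.11", "master", "false", "latest", "17"),
]

# Expand the matrix into task-name -> environment mappings, the way the
# template's for-loop with tuple unpacking does.
tasks = {}
for python_version, spark_version, test_pyarrow_only, numpy_version, jdk_version in matrix:
    name = f"test-conda-python-{python_version}-spark-{spark_version}"
    tasks[name] = {
        "PYTHON": python_version,
        "SPARK": spark_version,
        "TEST_PYARROW_ONLY": test_pyarrow_only,
        "NUMPY": numpy_version,
        "JDK": jdk_version,
    }

for name, env in tasks.items():
    print(name, "JDK=" + env["JDK"])
```

Note that only the master-branch job moves to JDK 17; the pinned v3.5.0 jobs stay on JDK 8, which Spark 3.5.0 releases still support.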
