
[#3555] feat(spark-connector): support multi scala version #3608

Merged (7 commits) on May 29, 2024

Conversation

FANNG1 (Contributor) commented May 28, 2024

What changes were proposed in this pull request?

Support multiple Scala versions (2.12 and 2.13) for the spark-connector.

Why are the changes needed?

Fix: #3555

Does this PR introduce any user-facing change?

no

How was this patch tested?

  1. Existing ITs and UTs.
  2. On my local machine, tested Spark SQL queries against Spark 3.4 with Scala 2.13 and Spark 3.5 with Scala 2.13 (see the example command below).
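
As a rough illustration only (no such command appears in this PR), a local test run of that kind might combine the spark-connector test task used in the CI workflow further below with the scalaVersion project property introduced in settings.gradle.kts; the exact invocation is an assumption:

# Hypothetical local run: Spark 3.5 connector tests against Scala 2.13
./gradlew :spark-connector:spark-3.5:test -PscalaVersion=2.13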

FANNG1 marked this pull request as draft May 28, 2024 07:44
FANNG1 changed the title from "[SIP] support multi scala" to "[#3555] feat(spark-connector): support multi scala version" May 28, 2024
@@ -57,6 +57,7 @@ jobs:
matrix:
architecture: [linux/amd64]
java-version: [ 8, 11, 17 ]
scala-version: [ 2.12, 2.13 ]
Contributor (author): will change to 2.12 when everything is ready to merge.

Contributor: I think 2.12 only should be enough; otherwise we will have too many CI pipelines.

Contributor (author): done.

@@ -239,6 +239,7 @@ public org.apache.spark.sql.connector.expressions.Transform[] toSparkTransform(
return sparkTransforms.toArray(new org.apache.spark.sql.connector.expressions.Transform[0]);
}

@SuppressWarnings("deprecation")
Contributor (author): JavaConverters.seqAsJavaList is deprecated in Scala 2.13.
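
For context, a minimal sketch of the pattern this suppression covers (not the PR's actual code; the class and method names here are made up): a Java helper that converts a Scala Seq to a Java List.

import java.util.List;
import scala.collection.JavaConverters;
import scala.collection.Seq;

public final class ScalaCollectionCompat {

  private ScalaCollectionCompat() {}

  // seqAsJavaList is available in both Scala 2.12 and 2.13, but the whole
  // JavaConverters object is deprecated as of 2.13, so compiling against 2.13
  // emits a deprecation warning; suppressing it keeps one source tree
  // building for both Scala versions.
  @SuppressWarnings("deprecation")
  public static <T> List<T> toJavaList(Seq<T> seq) {
    return JavaConverters.seqAsJavaList(seq);
  }
}

The 2.13 replacement, scala.jdk.javaapi.CollectionConverters, does not exist in Scala 2.12, so code that cross-builds for both versions keeps the deprecated call and suppresses the warning.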

FANNG1 marked this pull request as ready for review May 28, 2024 13:57
FANNG1 requested a review from jerryshao May 28, 2024 13:57
FANNG1 (Contributor, author) commented May 28, 2024

@jerryshao @qqqttt123 please help review when you have time.

jerryshao (Contributor)

Generally LGTM. @FANNG1, can you address the comment I mentioned above?

jerryshao previously approved these changes May 29, 2024
./gradlew --rerun-tasks -PskipTests -PtestMode=${{ matrix.test-mode }} -PjdkVersion=${{ matrix.java-version }} :spark-connector:spark-3.3:test --tests "com.datastrato.gravitino.spark.connector.integration.test.**"
./gradlew --rerun-tasks -PskipTests -PtestMode=${{ matrix.test-mode }} -PjdkVersion=${{ matrix.java-version }} :spark-connector:spark-3.4:test --tests "com.datastrato.gravitino.spark.connector.integration.test.**"
./gradlew --rerun-tasks -PskipTests -PtestMode=${{ matrix.test-mode }} -PjdkVersion=${{ matrix.java-version }} :spark-connector:spark-3.5:test --tests "com.datastrato.gravitino.spark.connector.integration.test.**"
if [ "${{ matrix.scala-version }}" == "2.12" ];then
Collaborator: if we only support 2.12 in CI, it seems unnecessary to add this check.

Contributor (author): I prefer to keep this; we may need to support 2.13 later.

Collaborator: OK.

FANNG1 (Contributor, author) commented May 29, 2024

CI failed for HDFSKerberosIT; retriggering it.

@@ -8,6 +8,10 @@ plugins {

rootProject.name = "gravitino"

val scalaVersion: String = gradle.startParameter.projectProperties["scalaVersion"]?.toString()
?: settings.extra["defaultScalaVersion"]?.toString()
?: "2.12"
Collaborator: why not just define defaultScalaVersion directly in the settings file? It seems unnecessary to hardcode 2.12 here.

Contributor (author): defaultScalaVersion is defined in gradle.properties; removed the hardcoded 2.12.
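
For illustration, a sketch of what the resolution might look like after that change, assuming gradle.properties carries a line such as defaultScalaVersion = 2.12 (this is not the exact merged code):

// settings.gradle.kts (sketch): an explicit -PscalaVersion wins, e.g.
//   ./gradlew build -PscalaVersion=2.13
// otherwise fall back to defaultScalaVersion declared in gradle.properties.
val scalaVersion: String =
    gradle.startParameter.projectProperties["scalaVersion"]
        ?: settings.extra["defaultScalaVersion"].toString()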

FANNG1 (Contributor, author) commented May 29, 2024

@caican00 comments are addressed, please review again.

caican00 (Collaborator)

LGTM from my side.

FANNG1 (Contributor, author) commented May 29, 2024

@jerryshao do you have any comments?

jerryshao (Contributor)

I don't have further comments.

jerryshao merged commit d24b437 into apache:main May 29, 2024
33 checks passed
github-actions bot pushed a commit that referenced this pull request May 29, 2024
diqiu50 pushed a commit to diqiu50/gravitino that referenced this pull request Jun 13, 2024
Labels: none yet
Projects: none yet
Development: successfully merging this pull request may close the linked issue "[Subtask] support multi scala version".
3 participants