GH-36332: [CI][Java] Patch spark to use Netty 4.1.94.Final on our integration tests #36640
Conversation
@github-actions crossbow submit spark
Revision: 44ec8f8
Submitted crossbow builds: ursacomputing/crossbow @ actions-e71df5add3
This is not going to be enough because, from my understanding, Spark also pins the versions used with Hadoop here:
This PR is motivated by #36332
LGTM.
cc @BryanCutler
@kiszk if I understand correctly, you are suggesting that we also patch the versions here: https://github.com/apache/spark/blob/master/dev/deps/spark-deps-hadoop-3-hive-2.3#L186-L203 to see whether the job succeeds with the changed versions (at least on Spark master)?
@raulcd Sorry for confusing you. I am fine with this PR.
But the Spark jobs still fail unless we also patch those. I mean, the current change doesn't seem to be enough to make our CI integration jobs with Spark succeed, as seen in the crossbow report comment here: #36640 (comment)
I think this is just used by the dependency change check (like a golden file), so it will not affect the actual Netty version that gets used.
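(Side note, as a hedged sketch: if that manifest did need to be kept in sync after changing the Netty version, Spark ships tooling to regenerate it; the script name and flag below are assumptions based on Spark's dev tooling and should be verified on the branch being patched.)

```bash
# Hypothetical follow-up if the golden dependency manifest also had to be updated:
# regenerate the dev/deps files after changing the Netty version.
# (Script name and flag assumed from Spark's dev tooling; verify on the target branch.)
cd ${spark_dir}
./dev/test-dependencies.sh --replace-manifest
```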
@@ -45,6 +45,13 @@ export MAVEN_OPTS="${MAVEN_OPTS} -Dorg.slf4j.simpleLogger.log.org.apache.maven.c

pushd ${spark_dir}

# Due to CVE-2023-34462 we upgraded to a newer netty version which is incompatible
# with previous spark versions. Patch the pom to use the newer version.
sed -i.bak -E -e \
I ran this `sed` statement locally, but I found that the `netty.version` in the `pom.xml` is still 4.1.93.Final.
Maybe we can test `build/mvn versions:set-property -Dproperty=netty.version -DnewVersion=4.1.94.Final -DgenerateBackupPoms=false`
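To illustrate where that suggestion could slot in, here is a minimal sketch, assuming it replaces the `sed` patch inside the same CI script; the surrounding `pushd`/`popd` of `${spark_dir}` is taken from the diff above, the rest is an assumption, not the change actually made in this PR.

```bash
# Hypothetical alternative to the sed patch: let the Maven versions plugin
# rewrite the netty.version property in Spark's pom instead of editing it with sed.
pushd ${spark_dir}
build/mvn versions:set-property \
  -Dproperty=netty.version \
  -DnewVersion=4.1.94.Final \
  -DgenerateBackupPoms=false
popd
```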
Hmm, this line may also be saying to use 4.1.93.Final
Thanks, I'll investigate, I might have messed up on my tests :)
I am closing this PR, as upgrading to Netty 4.1.96, which reverts the regression introduced in 4.1.94, fixed the issue.
Rationale for this change
It does seem that the only way to override the Netty version is to modify the pom for previous Spark versions.
What changes are included in this PR?
Try to patch the Netty version in the pom when cloning Spark.
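For illustration, a minimal sketch of what such a patch could look like. The actual `sed` expression is truncated in the diff above; the `netty.version` property name, the root `pom.xml` location, and the exact regex are assumptions here.

```bash
# Hypothetical sketch of the pom patch: bump the <netty.version> property in
# Spark's root pom.xml right after cloning. Assumes the property is pinned to
# an older 4.1.x release; the real expression used in this PR is not shown in full above.
pushd ${spark_dir}
sed -i.bak -E \
  -e 's|<netty\.version>[0-9.]+Final</netty\.version>|<netty.version>4.1.94.Final</netty.version>|' \
  pom.xml
popd
```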
Are these changes tested?
Archery integration tests
Are there any user-facing changes?
No