[NSE-382]Fix Hadoop version issue #388
Conversation
The failure seems to be because `${hadoop.version}` is not passed down from the parent pom when compiling the core module only.
```diff
 <properties>
   <scala.version>2.12.10</scala.version>
   <scala.binary.version>2.12</scala.binary.version>
   <spark.version>3.1.1</spark.version>
   <arrow.version>4.0.0</arrow.version>
   <arrow-memory.artifact>arrow-memory-netty</arrow-memory.artifact>
-  <hadoop.version>2.7.4</hadoop.version>
+  <hadoop.version>${hadoop.version}</hadoop.version>
```
This looks a bit odd to me; shouldn't this be the same as the original way?
It is not the same. The issue I found is that `${hadoop.version}` in the arrow datasource and columnar plugin modules didn't pick up the value from the parent pom. This change delivers `${hadoop.version}` to both sub-projects.
I used `mvn help:evaluate` to confirm the difference: with the original way, `${hadoop.version}` still evaluates to 2.7.4 even when `-Phadoop-3.2` is enabled.
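The verification described above can be sketched as a command line. This is a hedged example: the module path `arrow-data-source` is an assumption about this repository's layout, and `-DforceStdout` requires maven-help-plugin 3.1.0 or later.

```shell
# Print the effective value of hadoop.version as resolved inside a sub-module,
# with the hadoop-3.2 profile active. Adjust -pl to the actual module directory.
mvn help:evaluate -Dexpression=hadoop.version -q -DforceStdout \
    -Phadoop-3.2 -pl arrow-data-source
```

If the property is inherited correctly, the command should print the profile's Hadoop version rather than the default 2.7.4.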
I tried the same command here, and my Hadoop version is correct.
Should we also add the Maven version limit here, as Spark does?
https://github.com/apache/spark/blob/master/pom.xml#L118
Yes, we can do this by using the same Maven plugin. Let me add it and submit a new PR.
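The Maven version limit mentioned above can be enforced with the maven-enforcer-plugin, which is the mechanism the linked Spark pom uses. A minimal sketch for the parent pom's `<build><plugins>` section; the plugin version and the minimum Maven version shown here are illustrative, not taken from this repository:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>3.0.0-M3</version>
  <executions>
    <execution>
      <id>enforce-versions</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <!-- Fail the build early if an older Maven is used. -->
          <requireMavenVersion>
            <version>3.6.3</version>
          </requireMavenVersion>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With this in place, running the build under an older Maven fails fast with an enforcer error instead of producing a confusing compile failure later.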
I can reproduce the error when adding `-Pfull-scala-compiler`; if that parameter is removed, the compile runs successfully.
As for the NullPointerException: there is a potential issue with the `arrow-memory.artifact` setting, and we fix it in this PR as well.
What changes were proposed in this pull request?
How was this patch tested?
Tested on a local server.