[SPARK-22950][SQL] Handle ChildFirstURLClassLoader's parent #20145
Conversation
Test build #85653 has finished for PR 20145 at commit
Test build #85654 has finished for PR 20145 at commit
Thread.currentThread().setContextClassLoader(loader)
intercept[IllegalArgumentException](
  HiveUtils.newClientForMetadata(conf, SparkHadoopUtil.newConfiguration(conf)))
Thread.currentThread().setContextClassLoader(contextClassLoader)
We should use try ... finally to make sure we always set back the context class loader.
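For illustration, a minimal sketch of the suggested shape, reusing the names from the diff above (conf, loader, and contextClassLoader are assumed to be set up by the surrounding test):

  try {
    Thread.currentThread().setContextClassLoader(loader)
    intercept[IllegalArgumentException](
      HiveUtils.newClientForMetadata(conf, SparkHadoopUtil.newConfiguration(conf)))
  } finally {
    // runs even if the call above throws, so later tests see the original classloader
    Thread.currentThread().setContextClassLoader(contextClassLoader)
  }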
ok
val loader = new ChildFirstURLClassLoader(Array(), contextClassLoader)
Thread.currentThread().setContextClassLoader(loader)
HiveUtils.newClientForMetadata(conf, SparkHadoopUtil.newConfiguration(conf))
Thread.currentThread().setContextClassLoader(contextClassLoader)
ditto
/**
 * A Fake [[ChildFirstURLClassLoader]] used for test
 */
private[spark] class FakeChildFirstURLClassLoader(urls: Array[URL], parent: ClassLoader)
This is not a ChildFirstURLClassLoader, shall we make it extend ChildFirstURLClassLoader?
Inheritance would make the case match: if the fake extended ChildFirstURLClassLoader, the type pattern would treat it as a real one, which would defeat the purpose of the fake here.
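A hypothetical illustration of that point with toy classes (not Spark code): a type pattern also matches subclasses, so a fake that extended ChildFirstURLClassLoader would be picked up by the special-case handling and could no longer stand in for an unhandled classloader.

  class SpecialLoader extends ClassLoader
  class FakeLoader extends SpecialLoader // plays the role of a subclassed fake

  def describe(cl: ClassLoader): String = cl match {
    case _: SpecialLoader => "special-cased" // also matches FakeLoader
    case _ => "generic handling"
  }

  assert(describe(new FakeLoader) == "special-cased")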
@@ -42,4 +47,41 @@ class HiveUtilsSuite extends QueryTest with SQLTestUtils with TestHiveSingleton
    assert(hiveConf("foo") === "bar")
  }
}

  test("ChildFirstURLClassLoader's parent is null") {
what is this test trying to test?
It just shows that if we don't handle the child-first classloader's parent, we get caught by an exception. If it bothers you, I can delete it.
I think we can remove it. The test case below would fail without your PR, which already proves your patch.
Test build #85666 has finished for PR 20145 at commit
Test build #85663 has finished for PR 20145 at commit
retest this please
Test build #85670 has finished for PR 20145 at commit
thanks, merging to master/2.3!
Author: Kent Yao <[email protected]> Closes #20145 from yaooqinn/SPARK-22950. (cherry picked from commit 9fa703e) Signed-off-by: Wenchen Fan <[email protected]>
What changes were proposed in this pull request?
ChildFirstURLClassLoader's parent is set to null, so we can't get jars from its parent. This will cause a ClassNotFoundException during HiveClient initialization with the builtin Hive jars, where we should use the Spark classloader instead.
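As a rough sketch of the kind of handling this describes (the helper name allJars and the use of Utils.getSparkClassLoader are illustrative assumptions, not necessarily the exact merged code), the jar collection can special-case ChildFirstURLClassLoader, whose getParent reports null even though it wraps a real parent:

  import java.net.{URL, URLClassLoader}
  import org.apache.spark.util.{ChildFirstURLClassLoader, Utils}

  // Collect jar URLs up the classloader chain. A ChildFirstURLClassLoader
  // reports a null parent, so fall back to the Spark classloader instead of
  // stopping there and later failing to find the builtin Hive jars.
  def allJars(classLoader: ClassLoader): Array[URL] = classLoader match {
    case null => Array.empty[URL]
    case childFirst: ChildFirstURLClassLoader =>
      childFirst.getURLs() ++ allJars(Utils.getSparkClassLoader)
    case urlClassLoader: URLClassLoader =>
      urlClassLoader.getURLs() ++ allJars(urlClassLoader.getParent)
    case other => allJars(other.getParent)
  }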
How was this patch tested?
Add new unit tests.
cc @cloud-fan @gatorsmile