[SPARK-10427][SQL] Respect -S option for spark-sql #8915
Conversation
@@ -174,8 +174,6 @@ private[hive] class IsolatedClientLoader(
       // Pre-reflective instantiation setup.
-      logDebug("Initializing the logger to avoid disaster...")
-      Thread.currentThread.setContextClassLoader(classLoader)
It is redundant here, since the contextClassLoader would be reset correctly within the constructor of ClientWrapper, and setting the contextClassLoader too early here would prevent us from retrieving, on the same thread, the other classloader that holds the SessionState containing the user's input options.
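For illustration, a minimal sketch of the swap-and-restore pattern this comment describes; the class and member names here are made up for the example, this is not Spark's actual ClientWrapper code:

```scala
// Illustrative sketch only: a constructor that swaps in an isolated
// classloader just for reflective setup, then restores the caller's loader.
class IsolatedClient(isolatedLoader: ClassLoader) {
  // Capture whatever loader the calling thread was using; it may be the
  // one that can reach the SessionState built from the CLI options.
  private val originalLoader = Thread.currentThread.getContextClassLoader
  try {
    // Only the reflective instantiation runs under the isolated loader.
    Thread.currentThread.setContextClassLoader(isolatedLoader)
    // ... reflectively instantiate Hive classes here ...
  } finally {
    // Restore the caller's loader so later code on this thread can still
    // reach the classes (and state) it loaded.
    Thread.currentThread.setContextClassLoader(originalLoader)
  }
}
```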
@@ -174,8 +174,6 @@ private[hive] class IsolatedClientLoader(
       // Pre-reflective instantiation setup.
-      logDebug("Initializing the logger to avoid disaster...")
-      Thread.currentThread.setContextClassLoader(classLoader)
This will probably cause a ClassNotFoundException or ClassCastException if we remove this line, since we do need to load the specified version of the Hive metastore jars and their dependencies.
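The concern is that class resolution goes through the thread's context classloader: if that loader cannot see the downloaded Hive metastore jars, lookups fail. A hypothetical, self-contained illustration of the failure mode (not Spark code):

```scala
import java.net.{URL, URLClassLoader}

// Hypothetical demo: classes resolved through the context classloader are
// only found if that loader can actually see the Hive metastore jars.
object ContextLoaderDemo {
  def main(args: Array[String]): Unit = {
    // URLs of the isolated Hive jars would go here; left empty on purpose.
    val hiveJars: Array[URL] = Array.empty
    val isolated = new URLClassLoader(hiveJars, null) // null parent = isolated

    Thread.currentThread.setContextClassLoader(isolated)
    val ctx = Thread.currentThread.getContextClassLoader
    // With an empty jar list this throws ClassNotFoundException -- the same
    // symptom expected if the setContextClassLoader call is simply removed.
    try ctx.loadClass("org.apache.hadoop.hive.conf.HiveConf")
    catch { case e: ClassNotFoundException => println(s"not found: ${e.getMessage}") }
  }
}
```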
Test build #43011 has finished for PR 8915 at commit
Test build #43014 has finished for PR 8915 at commit
@zhichao-li can you rebase to master?
Force-pushed from 4596424 to 7844933
@andrewor14 @liancheng Options like "-S" and "-v" are still discarded, since they are all stored within the SessionState, which gets reconstructed by HiveContext while creating the ClientWrapper. Somehow we need to pass the parameters from the sparkClassLoader to the MutableURLClassLoader.
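One hedged idea for the cross-loader handoff this comment asks about: types loaded by the JVM's bootstrap loader, such as java.lang.System, are shared by both classloaders, so JDK-level carriers like system properties can cross the boundary. A sketch under that assumption (the property key is invented for the example, not a real Spark configuration key):

```scala
// Sketch only: pass a CLI flag across classloader boundaries via a
// system property. "spark.sql.cli.silent" is a made-up key.
object CrossLoaderOptions {
  private val Key = "spark.sql.cli.silent"

  // Called by code living in one classloader (e.g. the CLI side).
  def publish(silent: Boolean): Unit =
    System.setProperty(Key, silent.toString)

  // Readable from code loaded by a different classloader, because
  // java.lang.System itself comes from the shared bootstrap loader.
  def read(): Boolean =
    System.getProperty(Key, "false").toBoolean
}
```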
Test build #47988 has finished for PR 8915 at commit
Force-pushed from 506cc13 to 7844933
Test build #47992 has finished for PR 8915 at commit
Test build #47994 has finished for PR 8915 at commit
@zhichao-li My feeling is that it's probably not worth so much effort for such a tiny feature...
@@ -263,6 +263,7 @@ private[hive] object SparkSQLCLIDriver extends Logging {
       }
revert the change
@liancheng Some of our customers complain that Spark SQL produces too many logs in their cron jobs when they migrate existing scripts from the legacy product, whereas Hive will suppress these logs when given the -S option.
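For context, a hedged sketch of how a CLI driver can honor that flag. SessionState.getIsSilent is Hive's accessor for the -S (silent) state; how Spark SQL threads that state through its classloaders is exactly what this PR is wrestling with, so treat the wiring here as illustrative:

```scala
import org.apache.hadoop.hive.ql.session.SessionState

object SilentAwareLogging {
  // Sketch: skip informational output when Hive's -S (silent) flag is set.
  // The null check covers callers that run before a session is attached.
  def logInfoUnlessSilent(state: SessionState, msg: String): Unit = {
    if (state == null || !state.getIsSilent) {
      Console.err.println(msg)
    }
  }
}
```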
Test build #48148 has finished for PR 8915 at commit
Test build #48149 has finished for PR 8915 at commit
Test build #48152 has finished for PR 8915 at commit
retest this please.
Test build #48160 has finished for PR 8915 at commit
retest this please.
Test build #48173 has finished for PR 8915 at commit
retest this please.
Test build #48295 has finished for PR 8915 at commit
Thanks for the pull request. I'm going through a list of pull requests to cut them down, since the sheer number is breaking some of the tooling we have. Due to lack of activity on this pull request, I'm going to push a commit to close it. Feel free to reopen it or create a new one.
"-S" option is ignored by the conf setting logic at the moment, we need to pick up this to complete the "silent" function.
Before:
lizhichao@lizhichao-desktop:~/repo/spark/bin$ ./spark-sql -S -e "show databases"
SET hive.support.sql11.reserved.keywords=false
SET spark.sql.hive.version=1.2.1
SET spark.sql.hive.version=1.2.1
OK
default
After:
lizhichao@lizhichao-desktop:~/repo/spark/bin$ ./spark-sql -S -e "show databases"
default
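The transcripts above suggest the shape of the fix: the conf-setting loop should consult the silent flag before echoing each assignment. A minimal sketch of that gating, with invented names rather than Spark's actual code:

```scala
object SilentConfEcho {
  // Illustrative only: apply each conf entry, echoing "SET key=value"
  // unless -S (silent) was passed -- matching the "After" transcript above.
  def applyConfs(confs: Map[String, String], silent: Boolean)
                (set: (String, String) => Unit): Unit = {
    confs.foreach { case (k, v) =>
      set(k, v)                            // always apply the setting
      if (!silent) println(s"SET $k=$v")   // echo only in non-silent mode
    }
  }

  def main(args: Array[String]): Unit = {
    val silent = args.contains("-S")
    applyConfs(Map("spark.sql.hive.version" -> "1.2.1"), silent)((_, _) => ())
  }
}
```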