ZEPPELIN-256 Self diagnosis spark configuration #246
Conversation
This branch rebased #244.
Force-pushed from 9fb62ef to 3d77ede.
Ready to be reviewed and merged.
if (!pysparkFound) {
  error += "pyspark.zip or SPARK_HOME/python directory is not found";
I think the message should be something like "xxxxxxx not found".
That would be very helpful during Zeppelin setup! LGTM
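For illustration only, the wording could name the path that was actually checked; pysparkPath below is a hypothetical variable, not something in the patch:

  if (!pysparkFound) {
    // pysparkPath is hypothetical: the concrete location that was checked
    error += pysparkPath + " not found";
  }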
@@ -329,6 +358,23 @@ public static String getSystemDefault(

@Override
public void open() {
  if (diagnosis()) {
    sparkConfValidator = new SparkConfValidator(
        System.getenv("SPARK_HOME"),
Should we get SPARK_HOME like above (line 303)?
getSystemDefault("SPARK_HOME", "spark.home", null);
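For clarity, the suggested change would make the constructor call look roughly like this; only the first argument is shown, as in the diff above, and the remaining arguments are assumed unchanged from the patch:

  if (diagnosis()) {
    // Resolve SPARK_HOME the same way as line 303: environment variable first,
    // then the spark.home property, with no default value.
    sparkConfValidator = new SparkConfValidator(
        getSystemDefault("SPARK_HOME", "spark.home", null),
        /* remaining arguments unchanged */ ...);
  }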
Force-pushed from 1b714db to e00f0f1.
Let's bring this up to date?
add view only mode
@Leemoonsoo Any plan on finishing this one?
close #83 close #86 close #125 close #133 close #139 close #146 close #193 close #203 close #246 close #262 close #264 close #273 close #291 close #299 close #320 close #347 close #389 close #413 close #423 close #543 close #560 close #658 close #670 close #728 close #765 close #777 close #782 close #783 close #812 close #822 close #841 close #843 close #878 close #884 close #918 close #989 close #1076 close #1135 close #1187 close #1231 close #1304 close #1316 close #1361 close #1385 close #1390 close #1414 close #1422 close #1425 close #1447 close #1458 close #1466 close #1485 close #1492 close #1495 close #1497 close #1536 close #1545 close #1561 close #1577 close #1600 close #1603 close #1678 close #1695 close #1739 close #1748 close #1765 close #1767 close #1776 close #1783 close #1799
https://issues.apache.org/jira/browse/ZEPPELIN-256
This patch adds a self-diagnosis capability for the Spark interpreter configuration.
Users can enable the feature by setting the "zeppelin.spark.diagnosis" property to "true" on the Interpreter page.
When it is turned on, SparkConfValidator checks that the combination of environment variables and properties is valid: it verifies that the directories (e.g. SPARK_HOME) and files (e.g. pyspark.zip) referenced by the configuration actually exist.
When it detects a problem, it prints the error as a result message on the notebook screen, so the user can see what is missing.
Later, once SparkConfValidator is proven to produce no false positives, we can turn this feature on by default.
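For readers unfamiliar with the patch, here is a minimal, self-contained sketch of the kind of checks described above. It is not the SparkConfValidator from this patch; the class name, structure, and exact messages are illustrative assumptions.

  import java.io.File;

  public class SparkConfCheckSketch {
    public static void main(String[] args) {
      StringBuilder error = new StringBuilder();

      // SPARK_HOME must be set and point to an existing directory.
      String sparkHome = System.getenv("SPARK_HOME");
      if (sparkHome == null || !new File(sparkHome).isDirectory()) {
        error.append("SPARK_HOME is not set or is not a directory\n");
      } else {
        // pyspark needs either a pyspark*.zip under python/lib or the python directory itself.
        File libDir = new File(sparkHome, "python/lib");
        File pythonDir = new File(sparkHome, "python");
        File[] zips = libDir.listFiles(
            (dir, name) -> name.startsWith("pyspark") && name.endsWith(".zip"));
        boolean pysparkFound = (zips != null && zips.length > 0) || pythonDir.isDirectory();
        if (!pysparkFound) {
          error.append("pyspark.zip or SPARK_HOME/python directory is not found\n");
        }
      }

      // The real validator would surface this text as the paragraph result in the notebook.
      System.out.println(error.length() == 0
          ? "Spark configuration looks valid"
          : error.toString());
    }
  }

In the patch itself, checks like these only run when zeppelin.spark.diagnosis is set to "true", and their output appears as the notebook result message described above.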