[SPARK-20401][DOC] In the spark official configuration document, the 'spark.driver.supervise' configuration parameter specification and default values are necessary. #17696
Conversation
…ucceeded|failed|unknown]
…remove redundant description.
….driver.supervise' configuration parameter specification and default values are necessary.
Not a big deal. What do you think about wrapping the command you ran with backquotes? I think this makes the PR description more readable.
@HyukjinKwon
I will probably test it by myself later. Thanks for taking my comment into account.
Could you take a screenshot? I would like to see how it displays on your computer.
Yup, I will let you know when I make it. I think this does not block this PR BTW :).
Test build #3666 has finished for PR 17696 at commit
docs/configuration.md
Outdated
<td><code>spark.driver.supervise</code></td>
<td>false</td>
<td>
If value set true, restarts the driver on failure, make sure that
This sentence needs to be simplified. "If true, restarts the driver automatically if it fails with a non-zero exit status."
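As an aside, the restart semantics being documented here can be illustrated with a toy supervise loop. This is a sketch only: Spark's actual supervision is implemented inside the standalone Master, and the `supervise` helper below is hypothetical, not a Spark API.

```python
import subprocess
import sys

def supervise(cmd, max_restarts=3):
    """Re-run `cmd` until it exits with status 0, up to `max_restarts` retries.

    Toy illustration of "restart the driver automatically if it fails with a
    non-zero exit status" -- the behavior spark.driver.supervise=true requests.
    """
    restarts = 0
    while True:
        code = subprocess.call(cmd)
        if code == 0 or restarts >= max_restarts:
            return code
        restarts += 1  # non-zero exit: restart, as a supervised driver would be

# A command that succeeds immediately needs no restart.
ok = [sys.executable, "-c", "raise SystemExit(0)"]
print(supervise(ok))  # prints: 0
```

A command that always fails would be retried `max_restarts` times and then give up, returning its last exit status.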
@guoxiaolongzte, I just tested the backquotes on Windows 7 with Chrome as below: It is a backquote
docs/configuration.md
Outdated
<td>
If value set true, restarts the driver on failure, make sure that
the driver is automatically restarted if it fails with non-zero exit code.
Only in Spark standalone or Mesos with cluster deploy mode.
I am sorry for asking such a question @srowen. Is it the right sentence in English?
It'd be better to say "Only has effect in Spark standalone mode or Mesos cluster deploy mode."
OK, I will fix it. Thank you.
@HyukjinKwon
Test build #3671 has finished for PR 17696 at commit
Merged to master/2.2
…'spark.driver.supervise' configuration parameter specification and default values are necessary.

Author: 郭小龙 10207633 <[email protected]>
Author: guoxiaolong <[email protected]>
Author: guoxiaolongzte <[email protected]>

Closes #17696 from guoxiaolongzte/SPARK-20401.

(cherry picked from commit ad29040)
Signed-off-by: Sean Owen <[email protected]>
What changes were proposed in this pull request?
Submit the Spark job via the REST interface, e.g.:
```shell
curl -X POST http://10.43.183.120:6066/v1/submissions/create \
  --header "Content-Type:application/json;charset=UTF-8" \
  --data '{
  "action": "CreateSubmissionRequest",
  "appArgs": [
    "myAppArgument"
  ],
  "appResource": "/home/mr/gxl/test.jar",
  "clientSparkVersion": "2.2.0",
  "environmentVariables": {
    "SPARK_ENV_LOADED": "1"
  },
  "mainClass": "cn.zte.HdfsTest",
  "sparkProperties": {
    "spark.jars": "/home/mr/gxl/test.jar",
    "spark.driver.supervise": "true",
    "spark.app.name": "HdfsTest",
    "spark.eventLog.enabled": "false",
    "spark.submit.deployMode": "cluster",
    "spark.master": "spark://10.43.183.120:6066"
  }
}'
```
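The JSON body above can also be assembled programmatically before POSTing it. A minimal sketch, assuming the same host, jar path, and class name as the example; the helper name `build_submission` is hypothetical, not a Spark API:

```python
import json

def build_submission(app_resource, main_class, master, extra_props=None):
    """Build a CreateSubmissionRequest body like the curl example above."""
    props = {
        "spark.jars": app_resource,
        "spark.driver.supervise": "true",  # ask the master to restart a failed driver
        "spark.submit.deployMode": "cluster",
        "spark.master": master,
    }
    props.update(extra_props or {})
    return json.dumps({
        "action": "CreateSubmissionRequest",
        "appArgs": ["myAppArgument"],
        "appResource": app_resource,
        "clientSparkVersion": "2.2.0",
        "environmentVariables": {"SPARK_ENV_LOADED": "1"},
        "mainClass": main_class,
        "sparkProperties": props,
    })

body = build_submission("/home/mr/gxl/test.jar", "cn.zte.HdfsTest",
                        "spark://10.43.183.120:6066")
print(json.loads(body)["sparkProperties"]["spark.driver.supervise"])  # prints: true
```

For the same effect without the REST API, `spark-submit` accepts a `--supervise` flag in standalone and Mesos cluster deploy modes.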
I want to make sure that the driver is automatically restarted if it fails with a non-zero exit code.
But I cannot find the 'spark.driver.supervise' configuration parameter specification and default value in the official Spark documentation.
How was this patch tested?
Manual tests.
Please review http://spark.apache.org/contributing.html before opening a pull request.