SPARK-1689: Spark application should die when removed by Master
scheduler.error() will mask the error if there are active tasks. Being removed is a cataclysmic event for Spark applications, and should probably be treated as such.

Author: Aaron Davidson <[email protected]>

Closes apache#832 from aarondav/i-love-u and squashes the following commits:

9f1200f [Aaron Davidson] SPARK-1689: Spark application should die when removed by Master

Conflicts:
	core/src/main/scala/org/apache/spark/scheduler/cluster/SparkDeploySchedulerBackend.scala
aarondav authored and markhamstra committed May 28, 2014
1 parent ee9b8a4 commit cecf946
Showing 1 changed file with 2 additions and 0 deletions.
core/src/main/scala/org/apache/spark/scheduler/cluster/SparkDeploySchedulerBackend.scala

@@ -81,6 +81,8 @@ private[spark] class SparkDeploySchedulerBackend(
     if (!stopping) {
       logError("Spark cluster looks dead, giving up.")
       scheduler.error("Spark cluster looks down")
+      // Ensure the application terminates, as we can no longer run jobs.
+      sc.stop()
     }
   }
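To illustrate why the added sc.stop() matters, here is a minimal, self-contained Scala sketch. MockScheduler and MockSparkContext are hypothetical stand-ins for Spark's real types, not the actual API; the mock error() mirrors the masking behavior described in the commit message, where the error is swallowed while tasks are still active, so only sc.stop() reliably terminates the application.

```scala
// Hypothetical mocks (not Spark's real classes) illustrating the fix:
// scheduler.error() may be masked while tasks are active, so dead()
// must also call sc.stop() to guarantee the application terminates.
class MockScheduler {
  var activeTasks = 1      // simulate an active task masking the error
  var errorReported = false
  def error(message: String): Unit = {
    // Mirrors the masking described in the commit message: the error
    // is only surfaced when no tasks are running.
    if (activeTasks == 0) errorReported = true
  }
}

class MockSparkContext {
  var stopped = false
  def stop(): Unit = { stopped = true }
}

object DeadClusterDemo {
  // Sketch of the dead() handler after the fix.
  def dead(scheduler: MockScheduler, sc: MockSparkContext, stopping: Boolean): Unit = {
    if (!stopping) {
      scheduler.error("Spark cluster looks down")
      // Ensure the application terminates, as we can no longer run jobs.
      sc.stop()
    }
  }

  def main(args: Array[String]): Unit = {
    val scheduler = new MockScheduler
    val sc = new MockSparkContext
    dead(scheduler, sc, stopping = false)
    // The error was masked by the active task, but stop() still fired.
    assert(!scheduler.errorReported)
    assert(sc.stopped)
    println("application stopped: " + sc.stopped)
  }
}
```

Without the sc.stop() line, the demo's context would never be stopped, matching the pre-fix behavior where a removed application could linger with a masked error.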

