
[SPARK-2284][UI] Mark all failed tasks as failures. #1224

Closed · wants to merge 1 commit

Conversation

@rxin (Contributor) commented Jun 26, 2014

Previously, only tasks that failed with an ExceptionFailure reason were marked as failures.

@rxin (Contributor, Author) commented Jun 26, 2014

@pwendell If still possible, this should go into 1.0.1.

```scala
    (None, Option(taskEnd.taskMetrics))
  case e: org.apache.spark.TaskEndReason =>
    stageIdToTasksFailed(sid) = stageIdToTasksFailed.getOrElse(sid, 0) + 1
    (None, None)
```
@rxin (Contributor, Author) commented on the diff:

I will submit a PR later to consolidate the error reporting (right now we have some pretty convoluted error reporting in the UI).
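The change above can be sketched as follows. This is a minimal, self-contained illustration of the pattern the PR introduces, not Spark's actual code: the types below (`TaskEndReason`, `Success`, `ExceptionFailure`, `TaskResultLost`, `FailureCounter`) are simplified stand-ins for the real classes in `org.apache.spark`. The point is that a catch-all `TaskEndReason` branch now increments the failure counter, where previously only the `ExceptionFailure` branch did.

```scala
import scala.collection.mutable

// Simplified stand-ins for Spark's task-end reasons (hypothetical types).
sealed trait TaskEndReason
case object Success extends TaskEndReason
case class ExceptionFailure(description: String) extends TaskEndReason
case object TaskResultLost extends TaskEndReason // any other failure kind

// Sketch of the listener's bookkeeping, mirroring JobProgressListener's map.
class FailureCounter {
  val stageIdToTasksFailed = mutable.HashMap[Int, Int]()

  def onTaskEnd(sid: Int, reason: TaskEndReason): Unit = reason match {
    case Success => () // successful tasks are not counted as failures
    case e: ExceptionFailure =>
      // Before the fix, this was the ONLY branch that counted a failure.
      stageIdToTasksFailed(sid) = stageIdToTasksFailed.getOrElse(sid, 0) + 1
    case _: TaskEndReason =>
      // After the fix, every other failure reason is counted too.
      stageIdToTasksFailed(sid) = stageIdToTasksFailed.getOrElse(sid, 0) + 1
  }
}
```

With this shape, a task ending in `TaskResultLost` (or any non-`Success` reason) bumps the stage's failed-task count, so the UI no longer under-reports failures.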

@AmplabJenkins

Merged build triggered.

@AmplabJenkins

Merged build started.

@AmplabJenkins

Merged build finished. All automated tests passed.

@AmplabJenkins

All automated tests passed.
Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/16137/

@aarondav (Contributor)

LGTM, feel free to merge.

```diff
@@ -185,12 +185,15 @@ class JobProgressListener(conf: SparkConf) extends SparkListener {

     val (failureInfo, metrics): (Option[ExceptionFailure], Option[TaskMetrics]) =
       taskEnd.reason match {
         case org.apache.spark.Success =>
```
A reviewer (Contributor) asked:

Why'd you make these fully qualified instead of importing the class? Is there a naming conflict?

@rxin (Contributor, Author) replied:

Success is too common ....


@rxin (Contributor, Author) commented Jun 26, 2014

Ok, I'm going to merge this into master & branch-1.0. Thanks for taking a look.

asfgit pushed a commit that referenced this pull request Jun 26, 2014
Previously only tasks failed with ExceptionFailure reason was marked as failure.

Author: Reynold Xin <[email protected]>

Closes #1224 from rxin/SPARK-2284 and squashes the following commits:

be79dbd [Reynold Xin] [SPARK-2284][UI] Mark all failed tasks as failures.

(cherry picked from commit 4a346e2)
Signed-off-by: Reynold Xin <[email protected]>
@asfgit asfgit closed this in 4a346e2 Jun 26, 2014
@rxin rxin deleted the SPARK-2284 branch August 13, 2014 08:02
xiliu82 pushed a commit to xiliu82/spark that referenced this pull request Sep 4, 2014
Previously only tasks failed with ExceptionFailure reason was marked as failure.

Author: Reynold Xin <[email protected]>

Closes apache#1224 from rxin/SPARK-2284 and squashes the following commits:

be79dbd [Reynold Xin] [SPARK-2284][UI] Mark all failed tasks as failures.
wangyum added a commit that referenced this pull request May 26, 2023
…EWED_PARTITION_THRESHOLD in HandleOuterJoinBuildSideSkew (#1224)

* [CARMEL-6542] Change ADVISORY_PARTITION_SIZE_IN_BYTES to SKEW_JOIN_SKEWED_PARTITION_THRESHOLD in HandleOuterJoinBuildSideSkew

* [CARMEL-6542] Change ADVISORY_PARTITION_SIZE_IN_BYTES to SKEW_JOIN_SKEWED_PARTITION_THRESHOLD in HandleOuterJoinBuildSideSkew

* Update HandleOuterJoinBuildSideSkewSuite.scala
4 participants