[SPARK-39620][WEB UI] Use same condition in history server page and API to filter applications

### What changes were proposed in this pull request?

Updated the REST API `/api/v1/applications` to use the same condition as the history server page to filter completed/incomplete applications.

### Why are the changes needed?

When opening the summary page, the history server follows this logic:

- If there is a completed/incomplete application, the page adds a script to the response that uses AJAX to call the REST API and fetch the filtered list.
- If there is no such application, the page only returns a message saying nothing was found.

The issue is that the page and the REST API use different conditions to filter applications. In `HistoryPage`, an application is considered completed as long as the last attempt is completed. But in `ApplicationListResource`, all attempts must be completed. This inconsistency causes an issue in a corner case.

In the driver, event queues have a bounded capacity to protect memory. When there are too many events, some of them are dropped and the event log file becomes incomplete. For an application with multiple attempts, it is possible that the last attempt is completed while a previous attempt is considered incomplete because its application-end event was lost. For this type of application, the page thinks it is completed, but the API thinks it is still running. When opening the summary page:

- When checking completed applications, the page calls the script, but the API returns nothing.
- When checking incomplete applications, the page returns nothing.

So the user is unable to see this application in the history server at all.

### Does this PR introduce _any_ user-facing change?

Yes, there is a change in the `/api/v1/applications` API and the history server summary page. When calling the API, the application described above was previously considered running; after the change it is considered completed, so the result differs under the same filter. This change should be OK, because attempts are executed sequentially and incrementally: if an attempt with a bigger ID is completed, the previous attempts can be considered completed as well. On the history server summary page, the user previously could not see the application; now it appears among the completed applications.

### How was this patch tested?

Added a new unit test, `HistoryServerPageSuite`, which checks whether `HistoryPage` behaves the same as `ApplicationListResource` when filtering applications. To implement the test, there is a minor change to `HistoryPage`: a method called `shouldDisplayApplications` is exposed to tell whether the summary page will display applications. The test verifies that:

- If no completed/incomplete application is found, `HistoryPage` should not display applications, and the API should return an empty list.
- Otherwise, `HistoryPage` should display applications, and the API should return a non-empty list.

Currently two scenarios are covered:

- Application with the last attempt completed but a previous attempt incomplete.
- Application with the last attempt incomplete but a previous attempt completed.

Closes #37008 from kuwii/kuwii/hs-fix.

Authored-by: kuwii <[email protected]>
Signed-off-by: Sean Owen <[email protected]>
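The mismatch described above can be sketched as two predicates over an application's attempt list. This is an illustrative model, not Spark's actual code: attempts are represented as booleans marking whether each attempt is completed.

```java
import java.util.List;

public class FilterMismatchDemo {
  // HistoryPage's view: completed as long as the LAST attempt is completed.
  public static boolean pageCompleted(List<Boolean> attempts) {
    return attempts.get(attempts.size() - 1);
  }

  // ApplicationListResource's view (before this PR): ALL attempts must be completed.
  public static boolean apiCompleted(List<Boolean> attempts) {
    return attempts.stream().allMatch(Boolean::booleanValue);
  }

  public static void main(String[] args) {
    // Corner case: the first attempt lost its end event, the last attempt completed.
    List<Boolean> attempts = List.of(false, true);
    System.out.println(pageCompleted(attempts)); // true  -> page treats it as completed
    System.out.println(apiCompleted(attempts));  // false -> API treats it as running
  }
}
```

Because the two predicates disagree on this input, the page renders the "completed" tab with a script whose API call returns an empty list, and the application is invisible in both tabs.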
Showing 29 changed files with 686 additions and 315 deletions.
New event-log test resources (large diffs, not rendered):

- `...st/resources/spark-events-broken/last-attempt-incomplete/application_1656321732247_0006_1` (10 additions, 0 deletions)
- `...st/resources/spark-events-broken/last-attempt-incomplete/application_1656321732247_0006_2` (9 additions, 0 deletions)
- `...esources/spark-events-broken/previous-attempt-incomplete/application_1656321732247_0006_1` (9 additions, 0 deletions)
- `...esources/spark-events-broken/previous-attempt-incomplete/application_1656321732247_0006_2` (10 additions, 0 deletions)
103 changes: 103 additions & 0 deletions — `core/src/test/scala/org/apache/spark/deploy/history/HistoryServerPageSuite.scala`

```scala
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.spark.deploy.history

import java.net.URL
import javax.servlet.http.HttpServletResponse

import org.json4s.DefaultFormats
import org.json4s.JsonAST._
import org.json4s.jackson.JsonMethods.parse
import org.scalatest.BeforeAndAfter

import org.apache.spark.{SparkConf, SparkFunSuite}
import org.apache.spark.internal.config.History._
import org.apache.spark.internal.config.Tests._
import org.apache.spark.status.api.v1.ApplicationStatus
import org.apache.spark.util.Utils

class HistoryServerPageSuite extends SparkFunSuite with BeforeAndAfter {
  private implicit val format: DefaultFormats.type = DefaultFormats

  private val logDirs = Seq(
    getTestResourcePath("spark-events-broken/previous-attempt-incomplete"),
    getTestResourcePath("spark-events-broken/last-attempt-incomplete")
  )

  private var server: Option[HistoryServer] = None
  private val localhost: String = Utils.localHostNameForURI()
  private var port: Int = -1

  private def startHistoryServer(logDir: String): Unit = {
    assert(server.isEmpty)
    val conf = new SparkConf()
      .set(HISTORY_LOG_DIR, logDir)
      .set(UPDATE_INTERVAL_S.key, "0")
      .set(IS_TESTING, true)
    val provider = new FsHistoryProvider(conf)
    provider.checkForLogs()
    val securityManager = HistoryServer.createSecurityManager(conf)
    val _server = new HistoryServer(conf, provider, securityManager, 18080)
    _server.bind()
    provider.start()
    server = Some(_server)
    port = _server.boundPort
  }

  private def stopHistoryServer(): Unit = {
    server.foreach(_.stop())
    server = None
  }

  private def callApplicationsAPI(requestedIncomplete: Boolean): Seq[JObject] = {
    val param = if (requestedIncomplete) {
      ApplicationStatus.RUNNING.toString.toLowerCase()
    } else {
      ApplicationStatus.COMPLETED.toString.toLowerCase()
    }
    val (code, jsonOpt, errOpt) = HistoryServerSuite.getContentAndCode(
      new URL(s"http://$localhost:$port/api/v1/applications?status=$param")
    )
    assert(code == HttpServletResponse.SC_OK)
    assert(jsonOpt.isDefined)
    assert(errOpt.isEmpty)
    parse(jsonOpt.get).extract[List[JObject]]
  }

  override def afterEach(): Unit = {
    super.afterEach()
    stopHistoryServer()
  }

  test("SPARK-39620: should behave the same as REST API when filtering applications") {
    logDirs.foreach { logDir =>
      startHistoryServer(logDir)
      val page = new HistoryPage(server.get)
      Seq(true, false).foreach { requestedIncomplete =>
        val apiResponse = callApplicationsAPI(requestedIncomplete)
        if (page.shouldDisplayApplications(requestedIncomplete)) {
          assert(apiResponse.nonEmpty)
        } else {
          assert(apiResponse.isEmpty)
        }
      }
      stopHistoryServer()
    }
  }
}
```
One addition to a test-resources ignore list:

```diff
@@ -138,3 +138,4 @@ over10k
 exported_table/*
 ansible-for-test-node/*
 node_modules
+spark-events-broken/*
```
62 changes: 62 additions & 0 deletions — `sql/catalyst/src/main/java/org/apache/spark/sql/connector/expressions/Extract.java`

```java
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.spark.sql.connector.expressions;

import java.io.Serializable;

import org.apache.spark.annotation.Evolving;

/**
 * Represents an extract function, which extracts and returns the value of a
 * specified datetime field from a datetime or interval value expression.
 * <p>
 * The currently supported field names follow the ISO standard:
 * <ol>
 *  <li> <code>SECOND</code> Since 3.4.0 </li>
 *  <li> <code>MINUTE</code> Since 3.4.0 </li>
 *  <li> <code>HOUR</code> Since 3.4.0 </li>
 *  <li> <code>MONTH</code> Since 3.4.0 </li>
 *  <li> <code>QUARTER</code> Since 3.4.0 </li>
 *  <li> <code>YEAR</code> Since 3.4.0 </li>
 *  <li> <code>DAY_OF_WEEK</code> Since 3.4.0 </li>
 *  <li> <code>DAY</code> Since 3.4.0 </li>
 *  <li> <code>DAY_OF_YEAR</code> Since 3.4.0 </li>
 *  <li> <code>WEEK</code> Since 3.4.0 </li>
 *  <li> <code>YEAR_OF_WEEK</code> Since 3.4.0 </li>
 * </ol>
 *
 * @since 3.4.0
 */
@Evolving
public class Extract implements Expression, Serializable {

  private final String field;
  private final Expression source;

  public Extract(String field, Expression source) {
    this.field = field;
    this.source = source;
  }

  public String field() { return field; }
  public Expression source() { return source; }

  @Override
  public Expression[] children() { return new Expression[]{ source() }; }
}
```
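A minimal self-contained sketch of how an `Extract` node is built and traversed. The `Expr` interface, `FieldRef` stand-in, and `ExtractExpr` below are local stubs mirroring the structure of the class above — they are not Spark's actual `Expression`/`FieldReference` API.

```java
// Local stub mirroring the connector Expression contract.
interface Expr {
  Expr[] children();
}

// Hypothetical stand-in for a column reference (not Spark's FieldReference).
class FieldRef implements Expr {
  private final String name;
  FieldRef(String name) { this.name = name; }
  public Expr[] children() { return new Expr[0]; }
  public String toString() { return name; }
}

// Mirrors the structure of the Extract class above: one field name, one source child.
class ExtractExpr implements Expr {
  private final String field;
  private final Expr source;
  ExtractExpr(String field, Expr source) { this.field = field; this.source = source; }
  public String field() { return field; }
  public Expr source() { return source; }
  public Expr[] children() { return new Expr[]{ source() }; }
}

public class ExtractDemo {
  public static void main(String[] args) {
    // Models EXTRACT(YEAR FROM ts) as a small expression tree.
    ExtractExpr e = new ExtractExpr("YEAR", new FieldRef("ts"));
    System.out.println(e.field() + " from " + e.source()); // prints "YEAR from ts"
  }
}
```

The single-child `children()` array is what lets generic tree-walking code (e.g. expression translation in a data source) visit the source expression without knowing the node type.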