Resolve comments #120
Workflow: build_main.yml (on: push)
Jobs:
- Run / Check changes (45s)
- Run / Breaking change detection with Buf (branch-3.5) (56s)
- Run / Run TPC-DS queries with SF=1 (48m 37s)
- Run / Run Docker integration tests (29m 10s)
- Run / Run Spark on Kubernetes Integration test (1h 14m)
- Matrix: Run / build
- Matrix: Run / java-other-versions
- Run / Build modules: sparkr (37m 12s)
- Run / Linters, licenses, dependencies and documentation generation (27m 39s)
- Matrix: Run / pyspark
Annotations
16 errors and 2 warnings
- Run / Build modules: streaming, sql-kafka-0-10, streaming-kafka-0-10, yarn, kubernetes, hadoop-cloud, spark-ganglia-lgpl, connect, protobuf
  Process completed with exit code 18.
- Run / Linters, licenses, dependencies and documentation generation
  Process completed with exit code 1.
- Run / Build modules: api, catalyst, hive-thriftserver
  Process completed with exit code 18.
- Run / Run Spark on Kubernetes Integration test
  HashSet() did not contain "decomtest-8568338b9db2f602-exec-1".
- Run / Run Spark on Kubernetes Integration test
  HashSet() did not contain "decomtest-231cd78b9db40aaf-exec-1".
- Run / Run Spark on Kubernetes Integration test
  sleep interrupted
- Run / Run Spark on Kubernetes Integration test
  sleep interrupted
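The "sleep interrupted" annotations are the standard message a HotSpot JVM attaches to the `InterruptedException` thrown when a thread blocked in `Thread.sleep` is interrupted, here most likely because the suite tears down while helper threads are still waiting. A minimal standalone Java sketch (no Spark involved) reproduces the message:

```java
public class SleepInterruptDemo {
    // Interrupting a thread blocked in Thread.sleep yields an
    // InterruptedException whose message on HotSpot is "sleep interrupted".
    static String interruptSleepingThread() {
        final String[] msg = new String[1];
        Thread t = new Thread(() -> {
            try {
                Thread.sleep(60_000);            // long sleep, will be interrupted
            } catch (InterruptedException e) {
                msg[0] = e.getMessage();
            }
        });
        t.start();
        // Wait until the thread is actually parked inside sleep().
        while (t.getState() != Thread.State.TIMED_WAITING) {
            Thread.onSpinWait();
        }
        t.interrupt();
        try { t.join(); } catch (InterruptedException ignored) {}
        return msg[0];
    }

    public static void main(String[] args) {
        System.out.println(interruptSleepingThread());
    }
}
```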
- Run / Run Spark on Kubernetes Integration test
  Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$675/0x00007f24345c0460@7090c35e rejected from java.util.concurrent.ThreadPoolExecutor@b8986b2[Shutting down, pool size = 4, active threads = 2, queued tasks = 0, completed tasks = 308]
- Run / Run Spark on Kubernetes Integration test
  Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$675/0x00007f24345c0460@3618384b rejected from java.util.concurrent.ThreadPoolExecutor@b8986b2[Shutting down, pool size = 3, active threads = 1, queued tasks = 0, completed tasks = 309]
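The two "rejected from java.util.concurrent.ThreadPoolExecutor[Shutting down, ...]" errors reflect the generic JDK thread-pool contract: once `shutdown()` is called, new task submissions are refused with a `RejectedExecutionException` while already-accepted work drains. A minimal sketch of that behavior (plain JDK, not the fabric8 SerialExecutor itself):

```java
import java.util.concurrent.*;

public class RejectedSubmitDemo {
    // After shutdown() the pool is in the "Shutting down" state seen in the
    // annotation: running tasks finish, but new submissions are rejected.
    static boolean submitAfterShutdown() {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        pool.submit(() -> {
            try { Thread.sleep(200); } catch (InterruptedException ignored) {}
        });
        pool.shutdown();                         // enter "Shutting down"
        boolean rejected = false;
        try {
            pool.submit(() -> {});               // new work is refused
        } catch (RejectedExecutionException e) {
            rejected = true;
        }
        try {
            pool.awaitTermination(5, TimeUnit.SECONDS);
        } catch (InterruptedException ignored) {}
        return rejected;
    }

    public static void main(String[] args) {
        System.out.println(submitAfterShutdown() ? "rejected" : "accepted");
    }
}
```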
- Run / Run Spark on Kubernetes Integration test
  HashSet() did not contain "decomtest-25a6588b9dc9f807-exec-1".
- Run / Run Spark on Kubernetes Integration test
  HashSet() did not contain "decomtest-10e8d78b9dcb10df-exec-1".
- Run / Run Spark on Kubernetes Integration test
  HashSet() did not contain "decomtest-e3b7488b9dcf0da2-exec-1".
- Run / Run Spark on Kubernetes Integration test
  Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-4a7910de8aa041309d743521dd0cbac2-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-4a7910de8aa041309d743521dd0cbac2-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={})..
- UnsafeArrayWriterSuite.SPARK-40403: don't print negative number when array is too big (UnsafeArrayWriterSuite#L32)
  org.scalatest.exceptions.TestFailedException: Expected exception org.apache.spark.SparkIllegalArgumentException to be thrown, but java.lang.AssertionError was thrown
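SPARK-40403 concerns a byte-size computation for very large arrays: doing the arithmetic in 32-bit int wraps past `Integer.MAX_VALUE` into a negative number, which is why the test asserts the error message must not print a negative size. A minimal sketch of the overflow itself (illustrative int math, not the actual UnsafeArrayWriter code):

```java
public class OverflowDemo {
    // 32-bit size arithmetic wraps past Integer.MAX_VALUE (2_147_483_647).
    static int overflowedSizeInBytes(int numElements) {
        return numElements * 8;              // wraps negative for large inputs
    }

    // Widening to long before multiplying keeps the true size.
    static long correctSizeInBytes(int numElements) {
        return (long) numElements * 8;
    }

    public static void main(String[] args) {
        int numElements = 300_000_000;       // 2.4 GB worth of 8-byte elements
        System.out.println(overflowedSizeInBytes(numElements)); // negative
        System.out.println(correctSizeInBytes(numElements));    // 2400000000
    }
}
```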
- SparkConnectServiceE2ESuite.SPARK-45133 query should reach FINISHED state when results are not consumed (SparkConnectServiceE2ESuite#L185)
  io.grpc.StatusRuntimeException: INTERNAL: java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.sql.connect.service.SparkConnectSessionHolderSuite.beforeAll(SparkConnectSessionHolderSuite.scala:37)
org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:212)
org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:69)
org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:321)
org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:517)
sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:414)
java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
java.base/java.lang.Thread.run(Thread.java:840)
The currently active SparkContext was created at:
org.apache.spark.sql.connect.service.SparkConnectServiceE2ESuite.beforeAll(SparkConnectServiceE2ESuite.scala:27)
org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:212)
org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:69)
org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:321)
org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:517)
sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:414)
java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
java.base/java.lang.Thread.run(Thread.java:840)
- SparkConnectServiceE2ESuite.SPARK-45133 local relation should reach FINISHED state when results are not consumed (SparkConnectServiceE2ESuite#L197)
  org.apache.spark.SparkException: com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.sql.connect.service.SparkConnectSessionHolderSuite.beforeAll(SparkConnectSessionHolderSuite.scala:37)
org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:212)
org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:69)
org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:321)
org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:517)
sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:414)
java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
java.base/java.lang.Thread.run(Thread.java:840)
The currently active SparkContext was created at:
org.apache.spark.sql.connect.service.SparkConnectServiceE2ESuite.beforeAll(SparkConnectServiceE2ESuite.scala:27)
org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:212)
org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:69)
org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:321)
org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:517)
sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:414)
java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
java.base/java.lang.Thread.run(Thread.java:840)
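Both SPARK-45133 failures share the root cause visible in the traces: the SparkContext the E2E suite is using was stopped by a different suite (SparkConnectSessionHolderSuite's `beforeAll`/teardown), so later calls fail with "Cannot call methods on a stopped SparkContext." The pattern can be sketched without Spark, using a hypothetical `Context` class (not Spark's implementation):

```java
public class SharedContextDemo {
    // Hypothetical stand-in for a context shared across test suites.
    static class Context {
        private volatile boolean stopped = false;
        void stop() { stopped = true; }
        String sql(String query) {
            if (stopped) {
                throw new IllegalStateException(
                    "Cannot call methods on a stopped SparkContext.");
            }
            return "ok";
        }
    }

    public static void main(String[] args) {
        Context shared = new Context();   // created in one suite's beforeAll
        shared.stop();                    // another suite stops it during teardown
        try {
            shared.sql("select 1");       // the E2E suite's call then fails
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage());
        }
    }
}
```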
- Run / Build modules: pyspark-errors
  No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
- Run / Build modules: pyspark-core, pyspark-streaming
  No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
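The two warnings come from the artifact-upload step: the `**/target/test-reports/*.xml` glob matched nothing because those pyspark jobs produced no XML reports. In actions/upload-artifact this behavior is governed by the `if-no-files-found` input; a hypothetical step illustrating it (not the actual build_main.yml contents):

```yaml
# Hypothetical upload step, for illustration only.
# if-no-files-found accepts: warn (default, produces this annotation),
# error (fail the step), or ignore (stay silent).
- name: Upload test reports
  if: always()
  uses: actions/upload-artifact@v4
  with:
    name: test-results
    path: "**/target/test-reports/*.xml"
    if-no-files-found: warn
```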
Artifacts
Produced during runtime

Name | Size | Status
---|---|---
test-results-api, catalyst, hive-thriftserver--17-hadoop3-hive2.3 | 2.61 MB | Expired
test-results-core, unsafe, kvstore, avro, utils, network-common, network-shuffle, repl, launcher, examples, sketch, graphx--17-hadoop3-hive2.3 | 133 KB | Expired
test-results-docker-integration--17-hadoop3-hive2.3 | 119 KB | Expired
test-results-hive-- other tests-17-hadoop3-hive2.3 | 911 KB | Expired
test-results-hive-- slow tests-17-hadoop3-hive2.3 | 853 KB | Expired
test-results-mllib-local,mllib--17-hadoop3-hive2.3 | 1.31 MB | Expired
test-results-pyspark-connect--17-hadoop3-hive2.3 | 411 KB | Expired
test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect--17-hadoop3-hive2.3 | 1.14 MB | Expired
test-results-pyspark-pandas--17-hadoop3-hive2.3 | 1.46 MB | Expired
test-results-pyspark-pandas-connect-part0--17-hadoop3-hive2.3 | 1.32 MB | Expired
test-results-pyspark-pandas-connect-part1--17-hadoop3-hive2.3 | 1.42 MB | Expired
test-results-pyspark-pandas-connect-part2--17-hadoop3-hive2.3 | 953 KB | Expired
test-results-pyspark-pandas-connect-part3--17-hadoop3-hive2.3 | 530 KB | Expired
test-results-pyspark-pandas-slow--17-hadoop3-hive2.3 | 2.86 MB | Expired
test-results-pyspark-sql, pyspark-resource, pyspark-testing--17-hadoop3-hive2.3 | 399 KB | Expired
test-results-sparkr--17-hadoop3-hive2.3 | 280 KB | Expired
test-results-sql-- extended tests-17-hadoop3-hive2.3 | 2.97 MB | Expired
test-results-sql-- other tests-17-hadoop3-hive2.3 | 4.26 MB | Expired
test-results-sql-- slow tests-17-hadoop3-hive2.3 | 2.77 MB | Expired
test-results-streaming, sql-kafka-0-10, streaming-kafka-0-10, yarn, kubernetes, hadoop-cloud, spark-ganglia-lgpl, connect, protobuf--17-hadoop3-hive2.3 | 287 KB | Expired
test-results-tpcds--17-hadoop3-hive2.3 | 21.8 KB | Expired
unit-tests-log-api, catalyst, hive-thriftserver--17-hadoop3-hive2.3 | 8.39 MB | Expired
unit-tests-log-streaming, sql-kafka-0-10, streaming-kafka-0-10, yarn, kubernetes, hadoop-cloud, spark-ganglia-lgpl, connect, protobuf--17-hadoop3-hive2.3 | 3.18 MB | Expired