fix md #104
build_main.yml (on: push)
| Job | Duration |
|---|---|
| Run / Check changes | 47s |
| Run / Breaking change detection with Buf (branch-3.5) | 1m 11s |
| Run / Run TPC-DS queries with SF=1 | 1h 16m |
| Run / Run Docker integration tests | 43m 22s |
| Run / Run Spark on Kubernetes Integration test | 1h 16m |
| Matrix: Run / build | |
| Matrix: Run / java-other-versions | |
| Run / Build modules: sparkr | 40m 58s |
| Run / Linters, licenses, dependencies and documentation generation | 2h 7m |
| Matrix: Run / pyspark | |
Annotations
16 errors and 1 warning
Errors:

- Run / Build modules: pyspark-connect
  - Process completed with exit code 19.
- Run / Run Spark on Kubernetes Integration test (10 errors; the two executor rejections are illustrated in a sketch at the end of this section)
  - HashSet() did not contain "decomtest-6da97a8b419cc78e-exec-1".
  - HashSet() did not contain "decomtest-b5435f8b419de38d-exec-1".
  - sleep interrupted
  - sleep interrupted
  - Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$679/0x00007ff9f85c36c8@213980c7 rejected from java.util.concurrent.ThreadPoolExecutor@1598298c[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 320]
  - Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$679/0x00007ff9f85c36c8@55b64432 rejected from java.util.concurrent.ThreadPoolExecutor@1598298c[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 319]
  - HashSet() did not contain "decomtest-a3a8818b41b49f05-exec-1".
  - HashSet() did not contain "decomtest-6dbbd28b41b5c812-exec-1".
  - HashSet() did not contain "decomtest-e60ece8b41b9ea29-exec-1".
  - Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-fc7fc6bc3b3b4b3d86b362703d5b71f4-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-fc7fc6bc3b3b4b3d86b362703d5b71f4-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={}).
- Run / Build modules: streaming, sql-kafka-0-10, streaming-kafka-0-10, mllib-local, mllib, yarn, kubernetes, hadoop-cloud, spark-ganglia-lgpl, connect, protobuf
  - The runner has received a shutdown signal. This can happen when the runner service is stopped, or a manually started runner is canceled.
- python/pyspark/sql/tests/connect/test_parity_pandas_map.py.test_other_than_dataframe_iter
  - <_MultiThreadedRendezvous of RPC that terminated with:
      status = StatusCode.UNAVAILABLE
      details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:41893: Failed to connect to remote host: Connection refused"
      debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:41893: Failed to connect to remote host: Connection refused {created_time:"2023-10-18T07:33:15.31094869+00:00", grpc_status:14}"
    >
- python/pyspark/sql/tests/connect/test_parity_pandas_map.py.test_self_join
  - <_InactiveRpcError of RPC that terminated with:
      status = StatusCode.UNAVAILABLE
      details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:41893: Failed to connect to remote host: Connection refused"
      debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:41893: Failed to connect to remote host: Connection refused {created_time:"2023-10-18T07:43:26.722565695+00:00", grpc_status:14}"
    >
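Both test_parity_pandas_map failures have the same root cause: the Spark Connect Python client could not reach its gRPC server on 127.0.0.1:41893 (the server had evidently already gone down), so every RPC terminated with StatusCode.UNAVAILABLE. A minimal sketch of how a refused connection surfaces through grpcio, assuming nothing is listening on the probed port (the address is copied from the annotations above):

```python
import grpc

# Assumes nothing is listening on this address, as was evidently the case
# when the tests above ran (the port is taken from the annotation text).
channel = grpc.insecure_channel("127.0.0.1:41893")
try:
    # Wait briefly for the channel to connect; with no server listening,
    # readiness never arrives and the future times out.
    grpc.channel_ready_future(channel).result(timeout=3)
except grpc.FutureTimeoutError:
    print("no server on 127.0.0.1:41893; RPCs on this channel would fail")
finally:
    channel.close()

# An actual RPC attempt on such a channel raises an error carrying
# StatusCode.UNAVAILABLE and the "failed to connect to all addresses"
# detail string seen in both annotations above.
```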
Warning:

- Run / Build modules: pyspark-errors
  - No files were found with the provided path: `**/target/test-reports/*.xml`. No artifacts will be uploaded.
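The two SerialExecutor rejections in the Kubernetes integration test come from the fabric8 Kubernetes client submitting work to a Java ThreadPoolExecutor that is already shutting down during test teardown. As an illustration only (the real failure is JVM-side, raising RejectedExecutionException), a sketch of the same shutdown race using Python's concurrent.futures, where a late submit raises RuntimeError instead:

```python
from concurrent.futures import ThreadPoolExecutor

pool = ThreadPoolExecutor(max_workers=1)
pool.submit(print, "accepted while the pool is running")

# Begin shutdown, mirroring the "[Shutting down, ...]" state in the log.
pool.shutdown(wait=False)

try:
    # A task submitted after shutdown is rejected, analogous to the
    # RejectedExecutionException thrown by java.util.concurrent.ThreadPoolExecutor.
    pool.submit(print, "rejected")
except RuntimeError as exc:
    print(f"rejected: {exc}")  # "cannot schedule new futures after shutdown"
```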
Artifacts

Produced during runtime

| Name | Size | Status |
|---|---|---|
| site | 59.3 MB | Expired |
| test-results-catalyst, hive-thriftserver--17-hadoop3-hive2.3 | 2.79 MB | Expired |
| test-results-core, unsafe, kvstore, avro, network-common, network-shuffle, repl, launcher, examples, sketch, graphx--17-hadoop3-hive2.3 | 2.5 MB | Expired |
| test-results-docker-integration--17-hadoop3-hive2.3 | 119 KB | Expired |
| test-results-hive-- other tests-17-hadoop3-hive2.3 | 910 KB | Expired |
| test-results-hive-- slow tests-17-hadoop3-hive2.3 | 852 KB | Expired |
| test-results-pyspark-connect--17-hadoop3-hive2.3 | 267 KB | Expired |
| test-results-pyspark-core, pyspark-streaming--17-hadoop3-hive2.3 | 80.5 KB | Expired |
| test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect--17-hadoop3-hive2.3 | 1.31 MB | Expired |
| test-results-pyspark-pandas--17-hadoop3-hive2.3 | 1.14 MB | Expired |
| test-results-pyspark-pandas-connect-part0--17-hadoop3-hive2.3 | 1.06 MB | Expired |
| test-results-pyspark-pandas-connect-part1--17-hadoop3-hive2.3 | 971 KB | Expired |
| test-results-pyspark-pandas-connect-part2--17-hadoop3-hive2.3 | 637 KB | Expired |
| test-results-pyspark-pandas-connect-part3--17-hadoop3-hive2.3 | 326 KB | Expired |
| test-results-pyspark-pandas-slow--17-hadoop3-hive2.3 | 1.85 MB | Expired |
| test-results-pyspark-sql, pyspark-resource, pyspark-testing--17-hadoop3-hive2.3 | 405 KB | Expired |
| test-results-sparkr--17-hadoop3-hive2.3 | 280 KB | Expired |
| test-results-sql-- extended tests-17-hadoop3-hive2.3 | 2.96 MB | Expired |
| test-results-sql-- other tests-17-hadoop3-hive2.3 | 4.24 MB | Expired |
| test-results-sql-- slow tests-17-hadoop3-hive2.3 | 2.76 MB | Expired |
| test-results-tpcds--17-hadoop3-hive2.3 | 21.8 KB | Expired |
| unit-tests-log-pyspark-connect--17-hadoop3-hive2.3 | 1.28 GB | Expired |