[2023-08-31T10:28:17.143Z] [ERROR] /home/ubuntu/spark-rapids/integration_tests/src/main/scala/com/nvidia/spark/rapids/tests/scaletest/TestReport.scala:22: object json4s is not a member of package org
[2023-08-31T10:28:17.143Z] [ERROR] import org.json4s._
[2023-08-31T10:28:17.143Z] [ERROR] ^
[2023-08-31T10:28:17.143Z] [ERROR] /home/ubuntu/spark-rapids/integration_tests/src/main/scala/com/nvidia/spark/rapids/tests/scaletest/TestReport.scala:23: object json4s is not a member of package org
[2023-08-31T10:28:17.143Z] [ERROR] import org.json4s.jackson.Serialization.writePretty
[2023-08-31T10:28:17.143Z] [ERROR] ^
[2023-08-31T10:28:17.143Z] [ERROR] /home/ubuntu/spark-rapids/integration_tests/src/main/scala/com/nvidia/spark/rapids/tests/scaletest/TestReport.scala:41: not found: value DefaultFormats
[2023-08-31T10:28:17.143Z] [ERROR] implicit val formats = DefaultFormats
[2023-08-31T10:28:17.143Z] [ERROR] ^
[2023-08-31T10:28:17.143Z] [ERROR] /home/ubuntu/spark-rapids/integration_tests/src/main/scala/com/nvidia/spark/rapids/tests/scaletest/TestReport.scala:43: not found: value writePretty
[2023-08-31T10:28:17.143Z] [ERROR] os.write(writePretty(queryMetas).getBytes)
[2023-08-31T10:28:17.143Z] [ERROR] ^
[2023-08-31T10:28:17.143Z] [ERROR] /home/ubuntu/spark-rapids/integration_tests/src/main/scala/com/nvidia/spark/rapids/tests/scaletest/TestReport.scala:22: Unused import
[2023-08-31T10:28:17.143Z] [ERROR] import org.json4s._
[2023-08-31T10:28:17.143Z] [ERROR] ^
[2023-08-31T10:28:17.143Z] [ERROR] /home/ubuntu/spark-rapids/integration_tests/src/main/scala/com/nvidia/spark/rapids/tests/scaletest/TestReport.scala:23: Unused import
[2023-08-31T10:28:17.143Z] [ERROR] import org.json4s.jackson.Serialization.writePretty
[2023-08-31T10:28:17.143Z] [ERROR] ^
[2023-08-31T10:28:17.143Z] [ERROR] /home/ubuntu/spark-rapids/integration_tests/src/main/scala/com/nvidia/spark/rapids/tests/scaletest/TestReport.scala:41: local val formats in method save is never used
[2023-08-31T10:28:17.143Z] [ERROR] implicit val formats = DefaultFormats
[2023-08-31T10:28:17.143Z] [ERROR] ^
[2023-08-31T10:28:17.143Z] [ERROR] 7 errors found
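For context, the failing lines in TestReport.scala use the json4s jackson serializer roughly as in the minimal sketch below. The QueryMeta case class shape and field names are assumptions for illustration (only the queryMetas value appears in the log); the imports and the implicit DefaultFormats are the pieces that fail to resolve when json4s is missing from the classpath:

```scala
import org.json4s._
import org.json4s.jackson.Serialization.writePretty

// Hypothetical shape; the real case class lives in the scale-test sources.
case class QueryMeta(name: String, durationMs: Long)

object TestReportSketch {
  def render(queryMetas: Seq[QueryMeta]): String = {
    // DefaultFormats provides the implicit Formats that writePretty requires;
    // without the json4s jars on the compile classpath, none of this resolves,
    // which produces exactly the "not a member of package org" errors above.
    implicit val formats: Formats = DefaultFormats
    writePretty(queryMetas)
  }
}
```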
Steps/Code to reproduce bug
Please provide a list of steps or a code sample to reproduce the issue.
Avoid posting private or sensitive data.
Expected behavior
A clear and concise description of what you expected to happen.
Environment details (please complete the following information)
That's sort of what we need, but specifically for Databricks. In non-Databricks builds, the Maven artifacts and dependencies are explicit, so we pick up json4s "for free"; in Databricks builds those artifacts and dependencies are not published, so we have to manually add the jars we need to the classpath. json4s is already referenced in other poms (e.g., sql-plugin), and there is already a databricks profile in integration_tests that needs a similar update to fix this.
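A sketch of the kind of change described above, for the existing databricks profile in integration_tests/pom.xml. The exact artifact id, Scala binary version property, version property, and scope are assumptions here; the real values should be copied from how sql-plugin's pom references json4s:

```xml
<!-- integration_tests/pom.xml: declare json4s explicitly in the databricks
     profile, since Databricks does not publish it as a transitive dependency. -->
<profile>
  <id>databricks</id>
  <dependencies>
    <dependency>
      <groupId>org.json4s</groupId>
      <artifactId>json4s-jackson_${scala.binary.version}</artifactId>
      <version>${json4s.version}</version>
      <scope>provided</scope>
    </dependency>
  </dependencies>
</profile>
```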
How do I trigger the build to include a Databricks run?
Add [databricks] to the PR title before triggering a build.
I'll post a PR to unblock CI for databricks builds.
Describe the bug
Related to the newly merged #9089.
Additional context
Add any other context about the problem here.