What happened?
I added the Amoro spark-3.2-runtime-0.5.0.jar package to Spark's jars directory. When starting Spark SQL, it failed with a class-not-found error: java.lang.NoClassDefFoundError: com/netease/arctic/shade/org/apache/spark/sql/catalyst/analysis/ResolveProcedures
Exception in thread "main" java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: com/netease/arctic/shade/org/apache/spark/sql/catalyst/analysis/ResolveProcedures
at com.netease.arctic.spark.ArcticSparkExtensions.apply(ArcticSparkExtensions.scala:51)
at com.netease.arctic.spark.ArcticSparkExtensions.apply(ArcticSparkExtensions.scala:31)
at org.apache.spark.sql.SparkSession$.$anonfun$applyExtensions$1(SparkSession.scala:1197)
at org.apache.spark.sql.SparkSession$.$anonfun$applyExtensions$1$adapted(SparkSession.scala:1192)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$applyExtensions(SparkSession.scala:1192)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:956)
at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:54)
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:328)
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:160)
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NoClassDefFoundError: com/netease/arctic/shade/org/apache/spark/sql/catalyst/analysis/ResolveProcedures
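Since the missing class is a shaded (relocated) one, a quick first check is whether it was packaged into the runtime jar at all: a jar is a zip archive, so its entries can be listed directly. A minimal sketch; the jar name and path on a real install are assumptions, and a stand-in archive is built here so the snippet runs anywhere:

```python
import zipfile

# Slash-form name of the shaded class from the stack trace above.
SHADED = "com/netease/arctic/shade/org/apache/spark/sql/catalyst/analysis/ResolveProcedures"

def has_class(jar_path: str, class_name: str) -> bool:
    """Return True if the slash-form class name is packaged in the jar."""
    with zipfile.ZipFile(jar_path) as jar:
        return class_name + ".class" in jar.namelist()

# Stand-in archive for demonstration; on a real install you would instead
# point has_class at something like spark/jars/amoro-spark-3.2-runtime-0.5.0.jar
# (hypothetical path).
with zipfile.ZipFile("demo.jar", "w") as jar:
    jar.writestr(SHADED + ".class", b"")

print(has_class("demo.jar", SHADED))  # True -> the class is packaged
```

If the check returns False against the actual runtime jar, the shading/relocation step did not include the class, which would explain the NoClassDefFoundError regardless of how Spark is configured.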
Anything else
No response
Are you willing to submit a PR?
Yes I am willing to submit a PR!
Code of Conduct
I agree to follow this project's Code of Conduct
Affects Versions
0.5.0
What engines are you seeing the problem on?
Spark
How to reproduce
Is there another dependency package I still need to add, or does the package from https://amoro.netease.com/download/ need to be recompiled?
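For reference while triaging: the usual way to wire the runtime jar in (rather than recompiling) is to put it on Spark's classpath and enable the extension. A minimal sketch; the extensions class name is taken from the stack trace above, while the jar path and file name are assumptions:

```
# spark-defaults.conf (sketch)
spark.sql.extensions  com.netease.arctic.spark.ArcticSparkExtensions
spark.jars            /path/to/amoro-spark-3.2-runtime-0.5.0.jar
```

Note that if the shaded class is genuinely absent from the jar itself, no configuration change will help and the jar would indeed need to be rebuilt.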
Relevant log output
See the stack trace above.