Spark job running on GCP Dataproc: java.lang.ClassNotFoundException: org/eclipse/jetty/alpn/ALPN #2263
Labels
- `api: bigtable` — Issues related to the googleapis/java-bigtable-hbase API.
- `type: question` — Request for information or clarification. Not an issue.
I'm trying to run a Spark job on GCP Dataproc that writes to GCP Bigtable. The job works locally, both out of IntelliJ and via a local spark-submit. However, when I submit the job to Dataproc in either client or cluster mode, I hit what looks like a dependency conflict. I'm still fairly new to Java, Spark, and GCP, so I haven't been able to track down the cause of this JAR hell. In a similar issue (#2140), some documentation (https://github.com/grpc/grpc-java/blob/master/SECURITY.md#netty) was linked that recommends using grpc-netty-shaded when running under Spark, but that didn't seem to work, unless I'm not excluding enough of the unshaded dependencies for the shaded one to take over. Any help is appreciated.
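For reference, this is roughly how I understood the grpc-netty-shaded suggestion: exclude the unshaded `io.grpc:grpc-netty` that gets pulled in transitively and depend on `grpc-netty-shaded` instead. A minimal Maven sketch of that idea follows; the version numbers are placeholders, and the exact Bigtable artifact name depends on your build, so adjust both to match:

```xml
<!-- Sketch only: versions below are placeholders, not a tested combination. -->
<dependency>
  <groupId>com.google.cloud.bigtable</groupId>
  <artifactId>bigtable-hbase-2.x-hadoop</artifactId>
  <version>1.14.0</version>
  <exclusions>
    <!-- Drop the unshaded Netty transport so the shaded one is picked up. -->
    <exclusion>
      <groupId>io.grpc</groupId>
      <artifactId>grpc-netty</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>io.grpc</groupId>
  <artifactId>grpc-netty-shaded</artifactId>
  <version>1.32.1</version>
</dependency>
```

To check whether the exclusion actually took effect, `mvn dependency:tree -Dincludes=io.grpc` should show which gRPC transport artifacts end up on the classpath.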
Relevant dependencies:
Initial error:
Secondary error: