Error: Could not find or load main class scalapb.ScalaPbCodeGenerator #1173
I get this error when trying to execute scalapbc/bin/protoc-gen-scala.
Hi @Parikshit-DartCoder , I am confused about this problem report: in the first message you give your sbt configuration, and in the second message you state you are getting an error from a command line tool that is completely unrelated to your sbt configuration. To help further, I have two follow-up questions:
1. Your sbt config includes a plugin (sbt-protoc) that performs source code generation. And if protoc.sbt is in the project directory, this config should work. Can you elaborate why you are turning to the command line tool given that you seem to have an sbt setup? Is there any problem using sbt to generate code?
2. If you need the command-line tool (protoc-gen-scala) to work, please provide instructions to reproduce the problem: which operating system, JVM, which link you downloaded protoc-gen-scala from, and so on.
Thanks!
Hi @thesamet Thanks for the reply.
> Your sbt config includes a plugin (sbt-protoc) that performs source code generation. And if protoc.sbt is in the project directory, this config should work. Can you elaborate why you are turning to the command line tool given that you seem to have an sbt setup? Is there any problem using sbt to generate code?
Answer: protoc.sbt is in the project directory. We use the script below to generate the Scala code. I ran the build commands below:
###########SCRIPT############################
> 2. If you need the command-line tool (protoc-gen-scala) to work, please provide instructions to reproduce the problem: which operating system, JVM, which link you downloaded protoc-gen-scala from, and so on.
Config details:
os: 4.14.232-176.381.amzn2.x86_64
Plugins enabled:
Thanks for the info. It looks like the issue with the script is a duplicate of #1114. Please update your script to download protoc-gen-scala 0.11.3 and try again. It looks like the script also unnecessarily downloads and extracts scalapbc, though it only relies on protoc-gen-scala; the download of scalapbc can be safely removed. Finally, for the failure in sbt you only provided this: "fails with message scala class not found as its not generated". Can you provide the full output from sbt? In any case, I don't understand why you need both source code generation in sbt and the script to work - there's an overlap. My suggestion would be to avoid the script and focus on getting the source code generation to work within sbt - this is the recommended approach.
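For reference, a minimal sbt-only setup matching the plugin versions discussed in this thread looks like the following (the generated sources land under sourceManaged, so no external script is needed):

// project/protoc.sbt
addSbtPlugin("com.thesamet" % "sbt-protoc" % "1.0.2")
libraryDependencies += "com.thesamet.scalapb" %% "compilerplugin" % "0.11.0"

// build.sbt
Compile / PB.targets := Seq(
  scalapb.gen() -> (Compile / sourceManaged).value / "scalapb"
)

Running sbt compile then invokes protoc with the ScalaPB code generator and compiles the generated classes in one step.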
@thesamet Sure, I will try the steps you suggested and update the thread here. Also, the error message I got is from running the script, not sbt. When I execute sbt compile, I get an error that the Scala classes are not found. My current flow is:
sbt-protoc needs to know where your proto files are so it can generate code for them. The default is src/main/protobuf. Also, it seems like you will need to add a Compile / PB.protoSources setting that points at your proto directory; see the sketch below.
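A minimal sketch of that setting, assuming the protos live in a non-default directory (the path here is illustrative):

// build.sbt: point sbt-protoc at a non-default proto directory
Compile / PB.protoSources := Seq(
  baseDirectory.value / "protos"  // default would be src/main/protobuf
)

Note that the right-hand side of / is resolved relative to the left-hand side, which matters for the mistake discussed a few messages below.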
@thesamet OK, let me try all the steps you mentioned and I will update you with the results.
@thesamet I changed my build.sbt to the below and executed sbt compile; the Scala classes are still not generated:
name := "protobuf_demo" version := "0.1" scalaVersion := "2.12.10" Compile / PB.protoSources := Seq(sourceDirectory.value / "/home/ec2-user/paysihistory/src/main/protobuf/github.com/gogo/protobuf/gogoproto") PB.targets in Compile := Seq( |
Directory structure: |
The Compile / PB.protoSources setting looks incorrect, because we are appending an absolute path to sourceDirectory.value; pass the absolute path directly instead.
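A sketch of the corrected setting, using the absolute path from the build.sbt above (this reconstructs the point the truncated reply appears to be making, not necessarily its exact wording):

// build.sbt: give an absolute proto path directly via file(...),
// instead of appending it to sourceDirectory.value
Compile / PB.protoSources := Seq(
  file("/home/ec2-user/paysihistory/src/main/protobuf/github.com/gogo/protobuf/gogoproto")
)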
@thesamet updated build.sbt: name := "protobuf_demo" ... ERROR:
FYI: |
Hi @thesamet I generated the Scala classes from the protobufs, but now when I try to use them to package I get the error below; let me know if there is anything I missed: "Unable to find encoder for type pb.PaysiHistory. An implicit Encoder[pb.PaysiHistory] is needed to store pb.PaysiHistory instances in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._ Support for serializing other types will be added in future releases."
@Parikshit-DartCoder , have you read the section marked "IMPORTANT" in the docs about which imports you should and shouldn't have? If so, what have you tried, and can you provide a reproducible example of the problem?
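For context, the sparksql-scalapb documentation recommends importing scalapb.spark.Implicits._ to bring encoders for generated messages into scope, rather than relying on spark.implicits._ for them. A minimal sketch, assuming pb.PaysiHistory is the generated class from the error message and that all of its fields have defaults:

import org.apache.spark.sql.SparkSession
import scalapb.spark.Implicits._  // provides Encoder instances for ScalaPB-generated messages

object EncoderDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("scalapb-encoder-demo").master("local[*]").getOrCreate()
    // With the import above, createDataset resolves an implicit Encoder[pb.PaysiHistory]:
    val ds = spark.createDataset(Seq(pb.PaysiHistory()))
    ds.show()
    spark.stop()
  }
}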
Hi @thesamet I have fixed this in my code; it was due to the way I was converting the RDD to a DataFrame, which is not supported in Spark 3.0.1 for protobuf-based serialization. I have the jar created now using sbt package. The earlier issue I was facing was because the gogo.proto git repo contains many protos I don't need, so I extracted just the part I want and copied those files into my proto sources dir.
It sounds like the original issue (reported in the first message) has been resolved through a version upgrade and corrected sbt settings. Closing this issue now; feel free to open new tickets if there are other issues. For general questions or guidance, please use Gitter or StackOverflow.
Build.sbt
name := "consumers"
version := "0.1"
organization in ThisBuild := "com.abc"
scalaVersion in ThisBuild := "2.12.10"
val sparkV = "3.0.1"
lazy val sparkDependencies = Seq(
"org.apache.spark" %% "spark-core" % sparkV,
"org.apache.spark" %% "spark-streaming" % sparkV,
"org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkV,
"org.apache.spark" %% "spark-sql-kafka-0-10" % sparkV,
"org.apache.spark" %% "spark-sql" % sparkV,
"org.apache.spark" %% "spark-mllib" % sparkV,
"com.thesamet.scalapb" %% "sparksql-scalapb" % "0.11.0"
)
Compile / PB.targets := Seq(
scalapb.gen() -> (Compile / sourceManaged).value / "scalapb"
)
lazy val assemblySettings = Seq(
assemblyJarName in assembly := name.value + ".jar",
assemblyMergeStrategy in assembly := {
case PathList("META-INF", xs @ _*) => MergeStrategy.discard
case x => MergeStrategy.first
}
)
lazy val compilerOptions = Seq(
"-unchecked",
"-feature",
"-language:existentials",
"-language:higherKinds",
"-language:implicitConversions",
"-language:postfixOps",
"-deprecation",
"-encoding",
"utf8"
)
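One note on the listing above: sparkDependencies and compilerOptions are defined but the lines applying them are not shown, so presumably something like the following was cut from the paste (an assumption; the actual file may differ):

libraryDependencies ++= sparkDependencies
scalacOptions ++= compilerOptions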
Protoc.sbt
addSbtPlugin("com.thesamet" % "sbt-protoc" % "1.0.2")
libraryDependencies += "com.thesamet.scalapb" %% "compilerplugin" % "0.11.0"