WIP: Generate scala code from protobuf using @com_google_protobuf//:protoc #705
Conversation
…e scala_proto_gen produces only sources
…etated to java_conversions only
BTW I'm going to test this on our codebase. I should have done it before opening the PR, but better late than never, I guess.
Looks like we have two racing proto rewrites. Can you and @ianoc-stripe look at each other’s changes and comment on what aspects from each should be used?
See #700
I’m starting to wonder if we should have two separate files for the two different implementations, if there are trade-offs here.
Unless I am wrong, this is trading performance for ease of integration with protoc. For us, and I think for many others, we don’t want to trade performance for ease of configuration; we’d probably just want to do the extra config work.
inputs = [ctx.executable._protoc, ctx.executable.plugin] + descriptors,
outputs = [srcdotjar],
arguments = [
"--plugin=protoc-gen-scala=" + ctx.executable.plugin.path, |
This is going to prevent using workers, right? So you have to spin up a new JVM for each file, right?
Yes, it would prevent using workers. I thought this is what is happening now as well, but now I think my assumptions might be wrong, as protoc-bridge is doing some tricks to reuse the JVM. I'll dig more and update.
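For background, whether Bazel can route an action through a persistent worker depends on how the action itself is declared, not only on the tool. A rough Starlark sketch of the two shapes follows; the attribute name `_scalapb_worker` and the helper names are hypothetical and not taken from this PR.

def _run_codegen_via_worker(ctx, output, args):
    # Worker-friendly shape: the executable is a long-lived JVM tool and the
    # arguments are passed through an @flagfile, so Bazel can keep the JVM
    # alive across actions.
    argfile = ctx.actions.declare_file(ctx.label.name + ".worker_args")
    ctx.actions.write(argfile, "\n".join(args))
    ctx.actions.run(
        executable = ctx.executable._scalapb_worker,  # hypothetical attribute
        inputs = [argfile],  # plus descriptors etc., elided in this sketch
        outputs = [output],
        arguments = ["@" + argfile.path],
        mnemonic = "ScalaProtoGen",
        execution_requirements = {"supports-workers": "1"},
    )

def _run_codegen_via_protoc(ctx, output, args):
    # Non-worker shape (the one discussed here): protoc itself is the
    # executable, so every action spawns a fresh protoc, which in turn spawns
    # the plugin process; nothing is kept alive between actions.
    ctx.actions.run(
        executable = ctx.executable._protoc,
        inputs = [ctx.executable.plugin],  # plus descriptors etc.
        outputs = [output],
        arguments = args,
        mnemonic = "ScalaProtoGen",
    )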
@johnynek I think @simuons needs to upload a diagram of our (rules_scala) current process map when using proto.
@ittaiz @johnynek I didn't mean to race/interfere with the aspects PR. I'm trying to solve a different issue here (i.e. use a single protoc). P.S. Thanks for your time and sorry for the delayed response.
@lberki @philwo May I ask if the java rules reuse workers for java code generation from proto, or does this just work natively without spinning up a JVM?
In scala_proto/scala_proto.bzl (the code under discussion):
+ return file.short_path
+
+def _scala_proto_gen_impl(ctx):
+ protos = [p for p in ctx.attr.deps if hasattr(p, "proto")] # because scalapb_proto_library passes JavaInfo as well
+ descriptors = depset([f for dep in protos for f in dep.proto.transitive_descriptor_sets]).to_list()
+ sources = depset([f for dep in protos for f in dep.proto.transitive_sources]).to_list()
+ roots = depset([f for dep in protos for f in dep.proto.transitive_proto_path]).to_list()
+ inputs = depset([_strip_root(f, roots) for f in _retained_protos(sources, ctx.attr.blacklisted_protos)]).to_list()
+
+ srcdotjar = ctx.actions.declare_file("_" + ctx.label.name + "_src.jar")
+
+ ctx.actions.run(
+ inputs = [ctx.executable._protoc, ctx.executable.plugin] + descriptors,
+ outputs = [srcdotjar],
+ arguments = [
+ "--plugin=protoc-gen-scala=" + ctx.executable.plugin.path,
Apparently my assumptions about worker usage when generating scala code were wrong. Code generation indeed happens in a worker, thanks to the trick protoc-bridge uses to reuse the JVM. Closing this in favour of a less aggressive attempt to reuse protoc from @com_google_protobuf.
@ittaiz: iff something goes through …
Resolves #690

Introduces a new rule, `scala_proto_gen`, which uses `@com_google_protobuf//:protoc`. This means a single version of `protoc` for `proto_library` and `scalapb_proto_library` (instead of one that comes from `protoc-jar`).

`scala_proto_srcjar` was left intact so it wouldn't break existing users. My suggestion is to deprecate it and remove it in a subsequent PR.

`scala_proto_gen` generates code in the same way as `scala_proto_srcjar` does, i.e. all direct and transitive deps are generated into a single srcjar (no aspects in this PR).

`scalapb_proto_library` is implemented on top of `scala_proto_gen`. One thing I don't like, though, is that I had to create a "private" rule to support JavaInfo in `scalapb_proto_library` deps (maybe there is a better way to achieve that). I believe `scalapb_proto_library` deps should accept only `proto_library`, as `java_proto_library` does.
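To make the intended wiring concrete, a hypothetical BUILD snippet using these rules might look like the sketch below; the target names are illustrative, and the load label simply points at rules_scala's existing scala_proto.bzl, so treat it as an assumption rather than as part of this PR.

load("@io_bazel_rules_scala//scala_proto:scala_proto.bzl", "scalapb_proto_library")

proto_library(
    name = "events_proto",        # hypothetical proto_library target
    srcs = ["events.proto"],
)

scalapb_proto_library(
    name = "events_scala_proto",
    deps = [":events_proto"],     # proto_library deps only, as argued above
)

With this change, both the descriptor generation for `proto_library` and the ScalaPB code generation behind `scalapb_proto_library` would be driven by the same `@com_google_protobuf//:protoc` binary.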
`--proto_path` flags were replaced with `--descriptor_set_in`; this removes the burden of constructing `-I` aliases.

I haven't done benchmarks other than `bazel clean` followed by `bazel test //test/proto/...`. I got the same times on this PR and on master, so I don't think this PR has an impact on performance. In the current implementation the protoc process is launched from a worker, which in turn calls the executable plugin, and I think workers are not used in the actual code generation phase (I might be wrong here).