
scalapb_proto_library doesn't support custom TypeMappers anymore? #751

Open
sholokhov opened this issue May 14, 2019 · 19 comments
@sholokhov

Hi!

After the switch to aspect-based protobuf compilation (#700), it seems it's no longer possible to use a custom TypeMapper from ScalaPB. At first glance, the reason is that Bazel tries to generate Scala code from the proto_library (the %name-fast rule), but there is no way to specify Scala libraries as dependencies of a proto_library (at least I haven't found one).

It used to work because Bazel previously used scalapb_proto_library with its deps to generate the final Scala code, and there you could specify additional custom classes.

I've also prepared a small example which reproduces this issue.

Bazel version - 0.25.2
rules_scala - b2e12827240fc488bbe5ebbbbcbc7a4f12ca02f2

@sholokhov sholokhov changed the title from "scalapb_proto_library doesn't support scala_library as a dependency anymore?" to "scalapb_proto_library doesn't support custom TypeMappers anymore?" on May 14, 2019
@johnynek
Member

cc @ianoc-stripe who may have some ideas.

Would you be willing to make a PR with a unit test that you would like to pass, so we can see how to get it to pass? My guess is that we will need to add the ability to add extra dependencies to the scalapb toolchain, since we can't customize the individual aspect-generated targets in any way that I know of.

@ianoc
Contributor

ianoc commented May 14, 2019

@sholokhov you can add the dependencies to the scalapb toolchain, which lets you add more deps. They get added to all of the scalapb targets in the repo in one swoop.

You can define
https://github.com/bazelbuild/rules_scala/blob/master/scala_proto/scala_proto_toolchain.bzl#L46
as part of your toolchain to contain the reference to your Scala libraries.
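Based on the discussion later in this thread, such a toolchain definition might look roughly like the sketch below. This is hedged: the attribute set has changed across rules_scala versions, the load path is taken from the link above, and the library label is purely illustrative.

```starlark
load(
    "@io_bazel_rules_scala//scala_proto:scala_proto_toolchain.bzl",
    "scala_proto_toolchain",
)

scala_proto_toolchain(
    name = "scalapb_toolchain_with_mappers",
    with_grpc = True,
    # Per comments further down, this REPLACES the default list rather
    # than appending to it, so the rules' default scalapb compile deps
    # must be repeated here alongside the library defining the mappers.
    implicit_compile_deps = [
        "//common/protomappers:mappers",  # illustrative label
    ],
)
```

The toolchain is repo-global, which is exactly the limitation discussed below: every scalapb target in the workspace picks up these deps.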

@sholokhov
Author

sholokhov commented May 15, 2019

@ianoc thanks for the help, but now it fails with the following error (during compilation of the well-known proto files):

sholokhov$ bazel build //scala-app:messages_scala_proto
INFO: Analyzed target //scala-app:messages_scala_proto (0 packages loaded, 0 targets configured).
INFO: Found 1 target...
ERROR: /private/var/tmp/_bazel_sholokhov/557977177af35e2d2806c45eff908f1d/external/com_google_protobuf/BUILD:248:2: scala @com_google_protobuf//:descriptor_proto-fast failed (Exit 1) scalac failed: error executing command bazel-out/host/bin/external/io_bazel_rules_scala/src/java/io/bazel/rulesscala/scalac/scalac @bazel-out/darwin-fastbuild/bin/external/com_google_protobuf/descriptor_proto-fast_scalac_worker_input

Use --sandbox_debug to see verbose messages from the sandbox
error: error while loading Object, Missing dependency 'object scala in compiler mirror', required by /modules/java.base/java/lang/Object.class
error: scala.reflect.internal.MissingRequirementError: object scala in compiler mirror not found.
        at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:24)
        at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:25)
        at scala.reflect.internal.Mirrors$RootsBase.$anonfun$getModuleOrClass$4(Mirrors.scala:61)
        at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
        at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:73)
        at scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:179)
        at scala.reflect.internal.Definitions$DefinitionsClass.ScalaPackage$lzycompute(Definitions.scala:196)
        at scala.reflect.internal.Definitions$DefinitionsClass.ScalaPackage(Definitions.scala:196)
        at scala.reflect.internal.Definitions$DefinitionsClass.ScalaPackageClass$lzycompute(Definitions.scala:197)
        at scala.reflect.internal.Definitions$DefinitionsClass.ScalaPackageClass(Definitions.scala:197)
        at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1464)
        at scala.tools.nsc.Global$Run.<init>(Global.scala:1194)
        at scala.tools.nsc.Driver.doCompile(Driver.scala:46)
        at scala.tools.nsc.MainClass.doCompile(Main.scala:32)
        at scala.tools.nsc.Driver.process(Driver.scala:67)
        at io.bazel.rulesscala.scalac.ScalacProcessor.compileScalaSources(ScalacProcessor.java:219)
        at io.bazel.rulesscala.scalac.ScalacProcessor.processRequest(ScalacProcessor.java:69)
        at io.bazel.rulesscala.worker.GenericWorker.run(GenericWorker.java:114)
        at io.bazel.rulesscala.scalac.ScalaCInvoker.main(ScalaCInvoker.java:41)
two errors found
java.lang.RuntimeException: Build failed
        at io.bazel.rulesscala.scalac.ScalacProcessor.compileScalaSources(ScalacProcessor.java:244)
        at io.bazel.rulesscala.scalac.ScalacProcessor.processRequest(ScalacProcessor.java:69)
        at io.bazel.rulesscala.worker.GenericWorker.run(GenericWorker.java:114)
        at io.bazel.rulesscala.scalac.ScalaCInvoker.main(ScalaCInvoker.java:41)
Target //scala-app:messages_scala_proto failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 1.926s, Critical Path: 1.70s
INFO: 0 processes.
FAILED: Build did NOT complete successfully

I've updated my repo with an example; can you take a look, please? Am I missing something?

P.S.: Correct me if I'm wrong, but if this scheme works, it means all those implicit deps will be propagated to all scalapb_proto targets. If I have service-specific mappers, I don't want to include them in other services. So there is no workaround for that so far?

@ianoc
Contributor

ianoc commented May 15, 2019

@sholokhov in the example (sorry, I should have been clearer on this), when you set the implicit compile deps you replace the existing ones.
ianoc/bazel-scalapb-example@42c3eec

would show you making the example work. This obviously isn't great here, since we are referencing something private in rules_scala for that default list. (Internally we replace all of these deps with targets fetched via bazel-deps, so we don't see this sort of challenge :/ )

We could make that a more public reference, such as DEFAULT_COMPILE_DEPS, or add another toolchain entry to append a set of the user's deps.

But to try to understand what you generally do here and whether this is a good fit: the case in that repo looks to be mapping types on a message object. Is that your main use case, or services?

Currently the aspect-based model aligns pretty closely with the Java proto rules' aspects; the main deviation right now is that we have combined Scala type generation with the gRPC service generation. The Java rules only use the aspect for the types, not the services. If there were a solid motivation that applied only to services, it might make sense for us to split the two up as well and not use the aspect for gRPC.

Another alternative: if we trust the user not to 'cross' paths with the aspect, and thus possibly generate two different sets of code for the same proto, we could bring back some of the proto -> src jar/library methods too.

@sholokhov
Author

Thanks for the help; it works for me now, although I don't really like having a reference to a private part of rules_scala. I think the idea of DEFAULT_COMPILE_DEPS is actually pretty good.

But to try understand what you generally do here/if this is a good fit. The case in that repo looks to be mapping types on a message object, is that your main use case or services?

We have a monorepo with all our services, which are mostly gRPC-based, and services with type mappers are the exception. The bad part is that we have one service that depends on the MongoDB driver, and we use a mapper to map some Mongo entities:

message Example {
    string _id = 1 [(scalapb.field).type = "org.bson.types.ObjectId"];
}
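For reference, the Scala side of such a field option is typically a small implicit TypeMapper like the sketch below. This is hedged: the object and value names are illustrative, the snippet assumes the scalapb-runtime and mongodb driver artifacts on the classpath, and ScalaPB resolves the implicit only from specific scopes (such as the custom type's companion object, or a package object designated via ScalaPB options), so placement matters in practice.

```scala
import org.bson.types.ObjectId
import scalapb.TypeMapper

// Converts between the proto wire type (String) and the custom field
// type (ObjectId). ScalaPB's generated message code summons this
// implicit at compile time, which is why the mapper (and the mongo
// driver) must be on the compile classpath of the generated target.
object Mappers {
  implicit val objectIdMapper: TypeMapper[String, ObjectId] =
    TypeMapper[String, ObjectId](s => new ObjectId(s))(_.toHexString)
}
```

This dependency of generated code on hand-written code is precisely what forces the mapper library into the toolchain under the aspect-based model.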

So now the only way to make it work is to include a dependency on MongoDB in the proto toolchain, which affects all our services, even the ones that don't need Mongo at all.

Another thing (the same problem, actually): if we have some service-specific types (like Gender in the example), we are also forced to include them in the toolchain, so they get propagated to all our grpc/proto services. It feels really messy :(

@ianoc
Contributor

ianoc commented May 16, 2019

This is super interesting to see/hear. We don't use the type converters ourselves today either, so we haven't run into this problem, though it does sound thorny. I don't believe Java protobuf supports this at all, so it doesn't come up there. Separating the gRPC bindings from the Scala generation itself doesn't sound like it would help here, since this is all on the message types.

There are a couple of shapes of solutions here I can think of offhand:

  1. Have a more manual proto -> scala_library rule around, like the old one.
    The drawback here is that you can have duplicate classes (which the old setup produced a lot of), which may not be the same depending on configuration, and you wouldn't get the nice aspect-based transitive wins around the classpath as easily.
  2. Add to the toolchain a Map[String, List[String]] where we set the strings in the values as dependencies when the Label we are evaluating matches the string key. (There isn't a Map[Label, List[Label]] attr type.)
    The main downside here is that this has to be configured centrally in the toolchain, but you would keep all the aspect-related wins too.
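Option 2 could be sketched with Bazel's string_list_dict attribute type. This is a purely hypothetical fragment: no such attribute exists in rules_scala, and all names are illustrative.

```starlark
# Hypothetical toolchain rule fragment for option 2 above.
scala_proto_toolchain = rule(
    implementation = _scala_proto_toolchain_impl,
    attrs = {
        # Keyed by the stringified label of a proto_library; the values
        # are extra compile-time deps the aspect would add when
        # generating code for that target. Strings stand in for labels
        # because Bazel has no Map[Label, List[Label]] attribute type.
        "per_target_deps": attr.string_list_dict(default = {}),
    },
)
```

The string keys would have to be matched against `str(target.label)` inside the aspect implementation, which is where the central-configuration downside shows up.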

@purkhusid

This is also blocking me from migrating our Scala codebase to Bazel. Is it impossible to add extra dependencies to aspect-based targets, @ianoc @ittaiz?

@ittaiz
Member

ittaiz commented Sep 19, 2020

@purkhusid Have you read the thread thoroughly?
@ianoc gave a way to add extra deps which isn't great but works, and suggested two more solutions which aren't implemented but might be, given that we understand the need and use cases.
cc @simuons, who is our (Wix) master of bazel+proto

@purkhusid

@ittaiz The workaround posted earlier in the thread is no longer viable. The implicit_compile_deps option is no longer there; it was removed in #763.

Of the two solutions that @ianoc proposed, I think 2 is better. Adding a bunch of dependencies to the toolchain would result in a lot of cache invalidation.

What I was wondering is whether it's possible to have our cake and eat it too, i.e. to keep the rule aspect-based and also add extra deps at the target level. I'm not experienced enough with aspects to know the answer, but it would be awesome if an aspect expert could chime in.

@ittaiz
Member

ittaiz commented Sep 19, 2020

Oh, in that case I'm fairly certain it's not an option.
#763? Are you sure? That PR wasn't merged.
Also, I saw that @liucijus had an opinion here.

@purkhusid

@ittaiz Sorry, wrong link. Here is the right one #801

@ittaiz
Member

ittaiz commented Sep 19, 2020

Right, that one I know.

@purkhusid

@ittaiz Do you use the aspect-based rules at Wix, or do you have something custom? If it's a custom rule, is it something you could share?

@ittaiz
Member

ittaiz commented Sep 25, 2020

We use the aspect-based rules. @simuons, can you share a bit more about what you're working on? I have a feeling it somewhat correlates with @purkhusid's needs.

@simuons
Collaborator

simuons commented Sep 25, 2020

Hi @purkhusid, currently at Wix we use the aspect-based rules, but right now we are moving to a combination of aspect-based rules for messages and custom rules for services.
We want to use aspects for messages because messages might include other messages from other protos, and we want to traverse that deps tree and generate all the code needed.
Services, on the other hand, do not refer to other services from other protos, so we do not need deps traversal there. Also, we have different flavours of gRPC code, so we use different generators. Basically, that custom rule takes proto_library targets as inputs for protoc and scala_library targets as inputs for scalac, and it generates code for the direct proto_library only (it doesn't traverse deps).
We do not use scalapb options, and I'm not sure that separating the rules would help in your case (since you need extra deps for messages/aspects). Theoretically you could use that "direct" rule for messages (and add the needed dependencies) as well, but you should be careful to avoid duplicate code.
It looks like the same separation of rules applies in Java too: java_proto_library generates code for messages (and is aspect-based), and for gRPC there is a separate rule: https://github.com/grpc/grpc-java/blob/master/java_grpc_library.bzl#L171
Just sharing what we have at Wix, as you asked.
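The "direct" (non-aspect) rule described above might be skeletonized roughly as follows. This is a hypothetical sketch, not the actual Wix rule; the rule name, attribute names, and implementation outline are all illustrative.

```starlark
# Hypothetical skeleton of a non-aspect scala grpc rule: protoc consumes
# the proto_library, scalac consumes the scala_library deps, and only
# the direct proto_library's sources are generated.
def _scala_grpc_library_impl(ctx):
    # 1. Run protoc with a grpc generator plugin over ctx.attr.proto.
    # 2. Compile the generated sources with scalac against ctx.attr.deps.
    # (Actions elided in this sketch.)
    pass

scala_grpc_library = rule(
    implementation = _scala_grpc_library_impl,
    attrs = {
        "proto": attr.label(providers = [ProtoInfo]),     # input for protoc
        "deps": attr.label_list(providers = [JavaInfo]),  # inputs for scalac
        "generator": attr.label(executable = True, cfg = "exec"),
    },
)
```

Because the rule compiles only the direct proto_library, the deps graph between generated libraries must be declared by hand, unlike in the aspect model.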

@purkhusid

@simuons Are you planning on adding this rule to rules_scala? Would it be something similar to what I'm trying to accomplish in #1114?

I'm also curious what you mean by "be careful to avoid duplicate code"?

@simuons
Collaborator

simuons commented Sep 25, 2020

@purkhusid

Would it be something similar to what I'm trying to accomplish here: #1114 ?

Yes, it looks very similar. In addition, I need the ability to change the generator and to take more than one proto_library as input (for legacy reasons; not really important).

Are you planning on adding this rule to rules_scala?

Probably yes, but I'm not sure in which form. I would contribute it as a whole rule if I can make it general enough, or it might be a set of functions that ease writing such a rule (if the end rule would contain too many Wix specifics), for example generate_code and compile_generated_code. You also saw a need to extract those.

I'm also curious what you mean by "be careful to avoid duplicate code"?

It's less of a problem in #1114, since you are generating code and compiling it for a single proto_library. It was a lot worse when we were generating code for all transitive protos. However, it might happen with your proposal as well. Imagine you have a proto file //proto/a and two packages that generate code for that proto, //pkg1 and //pkg2. Both would generate code in their own context, and you would end up with two jars with the same content, //pkg1/a.jar and //pkg2/a.jar. Aspects give the nice property that generation happens in the context of //proto, and both packages //pkg1 and //pkg2 point to the same jar. I hope I'm not confusing you.

@purkhusid

Ah, that clears things up. If I understand you correctly, this would not be a problem if the rule required all dependencies to be declared explicitly. E.g.

proto_library(
  name = "proto1",
  ...
)

proto_library(
  name = "proto2",
  deps = [":proto1"],
)

new_scala_proto_library(
  name = "scala_proto1",
  proto = ":proto1",
)

new_scala_proto_library(
  name = "scala_proto2",
  proto = ":proto2",
  deps = [":scala_proto1"],
)

I was trying to go that route with my rule in #1114, but I'm not experienced enough with Bazel/rules_scala to figure out how to exclude the transitive dependencies from the compilation.

I think this is the route that rules_go took, but there you do have gazelle to help with the BUILD file generation.

@virusdave
Contributor

Any progress on this? I ran into several of these problems recently as well. I'm also dealing with a large monorepo, so "add a new common dependency that applies to all scala_proto_library targets" is a terrible approach (and probably a nonstarter, sadly). Further, even if I did take this approach, I think it cannot apply to TypeMappers that map message types: to compile the TypeMapper implicit, you need access to the generated code for the message you're mapping, but to produce that, you now need to be able to compile the TypeMapper (which was added as a toolchain dep for scalapb). Ouch!
