Picocli extension - consider generating command metadata during build #16355

Open
mkouba opened this issue Apr 8, 2021 · 22 comments
Labels: area/cli (Related to quarkus cli, not maven/gradle/etc.), good first issue (Good for newcomers), kind/enhancement (New feature or request)

Comments

mkouba (Contributor) commented Apr 8, 2021

ATM picocli builds the CommandSpec metadata using reflection every time the application is started. This is OK in most situations but may represent considerable overhead for applications that finish instantly, e.g. some Quarkus CLI commands. It might make sense to analyze the commands during build and either generate some "builder" classes or record the metadata (e.g. using AnnotationProxyBuildItem) and build the metadata in a recorder method.
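
For illustration, a minimal sketch of what such a generated "builder" class could look like, using picocli's programmatic CommandSpec API instead of reflection over the annotations. The command name, option, and class name below are made up and are not taken from the Quarkus CLI:

```java
import picocli.CommandLine;
import picocli.CommandLine.Model.CommandSpec;
import picocli.CommandLine.Model.OptionSpec;

// Hypothetical output of build-time generation: the metadata picocli normally
// discovers via reflection over @Command/@Option is hard-coded instead.
public class GreetCommandSpecBuilder {

    public static CommandSpec build() {
        CommandSpec spec = CommandSpec.create()
                .name("greet")                    // made-up command name
                .mixinStandardHelpOptions(true);  // adds --help / --version
        spec.usageMessage().description("Prints a greeting");
        spec.addOption(OptionSpec.builder("-n", "--name")
                .paramLabel("NAME")
                .type(String.class)
                .description("Who to greet")
                .build());
        return spec;
    }

    public static void main(String[] args) {
        // The pre-built spec is handed straight to CommandLine,
        // so no annotation processing happens at startup.
        CommandLine cmd = new CommandLine(build());
        cmd.setExecutionStrategy(parseResult -> {
            System.out.println("Hello " + parseResult.matchedOptionValue("-n", "world"));
            return 0;
        });
        System.exit(cmd.execute(args));
    }
}
```

A build step could emit one such class per discovered command and wire the resulting specs together as subcommands.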

mkouba added the kind/enhancement label on Apr 8, 2021
quarkus-bot added the area/cli label on Apr 8, 2021
quarkus-bot (bot) commented Apr 8, 2021

/cc @ebullient, @gsmet, @maxandersen

mkouba (Contributor, Author) commented Apr 8, 2021

WDYT @mgorniew?

gsmet (Member) commented Apr 8, 2021

Yeah I had the same thought the other day. I'm just unsure how much time it takes. Given displaying the help is slow enough even in native, I suppose it takes quite some time.

mkouba (Contributor, Author) commented Apr 8, 2021

> I'm just unsure how much time it takes.

My super unreliable measurement showed that it could be hundreds of ms (the Quarkus CLI has 1 top command + 10 subcommand classes).
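
For reference, a similarly crude way to measure that overhead on the JVM; TopCommand is a placeholder for the real @Command-annotated entry class, and the first-run number includes class loading, so treat it as indicative only:

```java
import picocli.CommandLine;

public class ModelBuildTiming {
    public static void main(String[] args) {
        long start = System.nanoTime();
        // Constructing CommandLine is what triggers the reflective scan of
        // TopCommand (a placeholder) and all of its registered subcommands.
        CommandLine cmd = new CommandLine(new TopCommand());
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println("Built model for " + cmd.getSubcommands().size()
                + " subcommands in " + elapsedMs + " ms");
    }
}
```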

maxandersen (Member)

> Given displaying the help is slow enough even in native, I suppose it takes quite some time.

A big part of this one is probably due to the console width calculation, which can be expensive even in native.

That said, I do think picocli's annotation parsing is quite slow - it's significant in jbang too.

mkouba (Contributor, Author) commented Apr 8, 2021

> Given displaying the help is slow enough even in native, I suppose it takes quite some time.

> A big part of this one is probably due to the console width calculation, which can be expensive even in native.

This part should be solved in #16320.

remkop commented Apr 8, 2021

I’ve been wanting to do something like this: remkop/picocli#539 but never got around to it.

remkop commented Apr 8, 2021

> Yeah I had the same thought the other day. I'm just unsure how much time it takes. Given displaying the help is slow enough even in native, I suppose it takes quite some time.

Can you put some numbers on that? How long does displaying the help take in native and in JIT mode?

The reason I'm asking is to clarify what exactly is the problem that this ticket is trying to solve:

  • What are the current execution times for these commands?
  • What are the target execution times for these commands?

quintesse (Contributor) commented May 27, 2021

@remkop I don't have the exact numbers at hand (they're somewhere in a chat), but when I was investigating the slow startup of Jbang I found that around 0.6s of a 1.2s startup was due to Picocli setup. For a tool that's supposed to enable Java as a scripting language, adding more than a second to startup will make people think twice before dropping Perl/Node/Bash in favor of Java. As a comparison, java and javac start up in about 0.1s and 0.3s respectively, so ideally we don't want to add too much on top of that. So remkop/picocli#539 is very interesting for us as well.

ebullient (Member) commented May 27, 2021

Now that I've done the crazy things and restructured the entire CLI to be a little cleaner.. I could also just say eff the annotations and code it directly.

Annotations are cool and all, but we don't actually have to use them..
Is there a real reason to use them? Some secret benefit I'm not seeing?
(it would kind of re-uglify some things.. but if that is the tradeoff that needs to be made.. that's the tradeoff that needs to be made)

Need to now revisit native mode, too

quintesse (Contributor)

@ebullient I was considering the same for Jbang, and my worry is that it will become less maintainable (somewhat) and less accessible to people who are not familiar with the code.

ebullient (Member)

Annotations are magic .. and that can be less maintainable, too. Depends on how cleanly you can do it, I suppose. I don't necessarily find annotations any more readable (you have to know how they go together.. )

I have a few other things to chew through.. but I might try it and see how bad it gets.. I mean, I've already written the CLI once.. so it's all fresh in my head.. :-P

maxandersen (Member)

Annotations make sense to keep things declarative and let Quarkus be smarter about it.

Even doing it programmatically will have an overhead compared to having the Quarkus picocli extension do it during the build and just load a static model at startup.

maxandersen (Member)

@quintesse consider converting quintesse/jbang@15e7e85 to a Quarkus extension?

lppedd commented Sep 13, 2022

Is this still valid?
I've built native executables on Windows, and I've noticed the Micronaut starter project measures at around 200-250 ms for -h, while the Quarkus one comes in at around 350 ms or more. Is this to be expected?
It's not a big deal, but it's noticeable.

maxandersen (Member)

@lppedd Is Micronaut using picocli or something else? I haven't made the comparison.

But yes, with picocli I'm convinced you'd be able to shave off 100-200 ms of startup if the picocli extension did its annotation model scanning at build time rather than at runtime.

quintesse (Contributor) commented Sep 27, 2022

@maxandersen but the current picocli extension already does that, right?
AFAICT it's creating the CommandLine object that's expensive, because when I created it programmatically (i.e. without using annotations) it was hardly any faster.

That might be something that's easy enough to do, but I've looked at the Quarkus extension documentation and it's far from trivial, so I think it's not something I'll be able to pick up in 5 minutes. So if somebody with more experience writing extensions could take a look at that, that would be great.

It's basically making sure that new CommandLine(instance_of_TopCommand_marked_class_here) gets created and cached at build time.

Edit: btw, doing that probably also means that it's no longer necessary to cache the annotations because they are no longer required once the CommandLine object has been created.
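
For whoever picks this up, a rough sketch of how that could look with Quarkus' recorder mechanism, assuming the usual runtime/deployment module split. All class names are made up, and the exact build-item wiring is an assumption rather than how the current extension works:

```java
import io.quarkus.arc.deployment.SyntheticBeanBuildItem;
import io.quarkus.deployment.annotations.BuildStep;
import io.quarkus.deployment.annotations.ExecutionTime;
import io.quarkus.deployment.annotations.Record;
import io.quarkus.runtime.RuntimeValue;
import io.quarkus.runtime.annotations.Recorder;
import javax.inject.Singleton; // jakarta.inject on newer Quarkus versions
import picocli.CommandLine;

// Runtime module: the recorded invocation runs during static init, which GraalVM
// executes at image build time, so the fully built CommandLine lands in the image heap.
@Recorder
class PicocliStaticInitRecorder {            // illustrative name
    public RuntimeValue<CommandLine> create() {
        // TopCommand stands in for the application's @TopCommand-marked class.
        return new RuntimeValue<>(new CommandLine(new TopCommand()));
    }
}

// Deployment module: expose the pre-built CommandLine as an injectable singleton bean.
class PicocliStaticInitProcessor {           // illustrative name
    @BuildStep
    @Record(ExecutionTime.STATIC_INIT)
    SyntheticBeanBuildItem commandLine(PicocliStaticInitRecorder recorder) {
        return SyntheticBeanBuildItem.configure(CommandLine.class)
                .scope(Singleton.class)
                .runtimeValue(recorder.create())
                .done();
    }
}
```

In JVM mode static init still runs at startup, so the win would mainly be in native images, where static init is executed at image build time.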

ebullient (Member)

That only works if that marked class doesn't use any beans. Curious if using a static init to create/initialize the command instance programmatically would make a difference (to prove your point), as static init is baked into the native image, IIRC.
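
Outside Quarkus (e.g. in jbang itself) that experiment could look roughly like this; CommandLineHolder and TopCommand are made-up names, and the holder class would have to be listed in GraalVM's --initialize-at-build-time for the initializer to run at image build time:

```java
import picocli.CommandLine;

// With `native-image --initialize-at-build-time=CommandLineHolder ...` the class
// initializer below runs at image build time, so the reflectively built picocli
// model is stored in the image heap instead of being rebuilt on every startup.
public class CommandLineHolder {

    // TopCommand stands in for the real @Command-annotated entry class.
    public static final CommandLine COMMAND_LINE = new CommandLine(new TopCommand());

    private CommandLineHolder() {
    }
}
```

The application's main method would then just call CommandLineHolder.COMMAND_LINE.execute(args).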

quintesse (Contributor)

@ebullient that's more or less what I did here: https://github.com/quintesse/jbang/blob/15e7e85a4932722353a95483787271da083d50a6/annotation/src/main/java/MyProcessor.java - it's a build processor that takes the picocli annotations at build time and generates code that does the same thing.

Unfortunately there was (is?) no API in picocli that I could hook into to turn annotations into their corresponding models, so I had to do it the hard way (which means there's only partial support for picocli's features, basically just enough to make it work with our project).

But perhaps someone can use it to show if doing it like that makes a difference or not.

mkouba (Contributor, Author) commented Sep 30, 2022

@quintesse So if I understand it correctly, the original picocli annotation processor does not generate any class but merely validates the model and creates some config files for GraalVM native image? Your annotation processor, on the other hand, generates a class that builds a CommandSpec that could be passed to the picocli.CommandLine(Object) constructor. Am I correct?

When I filed this issue I was thinking more about a Quarkus-specific way of generating this stuff, because annotation processors do not quite fit the Quarkus build process, but the goal is essentially the same, i.e. getting rid of reflective discovery at runtime.

I'd like to try it out but I don't know when I'll get to look at it...

quintesse (Contributor) commented Sep 30, 2022

> Am I correct?

Correct. I created the processor to see if I could create the model at compile time, and because we want our application to be as small as possible I didn't want to depend on any framework; that's why it was not written with Quarkus in mind. I abandoned the PoC when I found no noticeable improvement in startup time.

But the experiment's results at least suggest it's not the annotation parsing that's slowing things down but constructing/initializing the model.

So that's why it was suggested that perhaps Quarkus' way of statically initializing objects could be useful.

(I'm probably repeating myself too much, but I just want to make sure things are clear heh)

mkouba (Contributor, Author) commented Oct 4, 2022

Thanks @quintesse I'll try to do some quick tests (when I have spare cycles ;-).

Projects
Status: Todo
Development

No branches or pull requests

7 participants