reduce inference and optimization time (by turning it off) #104

Merged: 1 commit merged into carlobaldassi:master from kc/inf on Mar 11, 2021

Conversation

KristofferC
Contributor

Benchmarking the script in https://carlobaldassi.github.io/ArgParse.jl/stable/#Quick-overview-and-a-simple-example-1:

Master:

❯ hyperfine --warmup=1 'julia --project script.jl foo'
Benchmark #1: julia --project script.jl foo
  Time (mean ± σ):      3.166 s ±  0.064 s    [User: 3.386 s, System: 0.287 s]
  Range (min … max):    3.075 s …  3.238 s    10 runs

PR:

❯ hyperfine --warmup=1 'julia --project script.jl foo'
Benchmark #1: julia --project script.jl foo
  Time (mean ± σ):      1.271 s ±  0.016 s    [User: 1.524 s, System: 0.264 s]
  Range (min … max):    1.253 s …  1.294 s    10 runs
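
For context, the speedup comes from the kind of module-level compiler switch the PR title describes. A minimal sketch of that approach (illustrative only, not the actual diff of this PR; the MyPackage module name is made up, and Base.Experimental.@compiler_options requires a reasonably recent Julia version):

module MyPackage

# Illustration: skip inference and the optimizer for this module's own code,
# trading some runtime performance for much faster loading and first call.
# The isdefined guard keeps the package loadable on Julia versions that do
# not have this experimental macro.
if isdefined(Base, :Experimental) &&
   isdefined(Base.Experimental, Symbol("@compiler_options"))
    @eval Base.Experimental.@compiler_options compile=min optimize=0 infer=false
end

# ... package code ...

end

For a package like ArgParse, whose code typically runs once per process, giving up optimized native code in exchange for lower latency is usually a good trade.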

@carlobaldassi carlobaldassi merged commit 5ccf782 into carlobaldassi:master Mar 11, 2021
@carlobaldassi
Owner

👍 thanks

@KristofferC KristofferC deleted the kc/inf branch March 11, 2021 16:08
@KristofferC
Contributor Author

It would be nice to have a new release whenever it's suitable.

@oscardssmith

Is there any other low-hanging fruit for speeding this up (better precompilation or something)?

@KristofferC
Contributor Author

You could experiment with SnoopCompile.jl.
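
For example, a rough SnoopCompile session might look like the sketch below (hypothetical usage, assuming SnoopCompile 2.x and a Julia version that supports @snoopi_deep; the argument table mirrors the quick-overview example):

using SnoopCompileCore, ArgParse

s = ArgParseSettings()
@add_arg_table! s begin
    "arg1"
        help = "a positional argument"
end

# Record inference timing for a representative call (run in a fresh session
# so nothing is compiled yet).
tinf = @snoopi_deep parse_args(["foo"], s)

using SnoopCompile
ttot, pcs = SnoopCompile.parcel(tinf)         # group inference results by package
SnoopCompile.write("/tmp/precompiles", pcs)   # write precompile(...) files to review

The generated precompile statements can then be reviewed, trimmed, and added to the package to shave more time off the first call.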

@carlobaldassi
Owner

It's released now.
