
Presence of @debug can cause massive performance hit #28147

Closed
quinnj opened this issue Jul 17, 2018 · 10 comments
Labels
logging The logging framework

Comments

@quinnj
Member

quinnj commented Jul 17, 2018

   _       _ _(_)_     |  A fresh approach to technical computing
  (_)     | (_) (_)    |  Documentation: https://docs.julialang.org
   _ _   _| |_  __ _   |  Type "?" for help, "]?" for Pkg help.
  | | | | | | |/ _` |  |
  | | |_| | | | (_| |  |  Version 0.7.0-beta2.26 (2018-07-17 04:35 UTC)
 _/ |\__'_|_|_|\__'_|  |  Commit 299300a409* (0 days old master)
|__/                   |  x86_64-apple-darwin16.7.0

julia> using BenchmarkTools, Parsers
[ Info: Recompiling stale cache file /Users/jacobquinn/.julia/compiled/v0.7/Parsers/71Hb.ji for module Parsers

julia> b = @benchmarkable Parsers.xparse(io, Int) setup=(io = IOBuffer("0"))
Benchmark(evals=1, seconds=5.0, samples=10000)

julia> run(b)
BenchmarkTools.Trial:
  memory estimate:  1.28 KiB
  allocs estimate:  26
  --------------
  minimum time:     2.179 μs (0.00% GC)
  median time:      2.306 μs (0.00% GC)
  mean time:        11.404 μs (7.18% GC)
  maximum time:     77.453 ms (0.00% GC)
  --------------
  samples:          10000
  evals/sample:     1

julia>
jq-mbp:julia jacobquinn$ julia
               _
   _       _ _(_)_     |  A fresh approach to technical computing
  (_)     | (_) (_)    |  Documentation: https://docs.julialang.org
   _ _   _| |_  __ _   |  Type "?" for help, "]?" for Pkg help.
  | | | | | | |/ _` |  |
  | | |_| | | | (_| |  |  Version 0.7.0-beta2.26 (2018-07-17 04:35 UTC)
 _/ |\__'_|_|_|\__'_|  |  Commit 299300a409* (0 days old master)
|__/                   |  x86_64-apple-darwin16.7.0

julia> using BenchmarkTools, Parsers
[ Info: Recompiling stale cache file /Users/jacobquinn/.julia/compiled/v0.7/Parsers/71Hb.ji for module Parsers

julia> b = @benchmarkable Parsers.xparse(io, Int) setup=(io = IOBuffer("0"))
Benchmark(evals=1, seconds=5.0, samples=10000)

julia> run(b)
BenchmarkTools.Trial:
  memory estimate:  32 bytes
  allocs estimate:  1
  --------------
  minimum time:     43.000 ns (0.00% GC)
  median time:      45.000 ns (0.00% GC)
  mean time:        2.906 μs (0.00% GC)
  maximum time:     28.578 ms (0.00% GC)
  --------------
  samples:          10000
  evals/sample:     1

julia>

Same code: the first run has a line like @debug "xparse Int" in the function; the second has that line commented out.

(On the bright side, the morning I dedicated to solving performance problems just got a lot easier: comment out all the debug statements!)

@KristofferC
Member

Can you make an example that doesn't include Parsers?

@quinnj
Member Author

quinnj commented Jul 17, 2018

Sure

julia> function xparse(str)
           Base.parse(Int, str)
        end
xparse (generic function with 1 method)

julia> function xparse2(str)
           @debug "xparse"
           Base.parse(Int, str)
        end
xparse2 (generic function with 1 method)

julia> @btime xparse("0")
  38.532 ns (0 allocations: 0 bytes)
0

julia> @btime xparse2("0")
  1.133 μs (13 allocations: 624 bytes)
0

@KristofferC
Member

KristofferC commented Jul 17, 2018

The four lines from the expanded macro still run even when nothing is printed. Those are presumably the cause of the slowdown.

julia/base/logging.jl

Lines 298 to 301 in 299300a

group = $group
_module = $_module
logger = current_logger_for_env(std_level, group, _module)
if !(logger === nothing)
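
As an illustration of the point above (a sketch, not part of the original discussion): expanding the macro in a session shows that a runtime logger lookup is generated at every call site, whether or not the message is ultimately printed.

```julia
using Logging

# @macroexpand reveals the code generated for each @debug call site.
# The expansion contains a runtime call to current_logger_for_env,
# so logger lookup runs on every invocation, printed or not.
expr = @macroexpand @debug "hello"
println(expr)
```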

Can do Logging.disable_logging(Logging.Debug) to get it fast again

julia> @btime xparse("0")
  42.516 ns (0 allocations: 0 bytes)
0

julia> @btime xparse2("0")
  49.323 ns (0 allocations: 0 bytes)
0
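
A self-contained sketch of that workaround (function name `f` is illustrative):

```julia
using Logging

# A function containing a @debug statement, as in the report above.
f(x) = (@debug "entering f"; x + 1)

# Globally reject Debug-level messages (and below) before any logger
# lookup happens, restoring near-original performance.
Logging.disable_logging(Logging.Debug)

f(1)  # the @debug line is now effectively free
```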

@maleadt
Member

maleadt commented Jul 17, 2018

@vtjnash #26265

@JeffBezanson
Member

Why is there a state where the debug log level is enabled (to the extent the fast-reject doesn't happen) but still nothing is printed? Are there separate global and local states and multiple loggers present by default?
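
For context (a sketch of the documented Logging API, not the exact internals): there is indeed both a global logger and an optional task-local one, so the front end cannot reject a message from a single global minimum level alone.

```julia
using Logging

# The global logger sets one minimum level…
global_logger(ConsoleLogger(stderr, Logging.Info))

# …but with_logger installs a task-local logger that can enable a
# lower level for just this dynamic scope, so @debug must consult the
# current task's logger (and any env override) before rejecting.
with_logger(ConsoleLogger(stderr, Logging.Debug)) do
    @debug "shown: the task-local logger accepts Debug"
end

@debug "not shown: the global logger's minimum level is Info"
```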

@JeffBezanson JeffBezanson added the logging The logging framework label Jul 17, 2018
@KristofferC
Member

Precisely because of #26265, right? So you can opt in with ENV variables.

@JeffBezanson
Member

I see; it looks like the intent of that was to allow environment variables to change the logging level at any time? It doesn't seem like that could ever perform well. Maybe only check the env vars on startup?

@maleadt
Member

maleadt commented Jul 26, 2018

Turns out most of the slowdown is due to generating a message group identifier from the source-info filename by calling splitext/basename at runtime, which incidentally only happens when logging is enabled (hence the effect in #28147 (comment)). The environment override is reasonably well optimized, and only reparses when the variable actually changes. Some timings:

# base version
julia> @btime @debug "test"
  704.184 ns (10 allocations: 464 bytes)

# fix group id generation
julia> @btime @debug "test"
  73.424 ns (0 allocations: 0 bytes)

# optimize current_logger_for_env
julia> @btime @debug "test"
  56.486 ns (0 allocations: 0 bytes)

# disable env_override_minlevel
julia> @btime @debug "test"
  7.246 ns (0 allocations: 0 bytes)

Is this overhead acceptable, or should we go all the way and disable that functionality? With many people doing REPL-based development, I can imagine it being useful to flip log levels at group granularity within a session.

julia> @btime xparse("0")
  32.842 ns (0 allocations: 0 bytes)
0

julia> @btime xparse2("0")
  93.021 ns (0 allocations: 0 bytes)
0
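
The session-level toggling discussed here is the documented JULIA_DEBUG environment override; a minimal sketch (using the Parsers module from the original report as the example target):

```julia
# Enable @debug output for a specific module for the rest of the
# session ("all" enables it everywhere). This is the documented
# JULIA_DEBUG override whose runtime cost is being discussed above.
ENV["JULIA_DEBUG"] = "Parsers"

# …and switch it back off:
ENV["JULIA_DEBUG"] = ""
```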

@ExpandingMan
Contributor

Is this completely resolved by #28289?

@maleadt
Member

maleadt commented Oct 8, 2018

There's still a slowdown, but that's unavoidable. Definitely not massive anymore, so I guess this can be closed.

@maleadt maleadt closed this as completed Oct 8, 2018
graeme-a-stewart added a commit to JuliaHEP/JetReconstruction.jl that referenced this issue Apr 17, 2024
These cause a real slowdown in the runtime (~20% degradation)
even when DEBUG isn't enabled

There are quite a few complaints about this in the forums, e.g.,
JuliaLang/julia#28147. It does seem to have improved, but for code like this it's still a
serious regression
graeme-a-stewart added a commit to JuliaHEP/JetReconstruction.jl that referenced this issue Apr 19, 2024
(same commit message as above)