Possible performance regression with Coverage >=5 #1037
Comments
In your codebase, is it also the case that some package is not installed, or is this just a way to reproduce the slowdown?
Thanks, this is just a way to reproduce the slowdown. I found that I had to point Coverage at a large enough codebase/source tree to get the problem that we're having with our own codebase to show up in a reproducible manner. If I use only …
Something that might not be clear from the output is that in both versions the pytest run itself is very fast (0.01s), as expected from the simplicity of the test. The slowdown itself happens after pytest has written its test report.
I see something like what you are seeing, but not as extreme. Using pandas as the burden, I get 0.604s for 4.5.3 and 1.369s for 5.3. I don't understand why it would be slower though, so I will look into it.
I've fixed this in 3274cba. Can you try installing it to see if it works in your environment?
Thanks for the quick fix! I confirm that this also solves the slowdown in my case.
This is now released as part of coverage 5.3.1. |
thanks! |
Describe the bug
Total coverage run time is considerably bigger in Coverage 5 compared to 4.5.3.
To Reproduce
How can we reproduce the problem? Please be specific. Don't just link to a failing CI job. Answer the questions below:
What version of Python are you using?
3.6.7
What version of coverage.py are you using? The output of `coverage debug sys` is helpful.

Both versions of Coverage are installed with `pip`.

For Coverage `5.3`: …

For Coverage `4.5.3`: …

What versions of what packages do you have installed? The output of `pip freeze` is helpful.

Other than Coverage itself, these are the packages: …
I'm running a single pytest test module: …
I wrote the following script to output the Coverage version, run coverage + pytest (without `pytest-cov`) on my single test module, then print the time the command took. The catch here is that I'm adding the pandas source codebase via `--source`, even though my test doesn't use `pandas` and indeed the package isn't even installed. I found that this addition makes the command take much longer. I'm doing this for the purposes of this bug report, as an attempt to make the bug "easily" reproducible, but the problem was first identified when running coverage on a proprietary codebase.

Here's the script: …
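(The script itself did not survive the copy. Based on the description above, a minimal sketch might look like the following; the test-module name and the pandas checkout path are placeholders, not the reporter's actual values.)

```python
import subprocess
import sys
import time

def timed(cmd):
    """Run a command and return (exit code, wall-clock seconds)."""
    start = time.perf_counter()
    rc = subprocess.run(cmd).returncode
    return rc, time.perf_counter() - start

if __name__ == "__main__":
    # Placeholder target: substitute the real test module and pandas source path.
    cmd = [sys.executable, "-m", "coverage", "run",
           "--source=/path/to/pandas", "-m", "pytest", "test_module.py"]
    rc, secs = timed(cmd)
    print(f"exit={rc}, took {secs:.2f}s")
```

Comparing the printed time under coverage 4.5.3 and 5.3 installs is what surfaces the regression.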
Here's the output with Coverage `4.5.3`: …

Here's the output with Coverage `5.3`: …

Expected behavior
I expected Coverage 5.3 to take as much time as 4.5.3 (half a second), but the command took 5 seconds.
Additional context
None that I can think of right now! Please let me know if more info is needed.