test: Consolidate and update pytest options in pyproject.toml #1773
Conversation
Codecov Report

```
@@           Coverage Diff           @@
##           master    #1773   +/-   ##
=======================================
  Coverage   98.12%   98.12%
=======================================
  Files          64       64
  Lines        4270     4270
  Branches      683      683
=======================================
  Hits         4190     4190
  Misses         46       46
  Partials       34       34
```

Flags with carried forward coverage won't be shown. Continue to review the full report at Codecov.
https://mail.python.org/pipermail/python-dev/2010-April/099116.html

> -Wd enables all warnings. It adds 'd' to sys.warnoptions, which in turn adds a new first entry to _warnings.filters which matches all warnings and enables the "default" behavior, which is to show it once per execution of the Python interpreter.

pytest options:

> reporting:
>   ...
>   -r chars    show extra test summary info as specified by chars:
>               (f)ailed, (E)rror, (s)kipped, (x)failed, (X)passed, (p)assed,
>               (P)assed with output, (a)ll except passed (p/P), or (A)ll.
>               (w)arnings are enabled by default (see --disable-warnings),
>               'N' can be used to reset the list. (default: 'fE')
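As a rough illustration of that "default" behavior (a minimal sketch in plain Python, not taken from the PR or its CI configuration), installing a catch-all "default" filter is roughly what adding 'd' to the warning options does, and it reports each warning once per call site per run:

```python
import warnings

# Roughly what `python -Wd` sets up: a catch-all "default" filter at the
# front of the filter list.
warnings.simplefilter("default")

for _ in range(3):
    # Emitted three times, but printed only once, because the "default"
    # action deduplicates repeated warnings from the same location.
    warnings.warn("old API, prefer the newer one", DeprecationWarning)
```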
(Branch force-pushed from 352a702 to 2b71f9d.)
@kratsg @lukasheinrich Can you also run some pytest runs locally with this PR to verify that you're happy with how the new options render output now?
I'd swap "
Thanks for this recommendation @henryiii. Adding this has been quite insightful and is going to lead to a lot of Issues being opened!
@henryiii one question. For the minimum supported dependencies workflow, where we test the API against the oldest supported package versions, do you know of any way to disable `filterwarnings = ["error"]` for just that workflow? I'm a bit worried it isn't possible, as the option lives in `pyproject.toml`. Is there any less aggressive solution than just ripping it out with something like

```
sed -i '/"error",/d' pyproject.toml
pytest --ignore tests/benchmarks/ --ignore tests/contrib --ignore tests/test_notebooks.py tests/
```
(Branch force-pushed from 7dee136 to e8d1351.)
After some further reading, I see that `--override-ini filterwarnings=` works.
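For illustration only (a sketch, not the project's actual CI step), the same override can also be exercised programmatically through `pytest.main`, which accepts the same arguments as the command line; the `tests/` path mirrors the commands quoted earlier:

```python
import pytest

# Sketch: blank out the filterwarnings ini option for this run so warnings are
# not escalated to errors, while `-W default` still shows each warning once.
exit_code = pytest.main(
    ["-W", "default", "--override-ini", "filterwarnings=", "tests/"]
)
print(f"pytest exited with code {exit_code}")
```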
I understand the motivation behind `filterwarnings = ["error"]`, I just wonder whether it is worth the extra maintenance burden to keep the list of ignores up to date. Maybe I am overestimating the effort though, given that it seems that (at least right now) there are not a lot of things that are only due to the dependencies.

Very minor point: is there a good way to clean up the warning ignore settings periodically or automatically (if they are not used)? Assuming that something like the `true_divide:RuntimeWarning` ignore would be fixed by re-designing a test (maybe even accidentally), it would probably be good to also remove that entry. When a new test is designed in the future, the warning could be caught again and perhaps be avoided directly by re-designing the new test.
@alexander-held That is correct. There are quite a few things that would get their own Issue to be fixed later, like changing all instances of `json.load(open(datadir.join("abc.json")))` to

```python
with open(datadir.join("abc.json")) as spec_file:
    spec = json.load(spec_file)
```

to avoid things like `pytest.PytestUnraisableExceptionWarning: Exception ignored in: <_io.FileIO [closed]>`. Some of the others might be able to be dealt with, but some might not be possible to deal with easily, simply because of hitting walls at times.
No, which is one of the reasons why ideally as many of these warnings as possible would get caught explicitly in the tests themselves with something like

```python
with pytest.warns(
    OptimizeWarning, match="Unknown solver options: arbitrary_option"
):
    assert pyhf.infer.mle.fit(data, model).tolist()
```
@kratsg @lukasheinrich I'm going to merge this to keep things going, but if there are any parts of this that you have questions on or want reverted we can iterate here or in a new Issue. 👍
Description
Resolves #1685
Remove `.coveragerc` and consolidate pytest options to `pyproject.toml` (this results in `--cov-branch` getting added). In doing so, also apply the Configuring pytest recommendations from Scikit-HEP that Henry wrote:

- `-ra` includes a report after pytest runs with a summary on all tests except those that passed. From `pytest --help`, `-r chars` selects which outcomes appear in the extra summary (the default is `fE`). This replaces the `-r sx` previously used in all the CI jobs.
- `-Wd` enables all warnings.
  > It adds 'd' to sys.warnoptions, which in turn adds a new first entry to _warnings.filters which matches all warnings and enables the "default" behavior, which is to show it once per execution of the Python interpreter.
- `--showlocals` prints locals in tracebacks.
- `--strict-markers` will complain if you use an unregistered marker.
- `--strict-config` will raise an error if there is a mistake in the pytest config.
- `log_cli_level = "info"` reports `INFO` and above log messages on a failure.
- `filterwarnings = ["error"]` sets all warnings to be errors while still allowing specific warnings to be ignored with `-W` warning control syntax (see the sketch at the end of this description).
- Remove `tests/__init__.py`, as "there's not often a reason to make the test directory importable, and it can confuse package discovery algorithms."

To deal with the fact that the minimum supported dependencies workflow uses intentionally older releases that will have different behaviors and warnings than the rest of the tests, use the `--override-ini` pytest option to override the `filterwarnings` option with an empty list for that workflow, though also set warning control to `'default'` so that the warnings are still shown.

Also remove the now unused nbQA options for black. (Amends PR #1598)

(Thanks to @alexander-held for reminding me about branch coverage in general.)
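As a rough illustration of what `filterwarnings = ["error"]` does (a minimal sketch in plain Python rather than through pytest, so the mechanics are only analogous), an "error" filter escalates a matching warning into an exception, which is what makes previously silent warnings fail tests:

```python
import warnings

# Analogous to pytest's filterwarnings = ["error"]: escalate warnings to exceptions.
warnings.simplefilter("error")

try:
    warnings.warn("this dependency API is deprecated", DeprecationWarning)
except DeprecationWarning as err:
    # Under an "error" filter the warning surfaces as an exception instead of
    # a printed message, so a test that triggers it fails loudly.
    print(f"escalated to an exception: {err}")
```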