Use of xfail in conftest.py hiding passing conditions #1729
matthewfeickert added the bug (Something isn't working), tests, pytest, and needs-triage (Needs a maintainer to categorize and assign) labels on Dec 8, 2021
@kratsg, if you can take a look at this with me it would be helpful for making sure the tests are working as expected, especially for the lower bounds constraints tests.
matthewfeickert removed the needs-triage (Needs a maintainer to categorize and assign) label on Dec 8, 2021
Okay, this might actually be pretty straightforward to add. The following diff

```diff
diff --git a/pyproject.toml b/pyproject.toml
index 9c81dc60..6616d586 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -38,11 +38,9 @@ ignore = [
     'AUTHORS',
 ]

-[tool.pytest]
-xfail_strict = true
-
 [tool.pytest.ini_options]
 minversion = "6.0"
+xfail_strict = true
 addopts = [
     "--ignore=setup.py",
     "--ignore=validation/",
diff --git a/tests/conftest.py b/tests/conftest.py
index 0e2537b7..bed0f9ae 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -99,7 +99,10 @@ def backend(request):
         )

     if fail_backend:
-        pytest.xfail(f"expect {func_name} to fail as specified")
+        request.node.add_marker(
+            pytest.mark.xfail(reason=f"expect {func_name} to fail as specified")
+        )
+        # print(request.node.own_markers)  # This is for debug purposes only

     # actual execution here, after all checks is done
     pyhf.set_backend(*request.param)
```

works as expected for the following scenarios:
```console
$ cat /tmp/old-jax-requirements.txt
jax==0.2.25
jaxlib==0.1.74
$ python -m pip install -r /tmp/old-jax-requirements.txt
$ cat /tmp/new-jax-requirements.txt
jax==0.2.26
jaxlib==0.1.75
$ python -m pip install -r /tmp/new-jax-requirements.txt
```
```console
$ pytest -sx tests/test_tensor.py -k test_percentile
=========================================================================================== test session starts ===========================================================================================
platform linux -- Python 3.9.6, pytest-6.2.5, py-1.10.0, pluggy-1.0.0
Matplotlib: 3.5.0
Freetype: 2.6.1
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /home/feickert/Code/GitHub/pyhf, configfile: pyproject.toml
plugins: mock-3.6.1, console-scripts-1.2.1, mpl-0.13, requests-mock-1.9.3, benchmark-3.4.1, cov-3.0.0, anyio-3.3.3
collected 222 items / 198 deselected / 24 selected

tests/test_tensor.py ....F

================================================================================================ FAILURES =================================================================================================
__________________________________________________________________________________________ test_percentile[jax] ___________________________________________________________________________________________
[XPASS(strict)] expect test_percentile[jax] to fail as specified
```
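The `[XPASS(strict)]` failure in that output is exactly what `xfail_strict` is for. As a standalone sketch (not from the pyhf suite; the file name and test name here are made up for illustration), running pytest on a strict-`xfail` test that unexpectedly passes exits non-zero:

```python
import os
import subprocess
import sys
import tempfile
import textwrap

# A tiny test module: the test is marked strict-xfail but passes, so
# pytest should report it as FAILED with [XPASS(strict)].
test_src = textwrap.dedent(
    """
    import pytest

    @pytest.mark.xfail(strict=True, reason="expected to fail")
    def test_unexpectedly_passes():
        assert True
    """
)

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "test_strict_xfail.py")
    with open(path, "w") as f:
        f.write(test_src)
    # -q keeps output short; "-p no:cacheprovider" avoids writing a
    # .pytest_cache into the temporary directory.
    result = subprocess.run(
        [sys.executable, "-m", "pytest", "-q", "-p", "no:cacheprovider", path],
        capture_output=True,
        text=True,
    )

print(result.returncode)  # 1: the unexpected pass is counted as a failure
```

Without `strict=True` (or `xfail_strict = true` in the config) the same run would report an `XPASS` and still exit zero, which is how the passing tests were being hidden.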
Summary
In PR #1702 we adopted `xfail_strict = true` for `pytest` so that we become aware of tests that should be failing but aren't. With the release of `jax` `v0.2.26` (which includes a fix for jax-ml/jax#8513, c.f. PR #817), the tests in

pyhf/tests/test_tensor.py Lines 379 to 396 in 43c1567

should be failing given `xfail_strict`, as `jax` `v0.2.26` should be passing here. However, they don't fail.

This is because in `conftest.py` we use `pytest.xfail`:

pyhf/tests/conftest.py Lines 101 to 102 in 43c1567

The `pytest.xfail` docs note that, unlike with the `pytest.mark.xfail` marker, no other code is executed after the `pytest.xfail()` call. So the use of `pytest.xfail` in `conftest.py` is actually skipping tests instead of running them and checking for failure (which is a surprise to me!).

Note also that

pyhf/pyproject.toml Lines 41 to 42 in 43c1567

should actually be under `[tool.pytest.ini_options]`, as noted in the `pytest` docs (this was done wrong in PR #1702).

OS / Environment
All
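As a side note on the mechanism described in the summary: `pytest.xfail()` is imperative and raises a pytest-internal exception the moment it is called, so any code after it (including the test body itself, when called from a fixture) never runs. A minimal sketch, outside of any test run:

```python
import pytest

executed_after_xfail = False

try:
    # The imperative form raises immediately; the next line never runs.
    pytest.xfail("expect this to fail")
    executed_after_xfail = True
except BaseException as exc:
    # pytest's outcome exceptions derive from BaseException, not Exception,
    # so a plain `except Exception` would not catch this.
    caught = type(exc).__name__

print(executed_after_xfail)  # -> False
print(caught)
```

By contrast, `request.node.add_marker(pytest.mark.xfail(...))` only attaches a marker: the test still runs, and with `xfail_strict` an unexpected pass is reported as a failure.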
Steps to Reproduce

`test_percentile[numpy]` should pass, but the use of `pytest.xfail` skips it:

Expected Results

For tests marked with `xfail` to fail:

Actual Results
pyhf Version

`master` at commit 43c1567

Code of Conduct