Add a check for tests which are always skipped (#240)
* Add a check for tests which are always skipped

This takes the form of a new script which crawls pytest XML reports
and collates them into a single aggregate. It checks for tests which
are skipped or missing in all of the reports.

The aggregator can be run along with a suite of tox environments via
`make collated-test-report`, and a new CI job runs this in a build.

* Don't run tests twice in CI

As a first draft, tests ran twice (once for the "test" job and once for
the skipped-test collator). Switch to uploading/downloading the junitxml
report data to pass the reports from the matrix jobs to the collator.
sirosen authored Feb 8, 2023
1 parent 5c71afa commit d2b5f95
Showing 4 changed files with 73 additions and 3 deletions.
23 changes: 22 additions & 1 deletion .github/workflows/build.yaml
@@ -75,9 +75,30 @@ jobs:
- name: test
run: |
python -m tox run-parallel -m ci
python -m tox run-parallel -m ci -- --junitxml pytest.{envname}.xml
python -m tox run -e cov
- uses: actions/upload-artifact@v3
with:
name: pytest-report-py${{ matrix.py }}-${{ matrix.os }}
path: pytest.*.xml

collate-tests:
needs: [ci-test-matrix]
runs-on: ubuntu-latest
name: "Collate results to check for skipped tests"
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with:
python-version: "3.x"
# download everything
- uses: actions/download-artifact@v3
with:
path: artifacts/
# collate and report
- run: python ./scripts/aggregate-pytest-reports.py artifacts/*/pytest.*.xml

self-check:
name: "Self-Check"
runs-on: ubuntu-latest
5 changes: 5 additions & 0 deletions Makefile
@@ -17,6 +17,11 @@ release:
-git push $(shell git rev-parse --abbrev-ref @{push} | cut -d '/' -f1) refs/tags/$(PKG_VERSION)
tox run -e publish-release

.PHONY: collated-test-report
collated-test-report:
tox p
python ./scripts/aggregate-pytest-reports.py .tox/*/pytest.xml

.PHONY: clean
clean:
rm -rf dist build *.egg-info .tox .coverage.*
43 changes: 43 additions & 0 deletions scripts/aggregate-pytest-reports.py
@@ -0,0 +1,43 @@
import argparse
import sys
from collections import defaultdict
from xml.etree import ElementTree # nosec


def main():
parser = argparse.ArgumentParser()
parser.add_argument("FILES", nargs="+")
args = parser.parse_args()

tests_by_name = defaultdict(dict)
for filename in args.FILES:
tree = ElementTree.parse(filename)
root = tree.getroot()

for testcase in root.findall("./testsuite/testcase"):
classname = testcase.get("classname")
name = testcase.get("name")
nodename = f"{classname.replace('.', '/')}.py::{name}"

skip_node = testcase.find("skipped")
if skip_node is not None:
if "skipped" not in tests_by_name[nodename]:
tests_by_name[nodename]["skipped"] = True
else:
tests_by_name[nodename]["skipped"] = False

fail = False
for nodename, attributes in tests_by_name.items():
if attributes.get("skipped") is True:
print(f"ALWAYS SKIPPED: {nodename}")
fail = True

if fail:
print("fail")
sys.exit(1)
print("ok")
sys.exit(0)


if __name__ == "__main__":
main()
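The collation rule in the script above (a test is flagged only when every report that contains it marks it skipped) can be exercised on synthetic reports. A minimal sketch, assuming pytest's xunit2 junitxml layout (`<testsuites>` wrapping `<testsuite>`/`<testcase>` elements) and hypothetical test names:

```python
from collections import defaultdict
from xml.etree import ElementTree

# Two synthetic junitxml reports (hypothetical test names). In REPORT_1
# both tests are skipped; in REPORT_2 only test_b is skipped.
REPORT_1 = """<testsuites><testsuite>
  <testcase classname="tests.test_mod" name="test_a"><skipped/></testcase>
  <testcase classname="tests.test_mod" name="test_b"><skipped/></testcase>
</testsuite></testsuites>"""
REPORT_2 = """<testsuites><testsuite>
  <testcase classname="tests.test_mod" name="test_a"/>
  <testcase classname="tests.test_mod" name="test_b"><skipped/></testcase>
</testsuite></testsuites>"""


def always_skipped(reports):
    # Same collation rule as the script: "skipped" starts True the first
    # time a test is seen skipped, and is forced False by any report in
    # which the test actually ran.
    tests = defaultdict(dict)
    for xml in reports:
        root = ElementTree.fromstring(xml)
        for case in root.findall("./testsuite/testcase"):
            node = f"{case.get('classname').replace('.', '/')}.py::{case.get('name')}"
            if case.find("skipped") is not None:
                tests[node].setdefault("skipped", True)
            else:
                tests[node]["skipped"] = False
    return sorted(n for n, attrs in tests.items() if attrs.get("skipped") is True)


print(always_skipped([REPORT_1, REPORT_2]))
# → ['tests/test_mod.py::test_b']
```

`test_a` escapes the check because one of the two reports actually ran it; `test_b` is reported because no report ever did.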
5 changes: 3 additions & 2 deletions tox.ini
@@ -11,7 +11,7 @@ skip_missing_interpreters = true
minversion = 4.0.0

labels =
ci = py-notoml, py-tomli-format, py-json5, py-pyjson5
ci = py, py-notoml, py-tomli-format, py-json5, py-pyjson5

[testenv]
description = "run tests with pytest"
@@ -28,7 +28,8 @@ deps =
format: jsonschema[format]
set_env =
notoml: FORCE_TOML_DISABLED=1
commands = coverage run -m pytest {posargs}
commands =
coverage run -m pytest {posargs:--junitxml={envdir}/pytest.xml}

[testenv:cov_clean]
description = "erase coverage data to prepare for a new run"
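The change to `commands` relies on tox's `{posargs:DEFAULT}` fallback syntax: arguments passed after `--` on the tox command line replace the default entirely. This is how the CI "test" step redirects the report path while plain local runs still write per-env reports. A sketch of the two invocation modes (not run here; requires tox):

```shell
# No args after "--": pytest receives the default --junitxml={envdir}/pytest.xml
tox run -e py

# Args after "--" replace the default, as the CI "test" step does:
tox run-parallel -m ci -- --junitxml pytest.{envname}.xml
```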
