# Testing
All tests are written with pytest.
End-to-end tests are written with the Playwright plugin for pytest.
The end-to-end tests use pytest-django to set up a live server and provide a test database that is cleaned after each test.
To test the Docker setup automatically, there is also a smoke test in `e2e_tests/test_deployed.py`. It requires starting the application with the local configuration and creating the database with migrations applied and `e2e_tests/database/test_database.json` loaded.
The first time Playwright is used, let it download the tools it needs with:

```sh
playwright install
```
Run all tests (except the smoke test, which requires a running server):

```sh
pytest --ignore e2e_tests/test_deployed.py
```
Run a single test (add `--headed` to view what's happening):

```sh
pytest <path-to-test>
```
Run the smoke test (and set up the database with example data beforehand):

```sh
rm db/db.sqlite3
rm -r cpmonitor/images/uploads-backup-migration-0029
mkdir cpmonitor/images/uploads/local_groups
poetry run python manage.py migrate --settings=config.settings.local
poetry run python manage.py loaddata --settings=config.settings.local e2e_tests/database/test_database.json
cp -r e2e_tests/database/test_database_uploads/. cpmonitor/images/uploads
docker network create testing_nginx_network
docker network create production_nginx_network
docker-compose up -d --build
docker-compose -f docker/reverseproxy/docker-compose.yml up -d --build
pytest e2e_tests/test_deployed.py
docker-compose -f docker/reverseproxy/docker-compose.yml down --volumes
docker-compose down --volumes
```
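While the containers are up, a plain HTTP reachability check can be handy for troubleshooting before running the full Playwright suite. This is only a sketch; the URL and port are assumptions about where the reverse proxy listens, not taken from the repository:

```python
import urllib.request


def is_up(url: str, timeout: float = 5.0) -> bool:
    """Return True if the URL answers with HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False


if __name__ == "__main__":
    # Assumed address of the reverse proxy started above.
    print(is_up("http://localhost:80/"))
```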
- New test files have to be named according to the convention `*_test.py`.
- Test names should follow the convention `test_should_do_x_when_given_y`.
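As an illustration of both conventions, here is a hypothetical `calculator_test.py`; the function under test is invented for the example:

```python
# calculator_test.py -- the file name matches the *_test.py convention


def add_vat(net_price: float, rate: float = 0.19) -> float:
    """Invented helper: return the gross price for a net price and VAT rate."""
    return round(net_price * (1 + rate), 2)


# Test names state the expected behaviour and the given input.
def test_should_add_default_vat_when_no_rate_given():
    assert add_vat(100.0) == 119.0


def test_should_apply_custom_rate_when_one_is_given():
    assert add_vat(100.0, rate=0.07) == 107.0
```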
From a local database filled with suitable data, generate a fixture named `example_fixture` with:

```sh
python -Xutf8 manage.py dumpdata -e contenttypes -e auth.Permission -e admin.LogEntry -e sessions --indent 2 --settings=config.settings.local > cpmonitor/fixtures/example_fixture.json
```
(The `-Xutf8` and `--indent 2` options ensure consistent and readable output on all platforms.)

The arguments `-e contenttypes -e auth.Permission -e admin.LogEntry -e sessions` exclude tables which are pre-filled by Django, or filled during usage, and whose content may change depending on the models in the project. If they are included, everything works fine at first, since `loaddata` will silently accept data that is already there. However, as soon as the data to load clashes with existing content, it will fail. `-e admin.LogEntry` excludes references to content types which may otherwise be inconsistent. `-e sessions` excludes unneeded data which would otherwise clog the JSON file.
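The clash can be illustrated without Django at all: fixture rows carry fixed primary keys, and inserting different content under an existing key raises an integrity error instead of merging. A minimal sketch using plain `sqlite3` (the table name only mimics Django's content-types table):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE django_content_type (id INTEGER PRIMARY KEY, model TEXT)")

# First load: the row with primary key 1 is accepted.
conn.execute("INSERT INTO django_content_type VALUES (1, 'task')")

# Loading again with different content under the same key fails,
# just like loaddata does once fixture data clashes with existing rows.
try:
    conn.execute("INSERT INTO django_content_type VALUES (1, 'city')")
    clashed = False
except sqlite3.IntegrityError:
    clashed = True
print(clashed)  # True
```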
This fixture may be loaded in a test as follows (similarly in a pytest fixture):

```python
import pytest
from django.core.management import call_command


@pytest.mark.django_db
def test_something(django_db_blocker):
    with django_db_blocker.unblock():
        call_command("loaddata", "example_fixture")
    # Test something here
```
This does not work when testing migrations, but there is a way: use `read_fixture` in `cpmonitor/tests/migrations_test.py`.
To start the testing app with production data from some backup, run the script:

```sh
/home/monitoring/start-testing-with-prod-data.sh [backup-folder]
```

Copy the data from the testing server to the local setup:

```sh
rm db/db.sqlite3
scp lzm:testing/db/db.sqlite3 db/
rm -r cpmonitor/images/uploads
scp -r lzm:testing/cpmonitor/images/uploads cpmonitor/images/
```
To find out on which migration version this database is based, use:

```sh
ssh -tt lzm docker exec -it djangoapp-testing python manage.py showmigrations --settings=config.settings.container
# or
ssh -tt lzm docker exec -it djangoapp-production python manage.py showmigrations --settings=config.settings.container
```
Possibly migrate, test the data, and check that the size is reasonable. Then make it available to others with:

```sh
SNAPSHOT_NAME=prod_database_$(date -u +"%FT%H%M%SZ")
python -Xutf8 manage.py dumpdata -e contenttypes -e auth.Permission -e admin.LogEntry -e sessions --indent 2 --settings=config.settings.local > e2e_tests/database/${SNAPSHOT_NAME}.json
cp -r cpmonitor/images/uploads e2e_tests/database/${SNAPSHOT_NAME}_uploads
echo "Some useful information, e.g. the migration state of the snapshot" > e2e_tests/database/${SNAPSHOT_NAME}.README
du -hs e2e_tests/database/${SNAPSHOT_NAME}*
```

Commit the result.
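The naming scheme can be examined on its own; this sketch only exercises the timestamp format (no `dumpdata` involved) to show that snapshot names sort chronologically:

```shell
# Timestamps like 2024-05-01T123456Z are UTC and sort lexicographically,
# so listing e2e_tests/database/ shows snapshots in chronological order.
SNAPSHOT_NAME=prod_database_$(date -u +"%FT%H%M%SZ")
echo "$SNAPSHOT_NAME"
case "$SNAPSHOT_NAME" in
  prod_database_[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9]T[0-9][0-9][0-9][0-9][0-9][0-9]Z) ok=1 ;;
  *) ok=0 ;;
esac
```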
To load such a snapshot:

```sh
# select the snapshot to use
SNAPSHOT_NAME=prod_database_<some date found in e2e_tests/database/>
# remove previous data
rm db/db.sqlite3
rm -r cpmonitor/images/uploads
# create the database
python manage.py migrate --settings=config.settings.local
# optionally migrate back to a suitable version (see the .README file corresponding to the snapshot you're about to load):
python manage.py migrate cpmonitor <some-earlier-migration> --settings=config.settings.local
python manage.py loaddata --settings=config.settings.local e2e_tests/database/${SNAPSHOT_NAME}.json
cp -r e2e_tests/database/${SNAPSHOT_NAME}_uploads cpmonitor/images/uploads
```
If the snapshot you want to use is based on an older model version, migrations have to be applied, and are thereby tested:

```sh
python manage.py migrate --settings=config.settings.local
```
The E2E tests will most likely fail, since they are based on another DB dump, e.g. with other password settings. But manual tests with the dev server or container-based tests should be possible, and the images should be visible:

```sh
python manage.py runserver --settings=config.settings.local
# or
docker compose --env-file .env.local up --detach --build
```