
Use pytest for testing #3617

Closed
agriyakhetarpal opened this issue Dec 13, 2023 · 29 comments
Assignees
Labels
difficulty: medium Will take a few days feature priority: medium To be resolved if time allows

Comments

@agriyakhetarpal
Member

agriyakhetarpal commented Dec 13, 2023

Description

I recently learned that pytest supports unittest-style tests out of the box: it needs little to no extra configuration to set up test commands, and only a little more configuration to run our existing tests without any changes to how they have been written.
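As an illustration (a minimal, hypothetical test module, not one from the PyBaMM suite), a plain unittest-style test case is collected and run by pytest with no changes:

```python
import unittest


class TestAddition(unittest.TestCase):
    # A plain unittest-style test case: pointing `pytest` at the file
    # containing it discovers and runs it with no extra configuration.
    def test_add(self):
        self.assertEqual(1 + 1, 2)
```

The same file also still runs under the stdlib runner via `python -m unittest`, so nothing breaks for developers who have not switched yet.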

While rewriting the bulk of our tests in the tests/ folder and its subdirectories to pytest style ("Migrate to pytest", as it would be called) is a huge task rife with tedium and should be reserved for potential long-term projects, I would say we are definitely in a position to start using pytest itself as the test runner.

I ran the unit tests locally on my eight-core macOS machine with pytest-xdist installed and enabled, using the command

pytest tests/unit/

to run them in parallel, and it offered a staggering 2.5x speedup over python run-tests.py --unit, i.e., serial execution of tests!

Motivation

  1. pytest is a modern testing framework used by most packages in the Scientific Python ecosystem
  2. We are already using it with nbmake to test the example notebooks
  3. It simplifies testing a bit since we can let go of most of the code in the run-tests.py file and use the pytest built-in logging and path discovery functionalities to extract the most out of our testing
  4. Much better CI times – the unit tests and integration tests can each complete in under ~10 minutes on GitHub Actions runners in separate jobs, which means we could also run them one after the other in a single job (this reduces the strain on the runners, so we wouldn't need to request as many as we do currently)
  5. Integration tests can be executed to test the built wheels for a Python version with cibuildwheel
  6. ...and so on

Possible Implementation

  • The conftest.py file can be used to configure the unittest.TestCase metaclass we currently use for ensuring the reliability of the test cases (see [Bug]: Some integration tests are non-deterministic #2833)
  • From @Saransh-cpp: pytest.importorskip can be used to test the optional dependencies
  • I noticed that some Jax solver test cases in the integration tests cause trouble when run in parallel; we can either restrict the distribution of work so that all tests in the same class run serially, or – the better way – mark just those tests with a decorator or fixture that assigns them to a single worker.
  • Other nitty-gritties and niceties such as pytest-cov, pytest-doctest/pytest-doctestplus, or others among the hundreds of available plugins can be looked into later as part of the rewriting process
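To sketch the two pytest mechanisms mentioned above (hypothetical test functions; the stdlib module "math" stands in for a real optional dependency, and the group name is made up):

```python
import pytest


def test_optional_feature():
    # pytest.importorskip skips the test (rather than erroring out) when
    # the module is missing, and returns the module when it is available;
    # "math" is a stand-in here for a real optional dependency like jax.
    mod = pytest.importorskip("math")
    assert mod.sqrt(9) == 3.0


# With pytest-xdist's loadgroup distribution (pytest -n auto --dist loadgroup),
# all tests sharing a group name run on the same worker, which keeps
# parallel-unsafe cases (such as the Jax solver tests) serialized.
@pytest.mark.xdist_group(name="jax_solvers")
def test_parallel_unsafe():
    assert 2 + 2 == 4
```

Note that the `xdist_group` mark only has an effect when the suite is run with `--dist loadgroup`; otherwise it is an inert marker.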

Additional context

N/A

@agriyakhetarpal agriyakhetarpal added feature difficulty: medium Will take a few days priority: medium To be resolved if time allows labels Dec 13, 2023
@Saransh-cpp
Member

Quick suggestions:

pytest-cov and pytest-doctest

pytest-cov can be integrated with almost no extra config, so I'd suggest including it in the same PR. For doctests, I'd suggest using xdoctest: it is the pytest equivalent of Python's built-in doctest, and it also integrates with pytest.
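A sketch of what that near-zero configuration could look like (hypothetical pyproject.toml fragment; the flags are pytest-cov's `--cov` and the xdoctest pytest plugin's `--xdoctest`):

```toml
[tool.pytest.ini_options]
# Collect coverage for the pybamm package and run xdoctest-style
# doctests on every pytest invocation
addopts = "--cov=pybamm --xdoctest"
```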

@valentinsulzer
Member

Sounds good, I am definitely in favor of a migration to pytest

@prady0t
Contributor

prady0t commented Feb 1, 2024

I can try this

@agriyakhetarpal
Member Author

agriyakhetarpal commented Feb 1, 2024

PSA: @cringeyburger had also expressed interest in trying this issue over DMs. I am happy to assign any and all people to this as they request to be assigned publicly, as long as they can collaborate on the requirements and reduce duplicated effort.

In short and off the top of my head, these are the overall requirements (inexhaustive, there might be more):

  • Start removing sections of run-tests.py and replace them in noxfile.py with pytest <...> commands
  • Configure pytest-cov in pyproject.toml similar to how coverage.py has been configured currently
  • Fix any failing unit tests that you come across when using pytest
  • Add xdoctest as a drop-in replacement, as mentioned above, and configure nox to work with it similar to the first point
  • Improve the contributor guide to reflect all these changes

N.B. I am not aware of the difficulties involved in all these requirements or individual difficulties for each task.

@cringeyburger
Contributor

Thanks for the reminder, @agriyakhetarpal! I apologise for taking too long to express my interest in the issue publicly. I had to focus on my academics for a while. I am getting the hang of managing both, and I would like to resolve this issue!

@agriyakhetarpal
Member Author

@prady0t and @cringeyburger, I have assigned both of you to this issue. Please feel free to collaborate on this thread about taking up and dividing the tasks, and to discuss with each other and with the maintainers as necessary for any questions and doubts.

@cringeyburger
Contributor

I want a few clarifications:

  1. Start removing sections of run-tests.py and replace them in noxfile.py with pytest <...> commands
    Are we planning to remove run-tests.py? We could configure run-tests.py to run with pytest in a line or two (I ran nox -s unit after configuring pytest in run-tests.py).
    It worked, but a few tests failed.

  2. Should I add pytest and pytest-xdist in all dependencies instead of dev?

@agriyakhetarpal
Member Author

> I want a few clarifications:
>
>   1. Start removing sections of run-tests.py and replace them in noxfile.py with pytest <...> commands
>     Are we planning to remove run-tests.py? Because we can configure run-tests.py to run with pytest with a line or two (I ran nox -s unit after configuring pytest in run-tests.py)
>     It worked, but a few tests failed.
>
>   2. Should I add pytest and pytest-xdist in all dependencies instead of dev?

For point 1: I was initially thinking we could remove parts like the code that finds the unit tests across file globs and use pytest's discovery mechanism for that instead (it's faster and more reliable). We would want to remove run-tests.py entirely too, but I would advise against doing so just now – it would make nox an explicit, external dependency for developers working with PyBaMM (nox -s unit requires nox to be installed, but python run-tests.py --unit does not).

Some test cases do fail on my end too, could you try fixing those?

For point 2: no, we shouldn't add pytest and pytest-xdist to [all], since the recommendation is to keep testing/development and documentation-related dependencies separate from the core dependencies intended for users. Could I know what you're trying to do that requires adding them to [all]? We can then explore a workaround for it.
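For reference, the separation being described would look roughly like this in pyproject.toml (hypothetical fragment; the package lists are abbreviated placeholders, not PyBaMM's actual extras):

```toml
[project.optional-dependencies]
# User-facing extras stay free of test tooling
all = ["scipy"]  # ...and the other core optional features
# Test and development tooling lives in its own extra
dev = ["pytest", "pytest-xdist", "nox"]
```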

@cringeyburger
Contributor

For point 1:
Okay, I have made a list of failed unit tests using pytest. I will try fixing them.

For point 2:
The main reason for this was that currently, when you run nox -s unit, unittest works because it is built into Python, so you don't have to install it as a package.
That's not the case with pytest, though. For nox -s unit, the following dependency groups are installed (depending on the system version info):

  1. [all]
  2. [jax]
  3. [odes]

As you can see, none of these include pytest. Hence, I think we need to add pytest to one of them (I asked for pytest-xdist so that I can experiment with parallel testing in the future).

@agriyakhetarpal
Member Author

> The main reason for this was that currently, when you run nox -s unit, unittest is built in python, so you don't have to install it as a package.
> But that's not the case with pytest. For nox -s unit, the following dependencies are installed (depending on the system version info):
>
>   1. [all]
>   2. [jax]
>   3. [odes]
>
> As you can notice, none of these dependencies have pytest in them. Hence, I think we need to add pytest in one of the dependencies (I asked for pytest-xdist so that I can experiment with parallel testing in the future)

Ah, thanks. You can just add the [dev] extra to the relevant sections in the file – I remember this was done in #3658, but that hasn't been merged yet. You can copy the changes to noxfile.py from there and that should work.

@abhicodes369
Contributor

Could you assign me?

@agriyakhetarpal
Member Author

@abhicodes369 please feel free to discuss the listed tasks with others assigned to this issue – the list is by no means exhaustive, BTW.

@abhicodes369
Contributor

> PSA: @cringeyburger had also expressed interest in trying this issue over DMs. I am happy to assign any and all people on this as they request to be assigned publicly, as long as they can collaborate on the requirements and reduce duplicated effort.
>
> In short and off the top of my head, these are the overall requirements (inexhaustive, there might be more):
>
>   • Start removing sections of run-tests.py and replace them in noxfile.py with pytest <...> commands
>   • Configure pytest-cov in pyproject.toml similar to how coverage.py has been configured currently
>   • Fix any failing unit tests that you come across with using pytest
>   • Add xdoctest as a drop-in replacement, as mentioned above, and configure nox to work with it similar to the first point
>   • Improve the contributor guide to reflect all these changes
>
> N.B. I am not aware of the difficulties involved in all these requirements or individual difficulties for each task.

Any updates on the issue? I completed the first task (removing the code that contains unittest and adding equivalent code to noxfile.py). I ran the tests and some of them are failing (very few). Please feel free to discuss if you have any problems while solving the issue.

@prady0t
Contributor

prady0t commented Feb 28, 2024

@agriyakhetarpal
Would we still need the interpreter parameter here:

def run_code_tests(executable=False, folder: str = "unit", interpreter="python"):

since with pytest we can simply run it via ["pytest", "-v", tests] instead of [interpreter, "-m", "unittest", "discover", "-v", tests] ?

@agriyakhetarpal
Member Author

I just glanced at the code; the interpreter argument is just assigned sys.executable, which is redundant IMO. The argument isn't overridden in any of the other functions, and one shouldn't be allowed to override it either. I would say we can remove this argument, but keep the [sys.executable, "-m", "pytest", "<...>"] invocation, since it is possible that someone may not have pytest on their PATH.
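A minimal sketch of that invocation style (hypothetical helper names, not the actual run-tests.py code):

```python
import subprocess
import sys


def build_pytest_command(tests="tests/unit", *extra_args):
    # Run pytest through the current interpreter rather than a bare
    # "pytest" entry point, so the suite still works even when pytest's
    # console script is not on PATH.
    return [sys.executable, "-m", "pytest", "-v", tests, *extra_args]


def run_unit_tests():
    # check=True raises CalledProcessError on a failing test run
    subprocess.run(build_pytest_command(), check=True)
```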

@prady0t
Contributor

prady0t commented Feb 28, 2024

These are the failing tests btw:

FAILED tests/unit/test_expression_tree/test_unary_operators.py::TestUnaryOperators::test_to_equation - sympy.utilities.exceptions.SymPyDeprecationWarning: 
FAILED tests/unit/test_expression_tree/test_functions.py::TestFunction::test_to_equation - sympy.utilities.exceptions.SymPyDeprecationWarning: 
FAILED tests/unit/test_experiments/test_simulation_with_experiment.py::TestSimulationExperiment::test_experiment_start_time_starting_solution - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_experiments/test_simulation_with_experiment.py::TestSimulationExperiment::test_inputs - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_batch_study.py::TestBatchStudy::test_solve - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_logger.py::TestLogger::test_logger - AssertionError: 25 != 30
FAILED tests/unit/test_parameters/test_base_parameters.py::TestBaseParameters::test_getattr__ - UserWarning: Parameter 'cap_init' has been renamed to 'Q_init'
FAILED tests/unit/test_serialisation/test_serialisation.py::TestSerialise::test_save_experiment_model_error - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_experiments/test_simulation_with_experiment.py::TestSimulationExperiment::test_run_experiment - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_solvers/test_processed_variable.py::TestProcessedVariable::test_processed_var_2D_fixed_t_interpolation - RuntimeWarning: invalid value encountered in divide
FAILED tests/unit/test_solvers/test_processed_variable.py::TestProcessedVariable::test_processed_var_2D_fixed_t_scikit_interpolation - RuntimeWarning: invalid value encountered in divide
FAILED tests/unit/test_simulation.py::TestSimulation::test_solve_with_initial_soc - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_util.py::TestUtil::test_have_optional_dependency - ImportError: Citations could not be registered. If you are on Google Colab - pybtex does not work with Google Colab due to a known bug - https://bitbucket.org/pybtex-devs/pybtex/issues/148/. Please manually cite all the referen...
FAILED tests/unit/test_solvers/test_solution.py::TestSolution::test_cycles - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_experiments/test_simulation_with_experiment.py::TestSimulationExperiment::test_run_experiment_breaks_early_infeasible - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_experiments/test_simulation_with_experiment.py::TestSimulationExperiment::test_run_experiment_multiple_times - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_experiments/test_simulation_with_experiment.py::TestSimulationExperiment::test_run_experiment_with_pbar - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_experiments/test_simulation_with_experiment.py::TestSimulationExperiment::test_save_at_cycles - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_experiments/test_simulation_with_experiment.py::TestSimulationExperiment::test_skipped_step_continuous - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_experiments/test_simulation_with_experiment.py::TestSimulationExperiment::test_starting_solution - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_solvers/test_jax_bdf_solver.py::TestJaxBDFSolver::test_solver_ - AssertionError: 73.22798387496732 not less than 61.262978165992536

@prady0t
Contributor

prady0t commented Feb 28, 2024

UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah

This seems to be the recurring warning for most of the failing test cases, but I'm not sure what it means.

@valentinsulzer
Member

You can ignore it, it shouldn't cause the tests to fail?

@agriyakhetarpal
Member Author

I think what pytest does is that it converts UserWarnings to errors
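For context, pytest only escalates warnings into errors when configured to do so (e.g. a filterwarnings entry of "error", or running with pytest -W error); the underlying mechanism is the stdlib warnings filter, sketched here with a made-up stand-in for the capacity warning:

```python
import warnings


def warn_like_pybamm():
    # Hypothetical stand-in for the code that emits the UserWarning above
    warnings.warn("Q_Li is greater than Q_p", UserWarning)


escalated = False
with warnings.catch_warnings():
    # "error" is the filter a filterwarnings = ["error"] entry applies:
    # every warning is raised as an exception instead of being printed.
    warnings.simplefilter("error")
    try:
        warn_like_pybamm()
    except UserWarning:
        escalated = True
```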

@prady0t
Contributor

prady0t commented Feb 29, 2024

There must be a way to check this. Let me look into it.

@Saransh-cpp
Member

Looks like I missed a lot of discussion here 😬

But here is how you can filter warnings in pytest -

# pyproject.toml
[tool.pytest.ini_options]
filterwarnings = [
    "error",
    "ignore::DeprecationWarning",
    "ignore::UserWarning",
]

@prady0t prady0t mentioned this issue Mar 2, 2024
@prady0t
Contributor

prady0t commented Mar 2, 2024

I've drafted a PR covering a few of the tasks. Now working on adding xdoctest support.

@prady0t
Contributor

prady0t commented Mar 2, 2024

@cringeyburger @abhicodes369 What do you think of this?

@abhicodes369
Contributor

Could you tell me what command is being used to execute the tests?

@agriyakhetarpal
Member Author

agriyakhetarpal commented Mar 2, 2024

@abhicodes369, we are using noxfile.py to execute the tests with nox, which uses the run-tests.py file internally. The unittest commands should be replaced by pytest commands in the latter.

@prady0t, when you work on adding xdoctest, could you include that in a separate PR so that it would be easy to review?

@prady0t
Contributor

prady0t commented Mar 2, 2024

Sure!

@prady0t
Contributor

prady0t commented Mar 29, 2024

Should I open an issue to add support for xdoctest?

@agriyakhetarpal
Member Author

Sure, in case @abhicodes369 is not working on something similar already – please go ahead and do that.

@agriyakhetarpal
Member Author

Closing this because it is now more or less complete; pytest is now being used to run all the unit and integration tests, and pytest-doctestplus and pytest-cov have been configured. The status of migrating the tests themselves to pytest-style syntax, plus documentation and any ancillary tasks, can be viewed in #4156 and is being tracked through the weekly meetings as part of the sixteen-week GSoC coding period.
