Use pytest for testing #3617
Comments
Quick suggestions:
Sounds good, I am definitely in favor of a migration to pytest.
I can try this!
PSA: @cringeyburger had also expressed interest in trying this issue over DMs. I am happy to assign any and all people to this as they request to be assigned publicly, as long as they can collaborate on the requirements and reduce duplicated effort. In short and off the top of my head, these are the overall requirements (not exhaustive; there might be more):
N.B. I am not aware of the difficulties involved in all these requirements or the individual difficulty of each task.
Thanks for the reminder, @agriyakhetarpal! I apologise for taking too long to express my interest in the issue publicly. I had to focus on my academics for a while. I am getting the hang of managing both, and I would like to resolve this issue!
@prady0t and @cringeyburger, I have assigned both of you to this issue. Please feel free to collaborate with each other on this thread about dividing up the tasks, and to discuss amongst yourselves and with the maintainers as necessary for questions and doubts.
I would like a few clarifications:
For point 1: I was initially thinking we could remove things like the parts that find the unit tests and related file globs. Some test cases do fail on my end too; could you try fixing those? For point 2: no, we shouldn't add that.
For point 1:
For point 2:
As you can see, none of these dependencies have it.
Ah, thanks. You can just add it, then.
Could you assign me?
@abhicodes369, please feel free to discuss the listed tasks with others assigned to this issue; the list is by no means exhaustive, by the way.
Any updates on the issue? I did complete the first task (removing the code that contains `unittest` and adding equivalent code to `noxfile.py`). I ran the tests and some of them are failing (very few). Please feel free to discuss if you have any problems while solving the issue.
@agriyakhetarpal, regarding line 17 in 94aa498: with pytest, can we simply run it via `["pytest", "-v", tests]` instead of `[interpreter, "-m", "unittest", "discover", "-v", tests]`?
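On the idea above of invoking pytest instead of `unittest discover`: besides spawning a subprocess, pytest can also be driven in-process via `pytest.main`, which takes the same argument list one would pass on the command line. The sketch below is hypothetical (the temporary empty directory stands in for a real `tests` path) and is not PyBaMM's actual runner code:

```python
import tempfile

import pytest

# Run pytest in-process on a directory, mirroring the command-line
# invocation ["pytest", "-v", tests]. The temporary (empty) directory is
# a stand-in for a real test folder, so pytest collects nothing and
# returns ExitCode.NO_TESTS_COLLECTED instead of running any tests.
with tempfile.TemporaryDirectory() as tests:
    exit_code = pytest.main(["-v", tests])

assert exit_code == pytest.ExitCode.NO_TESTS_COLLECTED
```

Against a real test folder, the same call would return `pytest.ExitCode.OK` (0) on success, which makes it straightforward to wrap in a runner script or a nox session.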
I just glanced at the code; the
These are the failing tests, BTW:

```
FAILED tests/unit/test_expression_tree/test_unary_operators.py::TestUnaryOperators::test_to_equation - sympy.utilities.exceptions.SymPyDeprecationWarning:
FAILED tests/unit/test_expression_tree/test_functions.py::TestFunction::test_to_equation - sympy.utilities.exceptions.SymPyDeprecationWarning:
FAILED tests/unit/test_experiments/test_simulation_with_experiment.py::TestSimulationExperiment::test_experiment_start_time_starting_solution - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_experiments/test_simulation_with_experiment.py::TestSimulationExperiment::test_inputs - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_batch_study.py::TestBatchStudy::test_solve - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_logger.py::TestLogger::test_logger - AssertionError: 25 != 30
FAILED tests/unit/test_parameters/test_base_parameters.py::TestBaseParameters::test_getattr__ - UserWarning: Parameter 'cap_init' has been renamed to 'Q_init'
FAILED tests/unit/test_serialisation/test_serialisation.py::TestSerialise::test_save_experiment_model_error - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_experiments/test_simulation_with_experiment.py::TestSimulationExperiment::test_run_experiment - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_solvers/test_processed_variable.py::TestProcessedVariable::test_processed_var_2D_fixed_t_interpolation - RuntimeWarning: invalid value encountered in divide
FAILED tests/unit/test_solvers/test_processed_variable.py::TestProcessedVariable::test_processed_var_2D_fixed_t_scikit_interpolation - RuntimeWarning: invalid value encountered in divide
FAILED tests/unit/test_simulation.py::TestSimulation::test_solve_with_initial_soc - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_util.py::TestUtil::test_have_optional_dependency - ImportError: Citations could not be registered. If you are on Google Colab - pybtex does not work with Google Colab due to a known bug - https://bitbucket.org/pybtex-devs/pybtex/issues/148/. Please manually cite all the referen...
FAILED tests/unit/test_solvers/test_solution.py::TestSolution::test_cycles - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_experiments/test_simulation_with_experiment.py::TestSimulationExperiment::test_run_experiment_breaks_early_infeasible - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_experiments/test_simulation_with_experiment.py::TestSimulationExperiment::test_run_experiment_multiple_times - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_experiments/test_simulation_with_experiment.py::TestSimulationExperiment::test_run_experiment_with_pbar - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_experiments/test_simulation_with_experiment.py::TestSimulationExperiment::test_save_at_cycles - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_experiments/test_simulation_with_experiment.py::TestSimulationExperiment::test_skipped_step_continuous - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_experiments/test_simulation_with_experiment.py::TestSimulationExperiment::test_starting_solution - UserWarning: Q_Li=2.0793 Ah is greater than Q_p=1.9464 Ah.
FAILED tests/unit/test_solvers/test_jax_bdf_solver.py::TestJaxBDFSolver::test_solver_ - AssertionError: 73.22798387496732 not less than 61.262978165992536
```
That seems to be the recurring user warning for most of the failing test cases, but I'm not sure what it means.
You can ignore it; it shouldn't cause the tests to fail?
I think what
There must be a way to check this. Let me look into it.
Looks like I missed a lot of discussion here 😬 But here is how you can filter warnings in pytest:

```toml
# pyproject.toml
[tool.pytest.ini_options]
filterwarnings = [
    "error",
    "ignore::DeprecationWarning",
    "ignore::UserWarning",
]
```
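To illustrate what the `filterwarnings` configuration above does, here is a small stand-alone sketch using the standard `warnings` module, whose filter semantics pytest builds on: `"error"` promotes warnings to exceptions, while a later `ignore::UserWarning` entry takes precedence for that category (mimicked here by inserting the ignore filter in front). The warning messages are made up for the demonstration.

```python
import warnings

# Approximate the pyproject.toml filters programmatically:
#   "error"                -> promote all warnings to exceptions
#   "ignore::UserWarning"  -> but silently drop UserWarning
warnings.resetwarnings()
warnings.simplefilter("error")
warnings.filterwarnings("ignore", category=UserWarning)  # takes precedence

try:
    warnings.warn("Q_Li is greater than Q_p", UserWarning)
    user_warning_ignored = True
except UserWarning:
    user_warning_ignored = False

try:
    warnings.warn("anything else", RuntimeWarning)
    other_warning_raised = False
except RuntimeWarning:
    other_warning_raised = True

assert user_warning_ignored and other_warning_raised
```

This matches the behaviour seen in the failing tests above: with a bare `"error"` entry, the recurring `UserWarning: Q_Li=... Ah is greater than Q_p=... Ah` becomes a test failure, while an `ignore::UserWarning` rule makes it harmless again.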
I've drafted a PR covering a few tasks. Now working on adding
@cringeyburger @abhicodes369 What do you think of this?
Could you tell me what command is being used to execute the tests?
@abhicodes369, we are using it. @prady0t, when you work on adding
Sure!
Should I open an issue to add support for
Sure, in case @abhicodes369 is not working on something similar already, please go ahead and do that.
Closing this because this is now more or less complete; |
Description

I recently learned that `pytest` supports `unittest`-style tests out of the box: it needs little to no extra configuration for setting up test commands, and just a tad more configuration to use without any differences to how the tests have been written.

While rewriting the bulk of our tests in the `tests/` folder and subdirectories to `pytest` ("Migrate to `pytest`", as it would be called) is a humongous task rife with tedium and should be reserved for potential long-term projects, I would say that we are definitely in a position to start using `pytest` itself.

I ran the unit tests locally on my macOS machine with eight cores, with `pytest-xdist` installed and enabled, to run them in parallel, and it offered a staggering 2.5x speedup over `python run-tests.py --unit`, i.e., serial execution of tests!

Motivation
- `pytest` is a modern testing framework used by most packages in the Scientific Python ecosystem
- `nbmake` to test the example notebooks
- `run-tests.py` file, and use the `pytest` built-in logging and path discovery functionalities to extract the most out of our testing
- `cibuildwheel`
Possible Implementation
- a `conftest.py` file can be used to configure the `unittest.TestCase` metaclass we currently use for ensuring the reliability of the test cases (see [Bug]: Some integration tests are non-deterministic #2833)
- `pytest.importorskip` can be used to test the optional dependencies
- `@pytest.fixture` that instructs them to be assigned to a single worker
- `pytest-cov` and `pytest-doctest`/`pytest-doctestplus` or other plugins out of the hundreds of plugins can be looked into later as a part of the rewriting process

Additional context
N/A
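As a concrete sketch of the `pytest.importorskip` point above: it returns the module when the import succeeds, and raises pytest's `Skipped` outcome when it does not, so a test guarding an optional dependency is skipped rather than failed. The module names below are placeholders (`json` stands in for an optional dependency, and the second name is deliberately nonexistent to trigger the skip path).

```python
import pytest

# importorskip returns the imported module on success...
json_mod = pytest.importorskip("json")  # stdlib stand-in for an optional dep
assert json_mod.__name__ == "json"

# ...and raises pytest's Skipped exception when the module is missing,
# so a test calling it is reported as skipped instead of failing.
with pytest.raises(pytest.skip.Exception):
    pytest.importorskip("not_a_real_package_xyz")
```

Inside a test function, the skip happens automatically; the `pytest.raises` block here only exists to demonstrate the behaviour outside a collected test.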