
Support sensitivities for Experiments #4415

Merged: 32 commits from i3834-sens-exp into develop on Sep 13, 2024
Conversation

@martinjrobins (Contributor) commented Sep 4, 2024

Description

Support sensitivity calculation for pybamm.Simulation, including experiments

Fixes #3834

Note that although the sensitivity calculation for pybamm.ScipySolver appears to work, it seems like there are bugs in this solver for experiments (see #4429) so this is currently not tested.
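As background for what "sensitivity calculation" means here: forward sensitivities track how the solution of an ODE changes with respect to a parameter by integrating an augmented system alongside the state. The sketch below is a self-contained illustration of that idea (it is not pybamm code; the decay model and Euler stepping are chosen purely for simplicity). For dy/dt = f(y, p), the sensitivity s = dy/dp satisfies ds/dt = (∂f/∂y)s + ∂f/∂p:

```python
# Minimal forward-sensitivity sketch (illustrative only, not pybamm code).
# Model: dy/dt = -p * y, y(0) = 1. Exact solution y = exp(-p*t),
# so dy/dp = -t * exp(-p*t). The sensitivity s = dy/dp satisfies the
# augmented equation ds/dt = -p*s - y, with s(0) = 0.
import math

def solve_with_sensitivity(p, t_end, n_steps=100_000):
    dt = t_end / n_steps
    y, s = 1.0, 0.0
    for _ in range(n_steps):
        y_new = y + dt * (-p * y)        # Euler step for the state
        s_new = s + dt * (-p * s - y)    # Euler step for the sensitivity
        y, s = y_new, s_new
    return y, s

p, t_end = 2.0, 1.0
y, s = solve_with_sensitivity(p, t_end)
exact_y = math.exp(-p * t_end)
exact_s = -t_end * math.exp(-p * t_end)
```

In pybamm the augmented system is assembled and solved internally; this sketch only shows the mathematical structure the PR exposes for experiments.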

Type of change

Please add a line in the relevant section of CHANGELOG.md to document the change (include PR #) - note reverse order of PR #s. If necessary, also add to the list of breaking changes.

  • New feature (non-breaking change which adds functionality)

Key checklist:

  • No style issues: $ pre-commit run (or $ nox -s pre-commit) (see CONTRIBUTING.md for how to set this up to run automatically when committing locally, in just two lines of code)
  • All tests pass: $ python run-tests.py --all (or $ nox -s tests)
  • The documentation builds: $ python run-tests.py --doctest (or $ nox -s doctests)

You can run integration tests, unit tests, and doctests together at once, using $ python run-tests.py --quick (or $ nox -s quick).

Further checks:

  • Code is commented, particularly in hard-to-understand areas
  • Tests added that prove fix is effective or that feature works

codecov bot commented Sep 6, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 99.42%. Comparing base (7f263e4) to head (2e06da1).
Report is 184 commits behind head on develop.

Additional details and impacted files
@@            Coverage Diff            @@
##           develop    #4415    +/-   ##
=========================================
  Coverage    99.41%   99.42%            
=========================================
  Files          292      292            
  Lines        22223    22337   +114     
=========================================
+ Hits         22093    22208   +115     
+ Misses         130      129     -1     


@martinjrobins martinjrobins marked this pull request as ready for review September 9, 2024 12:30
@martinjrobins martinjrobins changed the title Support sensitivities for Simulations Support sensitivities for Experiments Sep 9, 2024
@brosaplanella (Member) left a comment

Looks good to me! Asking @MarcBerliner to review the PR too, as he knows a lot more about this than me.

Review comment on CHANGELOG.md (resolved)
@MarcBerliner (Member) left a comment

Thanks Martin! This is great, just a few minor comments on the code.

There are some experimental setups where a linearized sensitivity analysis might produce inaccurate results. Sequential experiment steps with a fixed end time will always work correctly. However, sequential steps with event-based termination conditions (like max voltage) may not give the correct answer, since the end time of each step is indeterminate and may change wrt the parameters. The only situation where an experiment step with an event-based termination is valid is if it's the final step in the experiment. It would be nice to add a check for this in the code.
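The missing term can be made concrete with a toy model (illustrative only, not pybamm code). Take dy/dt = -p·y with y(0) = 1, and end the step when y hits 0.5. The event time t_e(p) = ln(2)/p then depends on p, so the full derivative of y at the step end picks up an extra term ẏ(t_e)·dt_e/dp that a fixed-time sensitivity misses:

```python
# The event-time term in sensitivity analysis (hypothetical toy model).
# Model: dy/dt = -p*y, y(0) = 1; the step ends when y reaches 0.5.
import math

p = 2.0
t_e = math.log(2) / p                   # event time: y(t_e) = 0.5
partial = -t_e * math.exp(-p * t_e)     # fixed-time partial dy/dp at t = t_e
ydot = -p * math.exp(-p * t_e)          # dy/dt at the event
dte_dp = -math.log(2) / p**2            # how the event time moves with p
total = partial + ydot * dte_dp         # full derivative of y(t_e(p)) wrt p
# The event pins y(t_e(p)) at exactly 0.5 for every p, so the full
# derivative is 0 -- but the fixed-time partial alone is nonzero.
```

The gap between `partial` and `total` is exactly the term that a sensitivity computed at a fixed step end time neglects when the termination is event-based.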

Review comments on src/pybamm/simulation.py and tests/unit/test_solvers/test_casadi_solver.py (resolved)
@martinjrobins (Contributor, Author) replied:

> Sequential steps with event-based termination conditions (like max voltage) may not give the correct answer, since the end time of each step is indeterminate and may change wrt the parameters. [...] It would be nice to add a check for this in the code.

Yeah, it's a good point. For event-based termination conditions there would be an additional term in the sensitivity equations, which we are assuming is close to zero. In our particular case, where the model changes over the event, I'm not sure this term can even be defined, although it would be interesting to look into further. I'd be reluctant to make this an error; I'd rather let users still compute sensitivities for event-based terminations. As long as the additional term is small, the sensitivities can still be used within, for example, an optimisation algorithm that is robust to errors in the gradient (as many are). I'll put something in the docstrings to warn users about this, and I'll do a run-time check that emits a warning if the experiment contains an event-based termination. How does that sound?
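The run-time check described above could look something like the following sketch. This is a hypothetical illustration, not the pybamm implementation: the step representation (a list of `(description, has_event_termination)` pairs) and the function name are invented for the example.

```python
# Hypothetical sketch of the proposed run-time check: warn when any
# non-final experiment step terminates on an event rather than at a
# fixed time. The step representation here is invented for illustration.
import warnings

def warn_on_event_terminations(steps):
    """steps: list of (description, has_event_termination) pairs."""
    for description, has_event in steps[:-1]:   # the final step is exempt
        if has_event:
            warnings.warn(
                f"Step '{description}' ends on an event; the computed "
                "sensitivities neglect the event-time term and may be "
                "inaccurate.",
                UserWarning,
            )

steps = [
    ("Discharge at 1C until 3.3 V", True),   # event-based: warns
    ("Rest for 1 hour", False),              # fixed duration: no warning
    ("Charge at C/2 until 4.2 V", True),     # final step: exempt
]
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    warn_on_event_terminations(steps)
```

Exempting the final step matches the earlier observation that an event-based termination is only unproblematic when nothing downstream depends on when the step ends.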

@MarcBerliner (Member) replied:

> I'll put something in the docstrings to warn users about this, and I'll do a run-time check and emit a warning if the experiment contains an event-based termination, how does that sound?

Sounds good to me.

@martinjrobins martinjrobins merged commit 35bcb78 into develop Sep 13, 2024
26 checks passed
@martinjrobins martinjrobins deleted the i3834-sens-exp branch September 13, 2024 15:44
@itsjacobhere commented:
This feature would enable gradient computation while using pybamm.Experiment to describe a current profile, right? Has this been implemented yet? Thanks!

Development

Successfully merging this pull request may close these issues.

generalise simulation, include sensitivities
5 participants