
Pass kwargs to SciPy optimisers #236

Closed
BradyPlanden opened this issue Mar 13, 2024 · 0 comments · Fixed by #255
Labels
enhancement New feature or request

Comments

@BradyPlanden (Member)

Feature description

At the moment, the SciPy optimisers are not passed any kwargs, so hyperparameter tuning is not available to the end user. This issue is to update the SciPy implementation so that the optimiser method, options and other settings can be changed via kwargs.
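
For illustration, here is a minimal sketch of the idea, assuming a thin wrapper around scipy.optimize.minimize; the class name SciPyMinimizeSketch and its signature are hypothetical and not PyBOP's actual API.

```python
# Minimal sketch, not PyBOP's actual implementation: forward user-supplied
# kwargs (method, options, tol, ...) straight to scipy.optimize.minimize.
import numpy as np
from scipy.optimize import minimize


class SciPyMinimizeSketch:  # hypothetical name, for illustration only
    def __init__(self, cost, x0, bounds=None, **optimiser_kwargs):
        self.cost = cost
        self.x0 = np.asarray(x0)
        self.bounds = bounds
        self.optimiser_kwargs = optimiser_kwargs  # e.g. method, options, tol

    def run(self):
        # Anything the wrapper does not consume is forwarded untouched.
        return minimize(self.cost, self.x0, bounds=self.bounds, **self.optimiser_kwargs)


# The end user can now tune the optimiser without touching the wrapper:
result = SciPyMinimizeSketch(
    cost=lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2,
    x0=[0.0, 0.0],
    method="Nelder-Mead",
    options={"maxiter": 200, "xatol": 1e-6},
).run()
print(result.x, result.success)
```

Forwarding unrecognised keyword arguments verbatim is what makes the SciPy settings user-tunable.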

Motivation

End users should be able to configure the SciPy optimisers; this links to #195.

Possible implementation

No response

Additional context

No response

@BradyPlanden BradyPlanden added the enhancement New feature or request label Mar 13, 2024
@BradyPlanden BradyPlanden moved this to Todo in v24.6 Mar 21, 2024
@BradyPlanden BradyPlanden moved this from Todo to In Progress in v24.6 Apr 10, 2024
NicolaCourtier added a commit that referenced this issue May 22, 2024
* Splits Optimisation -> BaseOptimiser/Optimisation, enables two optimisation APIs, updates where required, moves _optimisation to optimisers/

* increase coverage

* Pass optimiser_kwargs through run() (see the sketch after this commit message)

* update examples

* Converts DefaultOptimiser -> Optimisation

* split Optimisation and BaseOptimiser classes, loosen standalone cost unit test

* add incorrect attr test

* fix: update changelog entry, optimsation_interface notebook, review suggestions

* fix: update notebook state

* Update assertions, optimisation object name

---------

Co-authored-by: NicolaCourtier <[email protected]>
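
The commit above mentions passing optimiser_kwargs through run(); a minimal sketch of that pattern is shown below, using hypothetical class names rather than PyBOP's actual ones.

```python
# Sketch only: kwargs given at run() override those given at construction,
# and everything ends up forwarded to scipy.optimize.minimize.
from scipy.optimize import minimize


class OptimisationSketch:  # hypothetical, for illustration
    def __init__(self, cost, x0, **optimiser_kwargs):
        self.cost = cost
        self.x0 = list(x0)
        self.optimiser_kwargs = dict(optimiser_kwargs)

    def run(self, **optimiser_kwargs):
        # Per-call kwargs take precedence over construction-time kwargs.
        settings = {**self.optimiser_kwargs, **optimiser_kwargs}
        return minimize(self.cost, self.x0, **settings)


optim = OptimisationSketch(lambda x: (x[0] - 3.0) ** 2, x0=[0.0], method="L-BFGS-B")
result = optim.run(options={"maxiter": 50})  # tweak options per call via kwargs
print(result.x)
```

Letting run-time kwargs override construction-time ones means a single optimiser object can be re-run with different SciPy settings.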
NicolaCourtier added a commit that referenced this issue May 22, 2024
* Enable passing of kwargs to SciPy

* Update checks on bounds

* Add update_options and error for Pints

* Rename opt to optim

* Remove stray comments

* Add BaseSciPyOptimiser and BasePintsOptimiser (see the sketch after this commit message)

* Align optimiser option setting

* Update notebooks with kwargs

* Update scripts with kwargs

* Update notebooks

* Align optimisers with Optimisation as base class

* Update stopping criteria in spm_NelderMead.py

Co-authored-by: Brady Planden <[email protected]>

* Update stopping criteria in spm_adam.py

Co-authored-by: Brady Planden <[email protected]>

* Update sigma0 in spm_descent.py

Co-authored-by: Brady Planden <[email protected]>

* Update GradientDescent

* Change update to set and check pints_method

* Update test_optimisation_options

* Update notebooks

* Update set learning rate

* Pop threshold

* Fix bug in model.simulate

* Update notebooks

* Update test_models.py

* Store SciPy result

* Update x0 input and add tests

* Update bounds to avoid x0 outside

* Re-initialise pints_method on certain options

* Update x0_new test

* Update test_optimisation.py

* Create initialise_method for PINTS optimisers

* Align optimisation result

* Update checks on bounds

* Apply suggestions

Co-authored-by: Brady Planden <[email protected]>

* Add standalone optimiser

* Simplify optimiser set-up and align _minimising

* Update option setting in notebooks

* Take abs of cost0

* Implement suggestions from Brady

* Update tests and base option setting

* Update test_invalid_cost

* Increase coverage

* Sort out notebook changes

* Reset scale parameter

* Move settings into arguments

* Update comments

* Update optimiser call

* Move check on jac

* Add assertions

* Add maxiter to test

* Add assertion

* Update to lambda functions

Co-authored-by: Brady Planden <[email protected]>

* Update comment

Co-authored-by: Brady Planden <[email protected]>

* Update to list comprehension

Co-authored-by: Brady Planden <[email protected]>

* Formatting

* Revert "Update to lambda functions"

This reverts commit aa73bff.

* Move minimising out of costs

* Update description

* Updates to #236 to avoid breaking change to `pybop.Optimisation` (#309)

* Splits Optimisation -> BaseOptimiser/Optimisation, enables two optimisation APIs, updates where required, moves _optimisation to optimisers/

* increase coverage

* Pass optimiser_kwargs through run()

* update examples

* Converts DefaultOptimiser -> Optimisation

* split Optimisation and BaseOptimiser classes, loosen standalone cost unit test

* add incorrect attr test

* fix: update changelog entry, optimsation_interface notebook, review suggestions

* fix: update notebook state

* Update assertions, optimisation object name

---------

Co-authored-by: NicolaCourtier <[email protected]>

* Rename method to pints_optimiser

* Rename base_optimiser to pints_base_optimiser

* Rename _optimisation to base_optimiser

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Brady Planden <[email protected]>
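
As a rough picture of the base-class split named in the commit message above ("Add BaseSciPyOptimiser and BasePintsOptimiser"), the sketch below uses hypothetical classes modelled on those names; the actual PyBOP code may differ.

```python
# Hypothetical sketch of a common base with a SciPy-specific subclass that
# keeps unrecognised kwargs and hands them to scipy.optimize.minimize.
from scipy.optimize import minimize


class BaseOptimiserSketch:
    def __init__(self, cost, x0, **optimiser_kwargs):
        self.cost = cost
        self.x0 = list(x0)
        self.unset_options = dict(optimiser_kwargs)
        self._set_options()

    def _set_options(self):
        raise NotImplementedError

    def run(self):
        raise NotImplementedError


class SciPyMinimiseSketch(BaseOptimiserSketch):
    def _set_options(self):
        # SciPy understands method, options, tol, ... so pass them on verbatim.
        self.scipy_kwargs = self.unset_options

    def run(self):
        return minimize(self.cost, self.x0, **self.scipy_kwargs)


optim = SciPyMinimiseSketch(
    lambda x: (x[0] - 2.0) ** 2, x0=[0.5], method="Nelder-Mead", options={"maxiter": 100}
)
print(optim.run().x)
```

Splitting the backends this way lets the SciPy side forward kwargs verbatim, while a PINTS-specific subclass could translate the same kwargs into its own option setters.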
@github-project-automation github-project-automation bot moved this from In Progress to Done in v24.6 May 22, 2024
BradyPlanden added a commit that referenced this issue Jun 3, 2024