Pass kwargs to SciPy optimisers #236
Labels: enhancement (New feature or request)
NicolaCourtier added a commit that referenced this issue on May 22, 2024:
* Splits Optimisation -> BaseOptimiser/Optimisation, enables two optimisation APIs, updts where required, moves _optimisation to optimisers/
* increase coverage
* Pass optimiser_kwargs though run()
* updt examples
* Converts DefaultOptimiser -> Optimisation
* split Optimisation and BaseOptimsier classes, loosen standalone cost unit test
* add incorrect attr test
* fix: updt changelog entry, optimsation_interface notebook, review suggestions
* fix: updt notebook state
* Updt assertions, optimisation object name

---------

Co-authored-by: NicolaCourtier <[email protected]>
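Based on this commit message (notably "Pass optimiser_kwargs though run()"), the intended end-user call looks roughly like the sketch below. The class name `SciPyMinimize`, the `optimiser=` argument and the return values are assumptions drawn from the commit text and may differ from the final PyBOP API.

```python
# Hypothetical usage sketch inferred from the commit message above;
# exact PyBOP class and argument names may differ.
import pybop

# `cost` is assumed to be an already-constructed PyBOP cost object.
optim = pybop.Optimisation(cost, optimiser=pybop.SciPyMinimize)

# The commit forwards optimiser kwargs through run(), so SciPy settings
# such as maxiter can be supplied at call time.
x_best, final_cost = optim.run(maxiter=100)
```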
NicolaCourtier added a commit that referenced this issue on May 22, 2024:
* Enable passing of kwargs to SciPy
* Update checks on bounds
* Add update_options and error for Pints
* Rename opt to optim
* Remove stray comments
* Add BaseSciPyOptimiser and BasePintsOptimiser
* Align optimiser option setting
* Update notebooks with kwargs
* Update scripts with kwargs
* Update notebooks
* Align optimisers with Optimisation as base class
* Update stopping criteria in spm_NelderMead.py (Co-authored-by: Brady Planden <[email protected]>)
* Update stopping criteria in spm_adam.py (Co-authored-by: Brady Planden <[email protected]>)
* Update sigma0 in spm_descent.py (Co-authored-by: Brady Planden <[email protected]>)
* Update GradientDescent
* Change update to set and check pints_method
* Update test_optimisation_options
* Update notebooks
* Update set learning rate
* Pop threshold
* Fix bug in model.simulate
* Update notebooks
* Update test_models.py
* Store SciPy result
* Update x0 input and add tests
* Update bounds to avoid x0 outside
* Re-initialise pints_method on certain options
* Update x0_new test
* Update test_optimisation.py
* Create initialise_method for PINTS optimisers
* Align optimisation result
* Update checks on bounds
* Apply suggestions (Co-authored-by: Brady Planden <[email protected]>)
* Add standalone optimiser
* Simplify optimiser set-up and align _minimising
* Update option setting in notebooks
* Take abs of cost0
* Implement suggestions from Brady
* Update tests and base option setting
* Update test_invalid_cost
* Increase coverage
* Sort out notebook changes
* Reset scale parameter
* Move settings into arguments
* Update comments
* Update optimiser call
* Move check on jac
* Add assertions
* Add maxiter to test
* Add assertion
* Update to lambda functions (Co-authored-by: Brady Planden <[email protected]>)
* Update comment (Co-authored-by: Brady Planden <[email protected]>)
* Update to list comprehension (Co-authored-by: Brady Planden <[email protected]>)
* Formatting
* Revert "Update to lambda functions" (This reverts commit aa73bff.)
* Move minimising out of costs
* Update description
* Updates to #236 to avoid breaking change to `pybop.Optimisation` (#309)
  * Splits Optimisation -> BaseOptimiser/Optimisation, enables two optimisation APIs, updts where required, moves _optimisation to optimisers/
  * increase coverage
  * Pass optimiser_kwargs though run()
  * updt examples
  * Converts DefaultOptimiser -> Optimisation
  * split Optimisation and BaseOptimsier classes, loosen standalone cost unit test
  * add incorrect attr test
  * fix: updt changelog entry, optimsation_interface notebook, review suggestions
  * fix: updt notebook state
  * Updt assertions, optimisation object name
  (Co-authored-by: NicolaCourtier <[email protected]>)
* Rename method to pints_optimiser
* Rename base_optimiser to pints_base_optimiser
* Rename _optimisation to base_optimiser

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Brady Planden <[email protected]>
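The commit above introduces a `BaseSciPyOptimiser`. A simplified, hypothetical sketch of the kwarg-forwarding pattern it describes (not the actual PyBOP implementation) is:

```python
# Simplified, hypothetical sketch of the kwarg-forwarding pattern described
# in the commit above; not the actual PyBOP BaseSciPyOptimiser code.
from scipy.optimize import minimize


class SciPyMinimizeSketch:
    """Wraps scipy.optimize.minimize and forwards user-supplied kwargs."""

    def __init__(self, cost, x0, bounds=None, **optimiser_kwargs):
        self.cost = cost
        self.x0 = x0
        self.bounds = bounds
        # Keyword arguments supplied at construction are stored for later.
        self.optimiser_kwargs = optimiser_kwargs

    def run(self, **run_kwargs):
        # kwargs given at run() time override those given at construction,
        # then everything is forwarded to SciPy unchanged.
        kwargs = {**self.optimiser_kwargs, **run_kwargs}
        result = minimize(self.cost, self.x0, bounds=self.bounds, **kwargs)
        return result.x, result.fun


# Example: tune the method and its options without touching the wrapper.
optimiser = SciPyMinimizeSketch(lambda x: (x[0] - 0.6) ** 2, x0=[0.5])
x_best, f_best = optimiser.run(method="Nelder-Mead", options={"maxiter": 50})
```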
BradyPlanden added a commit that referenced this issue on Jun 3, 2024, with the same squashed commit message as the commit above.
Feature description
At the moment, kwargs are not passed through to the SciPy optimisers, so hyperparameter tuning is not available to the end user. This issue is to update the SciPy implementation so that the optimiser method, options, etc. can be changed via kwargs.
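For context, the kwargs in question map directly onto the arguments that `scipy.optimize.minimize` already accepts. A minimal sketch at the SciPy level (the quadratic cost below is a stand-in for illustration, not PyBOP code):

```python
# Plain SciPy sketch (not PyBOP): these are the kwargs the issue asks to
# expose, i.e. the arguments scipy.optimize.minimize already accepts.
import numpy as np
from scipy.optimize import minimize


def cost(x):
    # Stand-in quadratic cost; a real cost would evaluate a battery model.
    return float(np.sum((x - np.array([0.6, 0.5])) ** 2))


x0 = np.array([0.5, 0.5])

result = minimize(
    cost,
    x0,
    method="Nelder-Mead",                      # optimiser method
    bounds=[(0.0, 1.0), (0.0, 1.0)],           # parameter bounds
    tol=1e-6,
    options={"maxiter": 200, "xatol": 1e-8},   # per-method options
)
print(result.x, result.fun)
```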
Motivation
End users should be able to configure the SciPy optimisers; this links to #195.
Possible implementation
No response
Additional context
No response