Restructure optimisers to enable passing of kwargs (#255)
* Enable passing of kwargs to SciPy

* Update checks on bounds

* Add update_options and error for Pints

* Rename opt to optim

* Remove stray comments

* Add BaseSciPyOptimiser and BasePintsOptimiser

* Align optimiser option setting

* Update notebooks with kwargs

* Update scripts with kwargs

* Update notebooks

* Align optimisers with Optimisation as base class

* Update stopping criteria in spm_NelderMead.py

Co-authored-by: Brady Planden <[email protected]>

* Update stopping criteria in spm_adam.py

Co-authored-by: Brady Planden <[email protected]>

* Update sigma0 in spm_descent.py

Co-authored-by: Brady Planden <[email protected]>

* Update GradientDescent

* Change update to set and check pints_method

* Update test_optimisation_options

* Update notebooks

* Update set learning rate

* Pop threshold

* Fix bug in model.simulate

* Update notebooks

* Update test_models.py

* Store SciPy result

* Update x0 input and add tests

* Update bounds to avoid x0 outside

* Re-initialise pints_method on certain options

* Update x0_new test

* Update test_optimisation.py

* Create initialise_method for PINTS optimisers

* Align optimisation result

* Update checks on bounds

* Apply suggestions

Co-authored-by: Brady Planden <[email protected]>

* Add standalone optimiser

* Simplify optimiser set-up and align _minimising

* Update option setting in notebooks

* Take abs of cost0

* Implement suggestions from Brady

* Update tests and base option setting

* Update test_invalid_cost

* Increase coverage

* Sort out notebook changes

* Reset scale parameter

* Move settings into arguments

* Update comments

* Update optimiser call

* Move check on jac

* Add assertions

* Add maxiter to test

* Add assertion

* Update to lambda functions

Co-authored-by: Brady Planden <[email protected]>

* Update comment

Co-authored-by: Brady Planden <[email protected]>

* Update to list comprehension

Co-authored-by: Brady Planden <[email protected]>

* Formatting

* Revert "Update to lambda functions"

This reverts commit aa73bff.

* Move minimising out of costs

* Update description

* Updates to #236 to avoid breaking change to `pybop.Optimisation` (#309)

* Splits Optimisation -> BaseOptimiser/Optimisation, enables two optimisation APIs, updates where required, moves _optimisation to optimisers/

* Increase coverage

* Pass optimiser_kwargs through run()

* Update examples

* Converts DefaultOptimiser -> Optimisation

* Split Optimisation and BaseOptimiser classes, loosen standalone cost unit test

* Add incorrect attr test

* Fix: update changelog entry, optimisation_interface notebook, review suggestions

* Fix: update notebook state

* Update assertions, optimisation object name

---------

Co-authored-by: NicolaCourtier <[email protected]>

* Rename method to pints_optimiser

* Rename base_optimiser to pints_base_optimiser

* Rename _optimisation to base_optimiser

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Brady Planden <[email protected]>
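The restructure summarised above replaces setter calls on a wrapper (`optim.set_max_iterations(60)`) with keyword arguments passed directly to per-optimiser constructors. The following is a minimal, self-contained sketch of that kwargs-forwarding pattern; all class names, option names, and defaults here are illustrative stand-ins, not PyBOP's actual implementation.

```python
class BaseOptimiser:
    """Base class holding options common to all optimisers (sketch only)."""

    def __init__(self, cost, **optimiser_kwargs):
        self.cost = cost
        # Pop shared stopping criteria from the kwargs, with defaults.
        self.max_iterations = optimiser_kwargs.pop("max_iterations", 1000)
        self.max_unchanged_iterations = optimiser_kwargs.pop(
            "max_unchanged_iterations", 15
        )
        # Anything left over is for the concrete optimiser to consume.
        self.unset_options = optimiser_kwargs


class GradientDescent(BaseOptimiser):
    """Example subclass consuming an optimiser-specific kwarg (sigma0)."""

    def __init__(self, cost, **optimiser_kwargs):
        super().__init__(cost, **optimiser_kwargs)
        # sigma0 plays the role of an initial step size.
        self.sigma0 = self.unset_options.pop("sigma0", 0.1)
        # Unrecognised options are an error rather than silently ignored.
        if self.unset_options:
            raise ValueError(f"Unrecognised options: {self.unset_options}")


optim = GradientDescent(cost=None, sigma0=0.2, max_iterations=100)
print(optim.sigma0, optim.max_iterations)
```

The design point is that each option is claimed exactly once as the kwargs flow down the class hierarchy, so a typo surfaces as an explicit error instead of a silently unused setting.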
3 people committed Jun 3, 2024
1 parent 3c48f88 commit d6ddfbe
Showing 47 changed files with 1,572 additions and 767 deletions.
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -3,6 +3,7 @@
 ## Features

 - [#6](https://github.com/pybop-team/PyBOP/issues/6) - Adds Monte Carlo functionality, with methods based on Pints' algorithms. A base class is added `BaseSampler`, in addition to `PintsBaseSampler`.
+- [#236](https://github.com/pybop-team/PyBOP/issues/236) - Restructures the optimiser classes, adds a new optimisation API through direct construction and keyword arguments, and fixes the setting of `max_iterations` and `_minimising`. Introduces `pybop.BaseOptimiser`, `pybop.BasePintsOptimiser`, and `pybop.BaseSciPyOptimiser` classes.
 - [#321](https://github.com/pybop-team/PyBOP/pull/321) - Updates Prior classes with BaseClass, adds a `problem.sample_initial_conditions` method to improve stability of SciPy.Minimize optimiser.
 - [#249](https://github.com/pybop-team/PyBOP/pull/249) - Add WeppnerHuggins model and GITT example.
 - [#304](https://github.com/pybop-team/PyBOP/pull/304) - Decreases the testing suite completion time.
@@ -1639,8 +1639,7 @@
 }
 ],
 "source": [
-"optim = pybop.Optimisation(cost, optimiser=pybop.PSO)\n",
-"optim.set_max_unchanged_iterations(iterations=55, threshold=1e-6)\n",
+"optim = pybop.PSO(cost, max_unchanged_iterations=55, threshold=1e-6)\n",
 "x, final_cost = optim.run()\n",
 "print(\"Initial parameters:\", cost.x0)\n",
 "print(\"Estimated parameters:\", x)"
3 changes: 1 addition & 2 deletions examples/notebooks/equivalent_circuit_identification.ipynb
@@ -417,8 +417,7 @@
 }
 ],
 "source": [
-"optim = pybop.Optimisation(cost, optimiser=pybop.CMAES)\n",
-"optim.set_max_iterations(300)\n",
+"optim = pybop.CMAES(cost, max_iterations=300)\n",
 "x, final_cost = optim.run()\n",
 "print(\"Initial parameters:\", cost.x0)\n",
 "print(\"Estimated parameters:\", x)"
6 changes: 3 additions & 3 deletions examples/notebooks/multi_model_identification.ipynb
@@ -3813,9 +3813,9 @@
 " print(f\"Running {model.name}\")\n",
 " problem = pybop.FittingProblem(model, parameters, dataset, init_soc=init_soc)\n",
 " cost = pybop.SumSquaredError(problem)\n",
-" optim = pybop.Optimisation(cost, optimiser=pybop.XNES, verbose=True)\n",
-" optim.set_max_iterations(60)\n",
-" optim.set_max_unchanged_iterations(15)\n",
+" optim = pybop.XNES(\n",
+" cost, verbose=True, max_iterations=60, max_unchanged_iterations=15\n",
+" )\n",
 " x, final_cost = optim.run()\n",
 " optims.append(optim)\n",
 " xs.append(x)"
29 changes: 7 additions & 22 deletions examples/notebooks/multi_optimiser_identification.ipynb
@@ -358,9 +358,7 @@
 "cost = pybop.SumSquaredError(problem)\n",
 "for optimiser in gradient_optimisers:\n",
 " print(f\"Running {optimiser.__name__}\")\n",
-" optim = pybop.Optimisation(cost, optimiser=optimiser)\n",
-" optim.set_max_unchanged_iterations(20)\n",
-" optim.set_max_iterations(60)\n",
+" optim = optimiser(cost, max_unchanged_iterations=20, max_iterations=60)\n",
 " x, _ = optim.run()\n",
 " optims.append(optim)\n",
 " xs.append(x)"
@@ -387,9 +385,7 @@
 "source": [
 "for optimiser in non_gradient_optimisers:\n",
 " print(f\"Running {optimiser.__name__}\")\n",
-" optim = pybop.Optimisation(cost, optimiser=optimiser)\n",
-" optim.set_max_unchanged_iterations(20)\n",
-" optim.set_max_iterations(60)\n",
+" optim = optimiser(cost, max_unchanged_iterations=20, max_iterations=60)\n",
 " x, _ = optim.run()\n",
 " optims.append(optim)\n",
 " xs.append(x)"
@@ -413,9 +409,7 @@
 "source": [
 "for optimiser in scipy_optimisers:\n",
 " print(f\"Running {optimiser.__name__}\")\n",
-" optim = pybop.Optimisation(cost, optimiser=optimiser)\n",
-" optim.set_max_unchanged_iterations(20)\n",
-" optim.set_max_iterations(60)\n",
+" optim = optimiser(cost, max_iterations=60)\n",
 " x, _ = optim.run()\n",
 " optims.append(optim)\n",
 " xs.append(x)"
@@ -462,14 +456,7 @@
 ],
 "source": [
 "for optim in optims:\n",
-" if isinstance(\n",
-" optim.optimiser, (pybop.SciPyMinimize, pybop.SciPyDifferentialEvolution)\n",
-" ):\n",
-" print(f\"| Optimiser: {optim.optimiser.name()} | Results: {optim.result.x} |\")\n",
-" else:\n",
-" print(\n",
-" f\"| Optimiser: {optim.optimiser.name()} | Results: {optim.optimiser.x_best()} |\"\n",
-" )"
+" print(f\"| Optimiser: {optim.name()} | Results: {optim.result.x} |\")"
 ]
 },
{
@@ -612,9 +599,7 @@
 ],
 "source": [
 "for optim, x in zip(optims, xs):\n",
-" pybop.quick_plot(\n",
-" optim.cost.problem, parameter_values=x, title=optim.optimiser.name()\n",
-" )"
+" pybop.quick_plot(optim.cost.problem, parameter_values=x, title=optim.name())"
 ]
 },
{
@@ -822,7 +807,7 @@
 ],
 "source": [
 "for optim in optims:\n",
-" pybop.plot_convergence(optim, title=optim.optimiser.name())\n",
+" pybop.plot_convergence(optim, title=optim.name())\n",
 " pybop.plot_parameters(optim)"
 ]
 },
@@ -942,7 +927,7 @@
 "# Plot the cost landscape with optimisation path and updated bounds\n",
 "bounds = np.array([[0.5, 0.8], [0.55, 0.8]])\n",
 "for optim in optims:\n",
-" pybop.plot2d(optim, bounds=bounds, steps=10, title=optim.optimiser.name())"
+" pybop.plot2d(optim, bounds=bounds, steps=10, title=optim.name())"
 ]
 },
{
Expand Down
10 changes: 4 additions & 6 deletions examples/notebooks/optimiser_calibration.ipynb
Original file line number Diff line number Diff line change
Expand Up @@ -281,8 +281,7 @@
"source": [
"problem = pybop.FittingProblem(model, parameters, dataset)\n",
"cost = pybop.SumSquaredError(problem)\n",
"optim = pybop.Optimisation(cost, optimiser=pybop.GradientDescent, sigma0=0.2)\n",
"optim.set_max_iterations(100)"
"optim = pybop.GradientDescent(cost, sigma0=0.2, max_iterations=100)"
]
},
{
@@ -456,8 +455,7 @@
 " print(sigma)\n",
 " problem = pybop.FittingProblem(model, parameters, dataset)\n",
 " cost = pybop.SumSquaredError(problem)\n",
-" optim = pybop.Optimisation(cost, optimiser=pybop.GradientDescent, sigma0=sigma)\n",
-" optim.set_max_iterations(100)\n",
+" optim = pybop.GradientDescent(cost, sigma0=sigma, max_iterations=100)\n",
 " x, final_cost = optim.run()\n",
 " optims.append(optim)\n",
 " xs.append(x)"
@@ -490,7 +488,7 @@
 "source": [
 "for optim, sigma in zip(optims, sigmas):\n",
 " print(\n",
-" f\"| Sigma: {sigma} | Num Iterations: {optim._iterations} | Best Cost: {optim.optimiser.f_best()} | Results: {optim.optimiser.x_best()} |\"\n",
+" f\"| Sigma: {sigma} | Num Iterations: {optim._iterations} | Best Cost: {optim.pints_optimiser.f_best()} | Results: {optim.pints_optimiser.x_best()} |\"\n",
 " )"
 ]
 },
@@ -723,7 +721,7 @@
 }
 ],
 "source": [
-"optim = pybop.Optimisation(cost, optimiser=pybop.GradientDescent, sigma0=0.0115)\n",
+"optim = pybop.GradientDescent(cost, sigma0=0.0115)\n",
 "x, final_cost = optim.run()\n",
 "pybop.quick_plot(problem, parameter_values=x, title=\"Optimised Comparison\");"
 ]
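The sigma0 sweep in the optimiser_calibration diff above works because sigma0 acts as an initial step size, and that value decides whether gradient descent converges at all. A self-contained sketch (plain Python on a toy quadratic, not PyBOP code) of the effect:

```python
def gradient_descent(f_grad, x0, sigma0, max_iterations=100):
    """Minimal fixed-step gradient descent; returns the final iterate."""
    x = x0
    for _ in range(max_iterations):
        x = x - sigma0 * f_grad(x)
    return x

# Minimise f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
grad = lambda x: 2.0 * (x - 3.0)

for sigma in (0.0115, 0.2, 1.1):
    x_final = gradient_descent(grad, x0=0.0, sigma0=sigma)
    print(f"| Sigma: {sigma} | Result: {x_final:.4f} |")
```

On this quadratic, a very small sigma0 converges slowly, a moderate one reaches the minimum at x = 3, and a too-large one diverges, which is why the notebook compares several values before settling on one.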