using pybamm variables to implement cost functions #513
This is an interesting proposal - it seems to offer a number of improvements. A couple of questions come to mind:
Overall this sounds like a sensible direction. I'm keen to make sure we can still use the Jax-based solvers, but I don't think this would interfere with that. Do you think a small toy example could be spun up for us to discuss around?
By "time-series reference" do you mean the data? My first thought would be that it would be held by the `pybamm.DiscreteSumOverTime` node in the expression tree and would be converted into a casadi matrix to be embedded within the final casadi function (in much the same way as we treat discretisation matrices in the expressions). Yeah, this would mean interacting much more with the pybamm model class, so there would be more error handling to do as a result.
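A minimal numpy sketch of that embedding idea (hypothetical names, not an existing pybamm or casadi API): the reference data is captured as a constant inside the compiled cost function, analogous to embedding a casadi matrix in the final expression the way discretisation matrices are embedded.

```python
import numpy as np

def compile_cost(reference):
    # the reference data is captured as a constant inside the returned
    # function, analogous to embedding a casadi DM matrix in the
    # compiled expression (as is done with discretisation matrices)
    reference = np.asarray(reference)

    def cost(model_output):
        # sum of squared residuals at the discrete solution times
        return np.sum((model_output - reference) ** 2)

    return cost

cost = compile_cost([3.7, 3.6, 3.5])
print(cost(np.array([3.7, 3.6, 3.5])))  # → 0.0
```

The caller never passes the data again; it travels with the compiled function, which is the behaviour described above for the `pybamm.DiscreteSumOverTime` node.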
I don't think so; you can add any number of variables to a pybamm model. Then the
The transformation classes transform the input parameters from the optimizer before you simulate the model right? If so then I don't think this would affect the transformation classes at all (this would be downstream of the transformation)
Not difficult, just slow. pybamm only has forward sensitivity analysis, so we'd still need to add backwards adjoint features, and then Hessian calculation. Sundials has all this, however, so it's just a matter of exposing the functionality. It's slow going, but it's on the roadmap.
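For context, here is a minimal sketch of what forward sensitivity analysis computes, on a toy ODE with explicit Euler (nothing pybamm- or Sundials-specific; all names are illustrative):

```python
# Toy ODE dy/dt = -p*y, y(0) = 1.  Forward sensitivity analysis augments
# the state with the sensitivity S = dy/dp, which obeys the tangent ODE
#   dS/dt = (df/dy)*S + df/dp = -p*S - y,  S(0) = 0,
# and integrates both equations together.
def solve_with_sensitivity(p, t_end=1.0, n=10_000):
    dt = t_end / n
    y, S = 1.0, 0.0
    for _ in range(n):
        dy = -p * y       # f(y, p)
        dS = -p * S - y   # tangent equation
        y += dt * dy
        S += dt * dS
    return y, S

# analytic solution for comparison: y(t) = exp(-p*t), dy/dp = -t*exp(-p*t)
y, S = solve_with_sensitivity(0.5)
print(y, S)
```

The cost of this approach scales with the number of parameters (one tangent system per parameter), which is why adjoint methods become attractive for gradient and Hessian calculations with many parameters.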
Additional variables are output variables, right? I'm talking about input parameters here. So in the example @NicolaCourtier mentioned above, this would be the mass of each component (I believe).
Yeah, I think this is a good idea. I'll put something together.
Thanks for the detailed response, that helps clarify.
Yep, the reference provided by the user. At the moment we construct this within the
This sounds reasonable. In this situation the meta-cost classes would still be child classes of
Yup, duh. Don't mind me over here.
Right, ok, so the idea here is that these parameters are exposed at the right level (within the expression tree), so they don't need to be funnelled up to the cost classes in their current definition. That's an understandable improvement.
Cool, I'm looking forward to discussing this in more depth! One area that I think we will need to work on is ensuring the testing suite on the PyBaMM side covers this functionality (assuming they are happy with these additions), as I suspect small changes in the solver / expression tree codebase could break functionality on our side without affecting the majority of PyBaMM users. Just to reiterate, overall this sounds promising though.
FYI I put together a toy example just in pybamm. The general idea is that cost functions just add new variables (and optionally parameters) to the original model:

```python
import pybamm

model = pybamm.lithium_ion.SPMe()
parameters = ["Positive electrode thickness [m]", "Positive particle radius [m]", "Cell mass [kg]"]
experiment = pybamm.Experiment(
    ["Discharge at 1C until 2.5 V (5 seconds period)"],
)

# add cell mass to the model
parameter_set = model.default_parameter_values
parameter_set.update({"Cell mass [kg]": 0.1}, check_already_exists=False)

# add gravimetrical energy density cost function to the model
cell_mass = pybamm.Parameter("Cell mass [kg]")
voltage = model.variables["Voltage [V]"]
current = model.variables["Current [A]"]
model.variables["Gravimetrical energy density [Wh kg-1]"] = pybamm.ExplicitTimeIntegral(
    current * voltage / 3600 / cell_mass, pybamm.Scalar(0.0)
)

# define cost function
def cost_function(args):
    new_parameters = parameter_set.copy()
    for i, parameter in enumerate(parameters):
        new_parameters[parameter] = args[i]
    sim = pybamm.Simulation(model, parameter_values=new_parameters, experiment=experiment)
    sol = sim.solve()
    return sol["Gravimetrical energy density [Wh kg-1]"](sol.t[-1])

print("eval", cost_function([8.27568156e-05, 2.07975193e-06, 1.0]))
```
I note that there is also a proposal to use jax functions to calculate cost functions (#481).
Feature description
PyBaMM uses model variables to do post-processing calculations on solved models, e.g. calculating the voltage from the internal state vector returned by the solver. This is quite similar in concept to the cost functions implemented in PyBop: "take the output of a simulation and calculate some function from this", so I would suggest that we can simply use PyBaMM variables to implement each cost function.
Motivation
This was inspired by this problem suggested by @NicolaCourtier:
The advantages of using PyBaMM's variables are:
- using the `output_variables` argument, the calculation of the cost function can be moved into the solver itself
- the result can be accessed like any other variable, e.g. `sol["my_cost_function"].data`
Possible implementation
This would require some additions on the pybamm side. The `pybamm.ExplicitTimeIntegral` class allows us to implement some of the design cost functions, but you would need something like `pybamm.DiscreteSumOverTime` to implement the fitting cost functions. You would also need to support the calculation of sensitivities for these symbol classes.
Additional context
No response
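To make the distinction between the two symbol classes concrete, here is a numpy sketch of what each would evaluate to in post-processing. The `discrete_sum_over_time` semantics are an assumption based on this proposal (`pybamm.DiscreteSumOverTime` does not exist yet), and the variable names are illustrative:

```python
import numpy as np

def explicit_time_integral(integrand, t, initial=0.0):
    # trapezoidal integral over the solution times: a stand-in for what
    # pybamm.ExplicitTimeIntegral evaluates to (design costs, e.g. energy)
    return initial + np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(t))

def discrete_sum_over_time(variable, data):
    # sum of squared residuals at the data times: the assumed semantics
    # of the proposed DiscreteSumOverTime node (fitting costs)
    return np.sum((variable - data) ** 2)

t = np.linspace(0.0, 10.0, 101)           # solution times [s]
power = np.full_like(t, 3.6)              # stand-in for current * voltage [W]
measured = np.full_like(t, 3.5)           # stand-in for voltage data [V]
model_voltage = np.full_like(t, 3.5)      # stand-in for simulated voltage [V]

energy_wh = explicit_time_integral(power, t) / 3600  # 36 J -> 0.01 Wh
sse = discrete_sum_over_time(model_voltage, measured)
print(energy_wh, sse)
```

The first operation reduces a continuous integrand over the whole solve, as in the gravimetrical energy density example above; the second compares the solution against reference data at discrete times, which is the shape of the fitting costs.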