
Still problems with the docu #641

Closed
MaAl13 opened this issue Jun 13, 2022 · 7 comments

Comments


MaAl13 commented Jun 13, 2022

Hello again,

This question is related to one I posted earlier. Since the old topic was closed but the error has changed, and I am not sure anyone will see it there, I created this new one. It also seems I am not the only one with this problem. Initially I was running the code from https://sensitivity.sciml.ai/dev/ode_fitting/optimization_ode/. That produced an error saying AutoZygote was not found. After including DiffEqSensitivity and Zygote, the error changes as you can see below. I know Chris probably has this on his radar and is working on it, but I wanted to post it here anyway.

using DifferentialEquations, Optimization, OptimizationPolyalgorithms, OptimizationOptimJL, Plots
using Zygote
using DiffEqSensitivity
function lotka_volterra!(du, u, p, t)
  x, y = u
  α, β, δ, γ = p
  du[1] = dx = α*x - β*x*y
  du[2] = dy = -δ*y + γ*x*y
end

# Initial condition
u0 = [1.0, 1.0]

# Simulation interval and intermediary points
tspan = (0.0, 10.0)
tsteps = 0.0:0.1:10.0

# LV equation parameter. p = [α, β, δ, γ]
p = [1.5, 1.0, 3.0, 1.0]

# Setup the ODE problem, then solve
prob = ODEProblem(lotka_volterra!, u0, tspan, p)
sol = solve(prob, Tsit5())

# Plot the solution
using Plots
plot(sol)
savefig("LV_ode.png")

function loss(p)
  sol = solve(prob, Tsit5(), p=p, saveat = tsteps)
  loss = sum(abs2, sol.-1)
  return loss, sol
end

callback = function (p, l, pred)
  display(l)
  plt = plot(pred, ylim = (0, 6))
  display(plt)
  # Tell Optimization.solve to not halt the optimization. If return true, then
  # optimization stops.
  return false
end

adtype = Optimization.AutoZygote()
optf = Optimization.OptimizationFunction((x,p)->loss(x), adtype)
optprob = Optimization.OptimizationProblem(optf, p)

result_ode = Optimization.solve(optprob, PolyOpt(),
                                    cb = callback,
                                    maxiters = 100)
# result_ode = Optimization.solve(optprob, ADAM(0.1), cb = callback)

ERROR: LoadError: MethodError: no method matching Optim.Options(; extended_trace=true, cb=var"#1#2"(), callback=OptimizationOptimJL.var"#_cb#11"{OptimizationOptimJL.var"#10#18", BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Flat}, Base.Iterators.Cycle{Tuple{Optimization.NullData}}}(OptimizationOptimJL.var"#10#18"(), BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Flat}(LineSearches.InitialStatic{Float64}
alpha: Float64 1.0
scaled: Bool false
, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}
delta: Float64 0.1
sigma: Float64 0.9
alphamax: Float64 Inf
rho: Float64 5.0
epsilon: Float64 1.0e-6
gamma: Float64 0.66
linesearchmax: Int64 50
psi3: Float64 0.1
display: Int64 0
mayterminate: Base.RefValue{Bool}
, nothing, 0.01, Flat()), Base.Iterators.Cycle{Tuple{Optimization.NullData}}((Optimization.NullData(),)), Core.Box(#undef), Core.Box(Optimization.NullData()), Core.Box(2)), iterations=100)
Closest candidates are:
Optim.Options(; x_tol, f_tol, g_tol, x_abstol, x_reltol, f_abstol, f_reltol, g_abstol, g_reltol, outer_x_tol, outer_f_tol, outer_g_tol, outer_x_abstol, outer_x_reltol, outer_f_abstol, outer_f_reltol, outer_g_abstol, outer_g_reltol, f_calls_limit, g_calls_limit, h_calls_limit, allow_f_increases, allow_outer_f_increases, successive_f_tol, iterations, outer_iterations, store_trace, trace_simplex, show_trace, extended_trace, show_every, callback, time_limit) at ~/.julia/packages/Optim/6Lpjy/src/types.jl:73 got unsupported keyword argument "cb"
Optim.Options(::T, ::T, ::T, ::T, ::T, ::T, ::T, ::T, ::T, ::T, ::T, ::T, ::Int64, ::Int64, ::Int64, ::Bool, ::Bool, ::Int64, ::Int64, ::Int64, ::Bool, ::Bool, ::Bool, ::Bool, ::Int64, ::TCallback, ::Float64) where {T, TCallback} at ~/.julia/packages/Optim/6Lpjy/src/types.jl:44 got unsupported keyword arguments "extended_trace", "cb", "callback", "iterations"
Stacktrace:
[1] kwerr(kw::NamedTuple{(:extended_trace, :cb, :callback, :iterations), Tuple{Bool, var"#1#2", OptimizationOptimJL.var"#_cb#11"{OptimizationOptimJL.var"#10#18", BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Flat}, Base.Iterators.Cycle{Tuple{Optimization.NullData}}}, Int64}}, args::Type)
@ Base ./error.jl:163
[2] __map_optimizer_args(prob::OptimizationProblem{true, OptimizationFunction{true, Optimization.AutoZygote, var"#3#4", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, opt::BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Flat}; callback::Function, maxiters::Int64, maxtime::Nothing, abstol::Nothing, reltol::Nothing, kwargs::Base.Pairs{Symbol, var"#1#2", Tuple{Symbol}, NamedTuple{(:cb,), Tuple{var"#1#2"}}})
@ OptimizationOptimJL ~/.julia/packages/OptimizationOptimJL/fdrJg/src/OptimizationOptimJL.jl:37
[3] ___solve(prob::OptimizationProblem{true, OptimizationFunction{true, Optimization.AutoZygote, var"#3#4", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, opt::BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Flat}, data::Base.Iterators.Cycle{Tuple{Optimization.NullData}}; callback::Function, maxiters::Int64, maxtime::Nothing, abstol::Nothing, reltol::Nothing, progress::Bool, kwargs::Base.Pairs{Symbol, var"#1#2", Tuple{Symbol}, NamedTuple{(:cb,), Tuple{var"#1#2"}}})
@ OptimizationOptimJL ~/.julia/packages/OptimizationOptimJL/fdrJg/src/OptimizationOptimJL.jl:140
[4] __solve(prob::OptimizationProblem{true, OptimizationFunction{true, Optimization.AutoZygote, var"#3#4", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, opt::BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Flat}, data::Base.Iterators.Cycle{Tuple{Optimization.NullData}}; kwargs::Base.Pairs{Symbol, Any, Tuple{Symbol, Symbol}, NamedTuple{(:maxiters, :cb), Tuple{Int64, var"#1#2"}}})
@ OptimizationOptimJL ~/.julia/packages/OptimizationOptimJL/fdrJg/src/OptimizationOptimJL.jl:56
[5] solve(::OptimizationProblem{true, OptimizationFunction{true, Optimization.AutoZygote, var"#3#4", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Flat}; kwargs::Base.Pairs{Symbol, Any, Tuple{Symbol, Symbol}, NamedTuple{(:maxiters, :cb), Tuple{Int64, var"#1#2"}}})
@ SciMLBase ~/.julia/packages/SciMLBase/UEAKN/src/solve.jl:56
[6] __solve(::OptimizationProblem{true, OptimizationFunction{true, Optimization.AutoZygote, var"#3#4", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::PolyOpt; maxiters::Int64, kwargs::Base.Pairs{Symbol, var"#1#2", Tuple{Symbol}, NamedTuple{(:cb,), Tuple{var"#1#2"}}})
@ OptimizationPolyalgorithms ~/.julia/packages/OptimizationPolyalgorithms/5gDHf/src/OptimizationPolyalgorithms.jl:29
[7] #solve#492
@ ~/.julia/packages/SciMLBase/UEAKN/src/solve.jl:56 [inlined]
[8] top-level scope
@ ~/Documents/julia_GPU_solver/Optimization_Test.jl:49
in expression starting at /Users/malmansto/Documents/julia_GPU_solver/Optimization_Test.jl:49

@ChrisRackauckas (Member)

Yes sorry about that. I forgot to enable strict doctesting so the doctest "passed" while the example failed 😅 . Should be fixed in #622, along with being appropriately tested.

@ChrisRackauckas (Member)

Fixed.


MaAl13 commented Jun 20, 2022

Hello Chris, thanks for taking a look at the problem. Unfortunately, I still get the same error when running the code. I am using Julia 1.7.3 with the following packages:
[fbb218c0] BSON v0.3.5
[052768ef] CUDA v3.11.0
[5ae59095] Colors v0.12.8
[a93c6f00] DataFrames v1.3.4
[aae7a2af] DiffEqFlux v1.49.1
[41bf760c] DiffEqSensitivity v6.79.0
[0c46a032] DifferentialEquations v7.1.0
[31c24e10] Distributions v0.25.62
[587475ba] Flux v0.13.3
[f67ccb44] HDF5 v0.16.10
[872c559c] NNlib v0.8.7
[7f7a1694] Optimization v3.6.1
[36348300] OptimizationOptimJL v0.1.1
[500b13db] OptimizationPolyalgorithms v0.1.0
[91a5bcdd] Plots v1.30.0
[e6cf234a] RandomNumbers v1.5.3
[2913bbd2] StatsBase v0.33.16
[e88e6eb3] Zygote v0.6.40
[37e2e46d] LinearAlgebra
[9a3f8284] Random
[10745b16] Statistics

Maybe it is trivial, but can you tell me what I am doing wrong?

@ChrisRackauckas (Member)

You ran this one verbatim?

https://sensitivity.sciml.ai/dev/ode_fitting/optimization_ode/

@ChrisRackauckas ChrisRackauckas transferred this issue from SciML/Optimization.jl Jun 20, 2022

MaAl13 commented Jun 20, 2022

Hi Chris,
I assumed it was the same code I posted in the problem, but indeed, after changing cb to callback (as it now is in the docs), it runs! One thing is still missing from the docs:
using DiffEqSensitivity
using Zygote
Otherwise it throws an error!

Thank you so much for the support!!!
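
For reference, the resolution described above amounts to the following change (a minimal sketch, assuming the optprob and callback objects defined in the original post; the keyword argument is callback, not cb):

```julia
# Sketch of the fix, assuming `optprob` and `callback` from the original post.
# The AutoZygote adtype needs DiffEqSensitivity and Zygote loaded, and
# Optimization.solve takes the keyword `callback`, not `cb`.
using DiffEqSensitivity, Zygote

result_ode = Optimization.solve(optprob, PolyOpt(),
                                callback = callback,  # was: cb = callback
                                maxiters = 100)
```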

@ChrisRackauckas (Member)

Maybe just for the docu, it is missing:

Thanks for noticing that. Interesting that the doctests didn't catch it; I'll have to find out how to make the tests catch that.


FabiJa commented Jun 20, 2022

Maybe just for the docu, it is missing:

Thanks for noticing that. Interesting that the doctests didn't catch it; I'll have to find out how to make the tests catch that.

For me it seems to work without the additional import, but I have:
[41bf760c] DiffEqSensitivity v6.79.0 https://github.com/SciML/DiffEqSensitivity.jl.git#master
[7f7a1694] Optimization v3.6.1 https://github.com/SciML/Optimization.jl.git#master
