Error when running Code in documentation #286
Comments
Missing a …
Thanks for the fast response!
ERROR: LoadError: MethodError: no method matching Optim.Options(; extended_trace=true, cb=var"#1#2"(), callback=OptimizationOptimJL.var"#_cb#11"{OptimizationOptimJL.var"#10#18", BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Flat}, Base.Iterators.Cycle{Tuple{Optimization.NullData}}}(OptimizationOptimJL.var"#10#18"(), BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Flat}(LineSearches.InitialStatic{Float64} …
I have the same problem.
Sorry about that. I needed to enable strict mode so it would properly throw errors when it's out of date. I have a PR open to do this which should be merged later today: SciML/SciMLSensitivity.jl#622
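"Strict mode" here refers to failing the documentation build when an example block errors, so stale snippets get caught in CI instead of being published. A minimal sketch of what that looks like in a Documenter.jl make.jl (assuming the Documenter 0.27-era strict keyword; this is not the contents of the linked PR):

using Documenter, SciMLSensitivity

makedocs(
    sitename = "SciMLSensitivity.jl",
    modules = [SciMLSensitivity],
    strict = true,  # turn doc-build errors (failed example blocks, doctests) into hard failures
)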
Let me know if there's anything else. Should be all good now.
Hi, I tried to run the optimizer now for the yeast glycolysis model. Without the PositiveDomain callback I run into a domain error. With the PositiveDomain callback I get the following error for the jvp. Can you tell me how to fix this?
ERROR: LoadError: TypeError: in typeassert, expected Float64, got a value of type ForwardDiff.Dual{Nothing, Float64, 7}
This seems very unrelated. Can you open a new issue on SciMLSensitivity.jl, and narrow this down to just an issue on the derivative calculation? I don't think you need Optimization.jl at all. If I had to guess at the issue, it might be that PositiveDomain needs caches that can go dual, but narrowing down the issue is what would help solve it. BTW, with the PositiveDomain(), are you sure this is a good use of it? Have you double-checked that the parameters should give a stable solution for the outer sides of the optimization? At what parameters is it necessary?
The PositiveDomain() was just a try to catch the following error. If I don't use the PositiveDomain() call and comment out the forced positivity of the ODE, I get:
ERROR: LoadError: DomainError with -0.028633453239182774:
The optimization takes a few steps before this error appears. Essentially the gradient is computed correctly for some parameters, until a parameter vector comes along that produces a complex solution trajectory. Can the Optimization package somehow handle this by penalizing complex trajectories and ignoring the error message of the ODE solver?
Please open an issue in SciMLSensitivity and I'll get to it, but putting it in here will only make it get lost.
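As a general note on the question above: the usual pattern for "punishing" parameter vectors whose trajectories blow up is to guard the loss function itself rather than rely on a callback. A rough sketch, assuming a loss built around an ODE solve as in the tutorial (prob, tsteps, and data here are placeholders, not names from this thread):

function loss(p)
    sol = solve(prob, Tsit5(), p = p, saveat = tsteps,
                # reject steps that leave the physical domain instead of erroring inside f
                isoutofdomain = (u, p, t) -> any(x -> x < 0, u))
    # A diverging or otherwise failed solve exits early with a failure retcode; hand the
    # optimizer a huge loss for that parameter vector instead of letting the error propagate.
    if sol.retcode != :Success
        return Inf
    end
    return sum(abs2, sol .- data)
end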
Hello, I am trying to run the example in https://sensitivity.sciml.ai/dev/ode_fitting/optimization_ode/; the code is the following:
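The code listing itself was not captured here. As a rough reconstruction, the tutorial at that link fits the Lotka-Volterra parameters by differentiating through an ODE solve; the sketch below approximates the documented example at the time, not the poster's exact file, so line numbers will not match the stack traces that follow:

using DifferentialEquations, Optimization, OptimizationPolyalgorithms, OptimizationOptimJL
using SciMLSensitivity, Zygote

# Lotka-Volterra system with parameters p = [α, β, δ, γ]
function lotka_volterra!(du, u, p, t)
    x, y = u
    α, β, δ, γ = p
    du[1] = α * x - β * x * y
    du[2] = -δ * y + γ * x * y
end

u0 = [1.0, 1.0]
tspan = (0.0, 10.0)
tsteps = 0.0:0.1:10.0
p = [1.5, 1.0, 3.0, 1.0]
prob = ODEProblem(lotka_volterra!, u0, tspan, p)

# Loss: solve with the candidate parameters and compare against the target value 1
function loss(p)
    sol = solve(prob, Tsit5(), p = p, saveat = tsteps)
    return sum(abs2, sol .- 1), sol
end

callback = function (p, l, pred)
    display(l)
    return false  # returning true halts the optimization
end

adtype = Optimization.AutoZygote()  # the line that triggers the UndefVarError below
optf = Optimization.OptimizationFunction((x, p) -> loss(x), adtype)
optprob = Optimization.OptimizationProblem(optf, p)

# The docs at the time passed the callback via `cb`, which is what produces the
# MethodError about Optim.Options further down; newer docs use `callback = callback`.
result_ode = Optimization.solve(optprob, PolyOpt(), cb = callback, maxiters = 100)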
This produces the error:
ERROR: LoadError: UndefVarError: AutoZygote not defined
Stacktrace:
[1] getproperty(x::Module, f::Symbol)
@ Base ./Base.jl:35
[2] top-level scope
@ ~/Documents/julia_GPU_solver/Optimization_Test.jl:44
in expression starting at /Users/malmansto/Documents/julia_GPU_solver/Optimization_Test.jl:44
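One plausible cause of that UndefVarError (an assumption on my part, not something confirmed in this thread) is that the environment resolved an older Optimization.jl than the dev docs assume, since the AutoZygote constructor is only exported by newer releases. A quick check:

using Pkg
Pkg.status("Optimization")   # compare the resolved version with what the dev docs target
Pkg.update("Optimization")   # bring Optimization.jl (and its AD glue packages) up to date
using Optimization
Optimization.AutoZygote()    # should construct once a recent enough version is installed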
Changing AutoZygote() to AutoForwardDiff() results in
ERROR: LoadError: MethodError: no method matching Optim.Options(; extended_trace=true, cb=var"#1#2"(), callback=OptimizationOptimJL.var"#_cb#11"{OptimizationOptimJL.var"#10#18", BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Flat}, Base.Iterators.Cycle{Tuple{Optimization.NullData}}}(OptimizationOptimJL.var"#10#18"(), BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Flat}(LineSearches.InitialStatic{Float64}
alpha: Float64 1.0
scaled: Bool false
, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}
delta: Float64 0.1
sigma: Float64 0.9
alphamax: Float64 Inf
rho: Float64 5.0
epsilon: Float64 1.0e-6
gamma: Float64 0.66
linesearchmax: Int64 50
psi3: Float64 0.1
display: Int64 0
mayterminate: Base.RefValue{Bool}
, nothing, 0.01, Flat()), Base.Iterators.Cycle{Tuple{Optimization.NullData}}((Optimization.NullData(),)), Core.Box(#undef), Core.Box(Optimization.NullData()), Core.Box(2)), iterations=100)
Closest candidates are:
Optim.Options(; x_tol, f_tol, g_tol, x_abstol, x_reltol, f_abstol, f_reltol, g_abstol, g_reltol, outer_x_tol, outer_f_tol, outer_g_tol, outer_x_abstol, outer_x_reltol, outer_f_abstol, outer_f_reltol, outer_g_abstol, outer_g_reltol, f_calls_limit, g_calls_limit, h_calls_limit, allow_f_increases, allow_outer_f_increases, successive_f_tol, iterations, outer_iterations, store_trace, trace_simplex, show_trace, extended_trace, show_every, callback, time_limit) at ~/.julia/packages/Optim/6Lpjy/src/types.jl:73 got unsupported keyword argument "cb"
Optim.Options(::T, ::T, ::T, ::T, ::T, ::T, ::T, ::T, ::T, ::T, ::T, ::T, ::Int64, ::Int64, ::Int64, ::Bool, ::Bool, ::Int64, ::Int64, ::Int64, ::Bool, ::Bool, ::Bool, ::Bool, ::Int64, ::TCallback, ::Float64) where {T, TCallback} at ~/.julia/packages/Optim/6Lpjy/src/types.jl:44 got unsupported keyword arguments "extended_trace", "cb", "callback", "iterations"
Stacktrace:
[1] kwerr(kw::NamedTuple{(:extended_trace, :cb, :callback, :iterations), Tuple{Bool, var"#1#2", OptimizationOptimJL.var"#_cb#11"{OptimizationOptimJL.var"#10#18", BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Flat}, Base.Iterators.Cycle{Tuple{Optimization.NullData}}}, Int64}}, args::Type)
@ Base ./error.jl:163
[2] __map_optimizer_args(prob::OptimizationProblem{true, OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, var"#3#4", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, opt::BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Flat}; callback::Function, maxiters::Int64, maxtime::Nothing, abstol::Nothing, reltol::Nothing, kwargs::Base.Pairs{Symbol, var"#1#2", Tuple{Symbol}, NamedTuple{(:cb,), Tuple{var"#1#2"}}})
@ OptimizationOptimJL ~/.julia/packages/OptimizationOptimJL/fdrJg/src/OptimizationOptimJL.jl:37
[3] ___solve(prob::OptimizationProblem{true, OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, var"#3#4", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, opt::BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Flat}, data::Base.Iterators.Cycle{Tuple{Optimization.NullData}}; callback::Function, maxiters::Int64, maxtime::Nothing, abstol::Nothing, reltol::Nothing, progress::Bool, kwargs::Base.Pairs{Symbol, var"#1#2", Tuple{Symbol}, NamedTuple{(:cb,), Tuple{var"#1#2"}}})
@ OptimizationOptimJL ~/.julia/packages/OptimizationOptimJL/fdrJg/src/OptimizationOptimJL.jl:140
[4] __solve(prob::OptimizationProblem{true, OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, var"#3#4", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, opt::BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Flat}, data::Base.Iterators.Cycle{Tuple{Optimization.NullData}}; kwargs::Base.Pairs{Symbol, Any, Tuple{Symbol, Symbol}, NamedTuple{(:maxiters, :cb), Tuple{Int64, var"#1#2"}}})
@ OptimizationOptimJL ~/.julia/packages/OptimizationOptimJL/fdrJg/src/OptimizationOptimJL.jl:56
[5] solve(::OptimizationProblem{true, OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, var"#3#4", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::BFGS{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Nothing, Float64, Flat}; kwargs::Base.Pairs{Symbol, Any, Tuple{Symbol, Symbol}, NamedTuple{(:maxiters, :cb), Tuple{Int64, var"#1#2"}}})
@ SciMLBase ~/.julia/packages/SciMLBase/UEAKN/src/solve.jl:56
[6] __solve(::OptimizationProblem{true, OptimizationFunction{true, Optimization.AutoForwardDiff{nothing}, var"#3#4", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::PolyOpt; maxiters::Int64, kwargs::Base.Pairs{Symbol, var"#1#2", Tuple{Symbol}, NamedTuple{(:cb,), Tuple{var"#1#2"}}})
@ OptimizationPolyalgorithms ~/.julia/packages/OptimizationPolyalgorithms/5gDHf/src/OptimizationPolyalgorithms.jl:29
[7] #solve#492
@ ~/.julia/packages/SciMLBase/UEAKN/src/solve.jl:56 [inlined]
[8] top-level scope
@ ~/Documents/julia_GPU_solver/Optimization_Test.jl:48
in expression starting at /Users/malmansto/Documents/julia_GPU_solver/Optimization_Test.jl:48
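The "Closest candidates are" output above contains the actual hint: Optim.Options accepts a callback keyword but no cb, and the unrecognized cb passed to solve is forwarded all the way down to Optim.Options. With the docs brought back in sync, the fix is simply renaming the keyword in the solve call (a sketch using the same names as above):

# old docs / failing call: Optimization.solve(optprob, PolyOpt(), cb = callback, maxiters = 100)
result_ode = Optimization.solve(optprob, PolyOpt(), callback = callback, maxiters = 100)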