DimensionMismatch when adding a function to the neural network #526

Open

xk-y opened this issue Apr 12, 2021 · 18 comments

@xk-y

xk-y commented Apr 12, 2021

Hi, I got an error when trying to add a function to the chain, and was wondering if there is any way to solve it. The code is as follows:

using DiffEqFlux, OrdinaryDiffEq, Flux
#Prepare true data
u0 = [2.;0.]
datasize = 30
tspan = (0.0f0,1.5f0)
function trueODEfunc(du,u,p,t)
    true_A = [-0.1 2.0; -2.0 -0.1]
    du .= ((u.^3)'true_A)'
end
t = range(tspan[1],tspan[2],length=datasize)
prob = ODEProblem(trueODEfunc,u0,tspan)
ode_data = Array(solve(prob,Tsit5(),saveat=t))

#build neural ode model
struct S
    a
    b
end
function (s::S)(x) 
    transpose(transpose(s.a) * s.b) * x
end
S(m::Integer,n::Integer, init=Flux.glorot_uniform) =  S(init(n,m),ones(n, n))
dudt = Chain(
             Dense(2,30,tanh),
             Dense(30,10),
             S(10,2)
             )
n_ode = NeuralODE(dudt,tspan,Tsit5(),saveat=t)

#prepare for the training
ps = Flux.params(dudt)
function predict_n_ode()
  Array(n_ode(u0))
end
loss_n_ode() = Flux.mse(ode_data,predict_n_ode())
loss_n_ode()#I got 2.5051380830609693
data = Iterators.repeated((), 2)
opt = ADAM(0.1)

#train the model
Flux.train!(loss_n_ode, ps, data, opt)
DimensionMismatch("array could not be broadcast to match destination")

Stacktrace:
 [1] check_broadcast_shape at .\broadcast.jl:520 [inlined]
 [2] check_broadcast_axes at .\broadcast.jl:523 [inlined]
 [3] instantiate at .\broadcast.jl:269 [inlined]
 [4] materialize! at .\broadcast.jl:848 [inlined]
 [5] materialize!(::SubArray{Float64,1,Array{Float64,1},Tuple{UnitRange{Int64}},true}, ::Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{1},Nothing,typeof(identity),Tuple{Array{Float64,1}}}) at .\broadcast.jl:845
 [6] _vecjacobian!(::SubArray{Float64,1,Array{Float64,1},Tuple{UnitRange{Int64}},true}, ::Array{Float64,1}, ::SubArray{Float64,1,Array{Float64,1},Tuple{UnitRange{Int64}},true}, ::Array{Float32,1}, ::Float32, ::DiffEqSensitivity.ODEInterpolatingAdjointSensitivityFunction{DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},DiffEqBase.DEStats},Nothing,ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(iden
tity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing}}, ::DiffEqSensitivity.ZygoteVJP, ::SubArray{Float64,1,Array{Float64,1},Tuple{UnitRange{Int64}},true}, ::Nothing) at C:\Users\Administrator\.julia\packages\DiffEqSensitivity\agdxc\src\derivative_wrappers.jl:295
 [7] _vecjacobian! at C:\Users\Administrator\.julia\packages\DiffEqSensitivity\agdxc\src\derivative_wrappers.jl:194 [inlined]
 [8] #vecjacobian!#20 at C:\Users\Administrator\.julia\packages\DiffEqSensitivity\agdxc\src\derivative_wrappers.jl:147 [inlined]
 [9] (::DiffEqSensitivity.ODEInterpolatingAdjointSensitivityFunction{DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},DiffEqBase.DEStats},Nothing,ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}}
,S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing}})(::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float32,1}, ::Float32) at C:\Users\Administrator\.julia\packages\DiffEqSensitivity\agdxc\src\interpolating_adjoint.jl:90
 [10] ODEFunction at C:\Users\Administrator\.julia\packages\SciMLBase\10xNC\src\scimlfunctions.jl:334 [inlined]
 [11] initialize!(::OrdinaryDiffEq.ODEIntegrator{Tsit5,true,Array{Float64,1},Nothing,Float32,Array{Float32,1},Float32,Float64,Float32,Array{Array{Float64,1},1},ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},true,Array{Float32,1},ODEFunction{true,DiffEqSensitivity.ODEInterpolatingAdjointSensitivityFunction{DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},DiffEqBase.DEStats},Nothing,ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.S
tandardODEProblem},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing}},LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Symbol,CallbackSet{Tuple{},Tuple{DiscreteCallback{DiffEqCallbacks.var"#61#64"{Array{Float32,1}},DiffEqCallbacks.var"#62#65"{DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},DiffEqCallbacks.var"#63#66"{typeof(DiffEqBase.INITIALIZE_DEFAULT),Bool,Array{Float32,1},DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},typeof(DiffEqBase.FINALIZE_DEFAULT)}}},Tuple{Symbol},NamedTuple{(:callback,),Tuple{CallbackSet{Tuple{},Tuple{DiscreteCallback{DiffEqCallbacks.var"#61#64"{Array{Float32,1}},DiffEqCallbacks.var"#62#65"{DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},DiffEqCallbacks.var"#63#66"{typeof(DiffEqBase.INITIALIZE_DEFAULT),Bool,Array{Float32,1},DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},typeof(DiffEqBase.FINALIZE_DEFAULT)}}}}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationDat
a{ODEFunction{true,DiffEqSensitivity.ODEInterpolatingAdjointSensitivityFunction{DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},DiffEqBase.DEStats},Nothing,ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{
Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing}},LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5Cache{Array{Float64,1},Array{Float64,1},Array{Float64,1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}}},DiffEqBase.DEStats},ODEFunction{true,DiffEqSensitivity.ODEInterpolatingAdjointSensitivityFunction{DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},DiffEqBase.DEStats},Nothing,ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Fl
oat32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing}},LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},OrdinaryDiffEq.Tsit5Cache{Array{Float64,1},Array{Float64,1},Array{Float64,1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},OrdinaryDiffEq.DEOptions{Float64,Float64,Float64,Float32,typeof(DiffEqBase.ODE_DEFAULT_NORM),typeof(LinearAlgebra.opnorm),Nothing,CallbackSet{Tuple{},Tuple{DiscreteCallback{DiffEqCallbacks.var"#61#64"{Array{Float32,1}},DiffEqCallbacks.var"#62#65"{DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},DiffEqCallbacks.var"#63#66"{typeof(DiffEqBase.INITIALIZE_DEFAULT),Bool,Array{Float32,1},DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},typeof(DiffEqBase.FINALIZE_DEFAULT)}}},typeof(DiffEqBase.ODE_DEFAULT_ISOUTOFDOMAIN),typeof(DiffEqBase.ODE_DEFAULT_PROG_MESSAGE),typeof(DiffEqBase.ODE_DEFAULT_UNSTABLE_CHECK),DataStructures.BinaryHeap{Float32,Base.Order.ForwardOrdering},DataStructures.BinaryHeap{Float32,Base.Order.ForwardOrdering},Nothing,Nothing,Int64,Array{Float32,1},Array{Float64,1},Tuple{}},Array{Float64,1},Float64,Nothing,OrdinaryDiffEq.DefaultInit}, ::OrdinaryDiffEq.Tsit5Cache{Array{Float64,1},Array{Float64,1},Array{Float64,1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}}) at C:\Users\Administrator\.julia\packages\OrdinaryDiffEq\5egkj\src\perform_step\low_order_rk_perform_step.jl:623
 [12] __init(::ODEProblem{Array{Float64,1},Tuple{Float32,Float32},true,Array{Float32,1},ODEFunction{true,DiffEqSensitivity.ODEInterpolatingAdjointSensitivityFunction{DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},DiffEqBase.DEStats},Nothing,ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(t
anh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing}},LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Symbol,CallbackSet{Tuple{},Tuple{DiscreteCallback{DiffEqCallbacks.var"#61#64"{Array{Float32,1}},DiffEqCallbacks.var"#62#65"{DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},DiffEqCallbacks.var"#63#66"{typeof(DiffEqBase.INITIALIZE_DEFAULT),Bool,Array{Float32,1},DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},typeof(DiffEqBase.FINALIZE_DEFAULT)}}},Tuple{Symbol},NamedTuple{(:callback,),Tuple{CallbackSet{Tuple{},Tuple{DiscreteCallback{DiffEqCallbacks.var"#61#64"{Array{Float32,1}},DiffEqCallbacks.var"#62#65"{DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},DiffEqCallbacks.var"#63#66"{typeof(DiffEqBase.INITIALIZE_DEFAULT),Bool,Array{Float32,1},DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},typeof(DiffEqBase.FINALIZE_DEFAULT)}}}}}},SciMLBase.StandardODEProblem}, ::Tsit5, ::Tuple{}, ::Tuple{}, ::Tuple{}, ::Type{Val{true}}; saveat::Array{Float64,1}, tstops::Array{Float32,1}, d_discontinuities::Tuple{}, save_idxs::Nothing, save_everystep::Bool, save_on::Bool, save_start::Bool, save_end::Nothing, 
callback::CallbackSet{Tuple{},Tuple{DiscreteCallback{DiffEqCallbacks.var"#61#64"{Array{Float32,1}},DiffEqCallbacks.var"#62#65"{DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},DiffEqCallbacks.var"#63#66"{typeof(DiffEqBase.INITIALIZE_DEFAULT),Bool,Array{Float32,1},DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},typeof(DiffEqBase.FINALIZE_DEFAULT)}}}, dense::Bool, calck::Bool, dt::Float32, dtmin::Nothing, dtmax::Float32, force_dtmin::Bool, adaptive::Bool, gamma::Rational{Int64}, abstol::Float64, reltol::Float64, qmin::Rational{Int64}, qmax::Int64, qsteady_min::Int64, qsteady_max::Int64, qoldinit::Rational{Int64}, fullnormalize::Bool, failfactor::Int64, beta1::Nothing, beta2::Nothing, maxiters::Int64, internalnorm::typeof(DiffEqBase.ODE_DEFAULT_NORM), internalopnorm::typeof(LinearAlgebra.opnorm), isoutofdomain::typeof(DiffEqBase.ODE_DEFAULT_ISOUTOFDOMAIN), unstable_check::typeof(DiffEqBase.ODE_DEFAULT_UNSTABLE_CHECK), verbose::Bool, timeseries_errors::Bool, dense_errors::Bool, advance_to_tstop::Bool, stop_at_next_tstop::Bool, initialize_save::Bool, progress::Bool, progress_steps::Int64, progress_name::String, progress_message::typeof(DiffEqBase.ODE_DEFAULT_PROG_MESSAGE), userdata::Nothing, allow_extrapolation::Bool, initialize_integrator::Bool, alias_u0::Bool, alias_du0::Bool, initializealg::OrdinaryDiffEq.DefaultInit, kwargs::Base.Iterators.Pairs{Symbol,DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool},Tuple{Symbol},NamedTuple{(:sense,),Tuple{DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool}}}}) at C:\Users\Administrator\.julia\packages\OrdinaryDiffEq\5egkj\src\solve.jl:433
 [13] #__solve#404 at C:\Users\Administrator\.julia\packages\OrdinaryDiffEq\5egkj\src\solve.jl:4 [inlined]
 [14] solve_call(::ODEProblem{Array{Float64,1},Tuple{Float32,Float32},true,Array{Float32,1},ODEFunction{true,DiffEqSensitivity.ODEInterpolatingAdjointSensitivityFunction{DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},DiffEqBase.DEStats},Nothing,ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{type
of(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing}},LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Symbol,CallbackSet{Tuple{},Tuple{DiscreteCallback{DiffEqCallbacks.var"#61#64"{Array{Float32,1}},DiffEqCallbacks.var"#62#65"{DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},DiffEqCallbacks.var"#63#66"{typeof(DiffEqBase.INITIALIZE_DEFAULT),Bool,Array{Float32,1},DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},typeof(DiffEqBase.FINALIZE_DEFAULT)}}},Tuple{Symbol},NamedTuple{(:callback,),Tuple{CallbackSet{Tuple{},Tuple{DiscreteCallback{DiffEqCallbacks.var"#61#64"{Array{Float32,1}},DiffEqCallbacks.var"#62#65"{DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},DiffEqCallbacks.var"#63#66"{typeof(DiffEqBase.INITIALIZE_DEFAULT),Bool,Array{Float32,1},DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},typeof(DiffEqBase.FINALIZE_DEFAULT)}}}}}},SciMLBase.StandardODEProblem}, ::Tsit5; merge_callbacks::Bool, kwargs::Base.Iterators.Pairs{Symbol,Any,NTuple{7,Symbol},NamedTuple{(:save_everystep, :save_start, :saveat, :tstops, :abstol, :reltol, 
:sense),Tuple{Bool,Bool,Array{Float64,1},Array{Float32,1},Float64,Float64,DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool}}}}) at C:\Users\Administrator\.julia\packages\DiffEqBase\rN9Px\src\solve.jl:61
 [15] #solve_up#58 at C:\Users\Administrator\.julia\packages\DiffEqBase\rN9Px\src\solve.jl:82 [inlined]
 [16] #solve#57 at C:\Users\Administrator\.julia\packages\DiffEqBase\rN9Px\src\solve.jl:70 [inlined]
 [17] _adjoint_sensitivities(::ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},DiffEqBase.DEStats}, ::DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool}, ::Tsit5, ::DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon}, ::Array{Float32,1}, ::Nothing; abstol::Float64, reltol::Float64, checkpoints::Array{Float32,1}, corfunc_analytical::Nothing, callback::Nothing, kwargs::Base.Iterators.Pairs{Symbol,DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool},Tuple{Symbol},NamedTuple{(:sense,),Tuple{DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool}}}}) at C:\Users\Administrator\.julia\packages\DiffEqSensitivity\agdxc\src\sensitivity_interface.jl:28
 [18] adjoint_sensitivities(::ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},DiffEqBase.DEStats}, ::Tsit5, ::Vararg{Any,N} where N; sensealg::DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool}, kwargs::Base.Iterators.Pairs{Symbol,Union{Nothing, DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool}},Tuple{Symbol,Symbol},NamedTuple{(:callback, :sense),Tuple{Nothing,DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool}}}}) at C:\Users\Administrator\.julia\packages\DiffEqSensitivity\agdxc\src\sensitivity_interface.jl:6
 [19] (::DiffEqSensitivity.var"#adjoint_sensitivity_backpass#145"{Base.Iterators.Pairs{Symbol,DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool},Tuple{Symbol},NamedTuple{(:sense,),Tuple{DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool}}}},Tsit5,DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},Array{Float32,1},Tuple{},NamedTuple{(:sense,),Tuple{DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool}}},Colon})(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\DiffEqSensitivity\agdxc\src\concrete_solve.jl:179
 [20] #263#back at C:\Users\Administrator\.julia\packages\ZygoteRules\OjfTt\src\adjoint.jl:65 [inlined]
 [21] #178 at C:\Users\Administrator\.julia\packages\Zygote\lwmfx\src\lib\lib.jl:194 [inlined]
 [22] (::Zygote.var"#1698#back#180"{Zygote.var"#178#179"{DiffEqBase.var"#263#back#74"{DiffEqSensitivity.var"#adjoint_sensitivity_backpass#145"{Base.Iterators.Pairs{Symbol,DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool},Tuple{Symbol},NamedTuple{(:sense,),Tuple{DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool}}}},Tsit5,DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},Array{Float32,1},Tuple{},NamedTuple{(:sense,),Tuple{DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool}}},Colon}},Tuple{NTuple{6,Nothing},Tuple{Nothing}}}})(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\ZygoteRules\OjfTt\src\adjoint.jl:59
 [23] #solve#57 at C:\Users\Administrator\.julia\packages\DiffEqBase\rN9Px\src\solve.jl:70 [inlined]
 [24] (::typeof((#solve#57)))(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\Zygote\lwmfx\src\compiler\interface2.jl:0
 [25] (::Zygote.var"#178#179"{typeof((#solve#57)),Tuple{NTuple{6,Nothing},Tuple{Nothing}}})(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\Zygote\lwmfx\src\lib\lib.jl:194
 [26] (::Zygote.var"#1698#back#180"{Zygote.var"#178#179"{typeof((#solve#57)),Tuple{NTuple{6,Nothing},Tuple{Nothing}}}})(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\ZygoteRules\OjfTt\src\adjoint.jl:59
 [27] (::typeof((solve##kw)))(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\Zygote\lwmfx\src\compiler\interface2.jl:0
 [28] (::Zygote.var"#178#179"{typeof((solve##kw)),Tuple{Tuple{Nothing,Nothing,Nothing},Tuple{Nothing}}})(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\Zygote\lwmfx\src\lib\lib.jl:194
 [29] #1698#back at C:\Users\Administrator\.julia\packages\ZygoteRules\OjfTt\src\adjoint.jl:59 [inlined]
 [30] NeuralODE at C:\Users\Administrator\.julia\packages\DiffEqFlux\dYHZU\src\neural_de.jl:69 [inlined]
 [31] (::typeof((λ)))(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\Zygote\lwmfx\src\compiler\interface2.jl:0
 [32] NeuralODE at C:\Users\Administrator\.julia\packages\DiffEqFlux\dYHZU\src\neural_de.jl:65 [inlined]
 [33] (::typeof((λ)))(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\Zygote\lwmfx\src\compiler\interface2.jl:0
 [34] predict_n_ode at .\In[4]:3 [inlined]
 [35] (::typeof((predict_n_ode)))(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\Zygote\lwmfx\src\compiler\interface2.jl:0
 [36] loss_n_ode at .\In[4]:5 [inlined]
 [37] (::typeof((loss_n_ode)))(::Float64) at C:\Users\Administrator\.julia\packages\Zygote\lwmfx\src\compiler\interface2.jl:0
 [38] #178 at C:\Users\Administrator\.julia\packages\Zygote\lwmfx\src\lib\lib.jl:194 [inlined]
 [39] #1698#back at C:\Users\Administrator\.julia\packages\ZygoteRules\OjfTt\src\adjoint.jl:59 [inlined]
 [40] #39 at C:\Users\Administrator\.julia\packages\Flux\lwA90\src\optimise\train.jl:102 [inlined]
 [41] (::Zygote.var"#69#70"{Zygote.Params,Zygote.Context,typeof((#39))})(::Float64) at C:\Users\Administrator\.julia\packages\Zygote\lwmfx\src\compiler\interface.jl:252
 [42] gradient(::Function, ::Zygote.Params) at C:\Users\Administrator\.julia\packages\Zygote\lwmfx\src\compiler\interface.jl:59
 [43] macro expansion at C:\Users\Administrator\.julia\packages\Flux\lwA90\src\optimise\train.jl:101 [inlined]
 [44] macro expansion at C:\Users\Administrator\.julia\packages\Juno\n6wyj\src\progress.jl:134 [inlined]
 [45] train!(::Function, ::Zygote.Params, ::Base.Iterators.Take{Base.Iterators.Repeated{Tuple{}}}, ::ADAM; cb::Flux.Optimise.var"#40#46") at C:\Users\Administrator\.julia\packages\Flux\lwA90\src\optimise\train.jl:99
 [46] train!(::Function, ::Zygote.Params, ::Base.Iterators.Take{Base.Iterators.Repeated{Tuple{}}}, ::ADAM) at C:\Users\Administrator\.julia\packages\Flux\lwA90\src\optimise\train.jl:97
 [47] top-level scope at In[6]:4
 [48] include_string(::Function, ::Module, ::String, ::String) at .\loading.jl:1091
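A minimal diagnostic sketch of what goes wrong here, assuming the S and dudt defined above (this sketch is not part of the original report): because S is never registered with Flux.@functor, Flux's parameter collection cannot see its fields, which is consistent with the fix suggested below.

# Sketch: Flux.params only traverses layers registered with @functor.
length(Flux.params(S(10, 2)))   # 0: neither a nor b of the custom layer is collected
length(Flux.params(dudt))       # 4: only W and b of the two Dense layers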
@manyfeatures

manyfeatures commented Apr 12, 2021

Is your custom layer supposed to be trainable?
If it is, then you can try these changes:

Flux.@functor S
dudt = Chain(
             Dense(2,30,tanh),
             Dense(30,10),
             S(10, 2)
             )
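A quick way to see what Flux.@functor changes (a sketch, assuming the S defined above): once the struct is registered, both of its fields become visible to Flux.params and to the destructure call inside NeuralODE.

Flux.@functor S
s = S(10, 2)                          # with this constructor: a is 2×10, b is 2×2
[length(p) for p in Flux.params(s)]   # [20, 4]: both fields are now collected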

@xk-y
Author

xk-y commented Apr 13, 2021

Is your custom layer supposed to be trainable?
If it is, then you can try these changes:

Flux.@functor S
dudt = Chain(
             Dense(2,30,tanh),
             Dense(30,10),
             S(10, 2)
             )

@manyfeatures Sorry about the confusion, but the trainable parameters are the two dense layers and S.a, while S.b is supposed to be constant. I tried these changes and am excited that it works. But in this case, S.b is also trainable. Would it be possible to keep S.b constant as a matrix of ones?

@ChrisRackauckas
Member

Just use what the neural DE layers use:

https://github.com/SciML/DiffEqFlux.jl/blob/master/src/neural_de.jl#L3

@xk-y
Author

xk-y commented Apr 13, 2021

Just use what the neural DE layers use:

https://github.com/SciML/DiffEqFlux.jl/blob/master/src/neural_de.jl#L3

I added Flux.trainable(n_ode::NeuralODE) = (n_ode.p[1:680],) to discard the parameters for S.b, which are p[681:684]. But the trainable parameters p[1:680] are not updated at all after training. I'm not sure whether I'm on the right track or not...
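
A possible explanation, sketched under the assumption that n_ode.p is an ordinary Array: indexing with 1:680 allocates a fresh copy, so the array registered as trainable is not the one used in the forward pass and never receives a gradient. A minimal illustration (not from the original session):

slice = n_ode.p[1:680]
slice === n_ode.p[1:680]   # false: every slice is a freshly allocated copy
slice[1] = 0.0
n_ode.p[1]                 # unchanged, because mutating the slice does not touch n_ode.p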

@manyfeatures

@xiaoky97 try these lines:

Flux.@functor S
Flux.trainable(m::S) = (m.a,)
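
A quick way to confirm that this keeps S.b out of the trainable set is to inspect Flux.params on an instance (a minimal sketch, assuming the S and dudt definitions from the original post):

s = S(10, 2)
Flux.params(s)              # should contain only the 2×10 array s.a, not s.b
length(Flux.params(dudt))   # 5 arrays: the two Dense weights and biases plus S.a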

@xk-y
Author

xk-y commented Apr 13, 2021

@xiaoky97 try these lines:

Flux.@functor S
Flux.trainable(m::S) = (m.a,)

I added these lines. It seems that everything in n_ode.p was updated/optimized after training, including the parameters for S.b.

Before training:

n_ode.p
684-element Array{Float64,1}:
  0.22733722627162933
  0.1176697239279747
 -0.20044240355491638
 -0.21971604228019714
 -0.1068674623966217
  0.058977559208869934
 -0.2988882064819336
  0.22008971869945526
  0.33627834916114807
 -0.22047773003578186
 -0.0683562308549881
  0.0496516078710556
 -0.24828743934631348
  ⋮
 -0.03780757635831833
 -0.15476474165916443
  0.4237285554409027
  0.6132223010063171
 -0.257615864276886
  0.4997043311595917
  0.6707265973091125
  0.47114259004592896
  1.0
  1.0
  1.0
  1.0

After training:

n_ode.p
684-element Array{Float64,1}:
  0.23627712591174457
  0.12667363772442986
 -0.20922610767877436
 -0.22885661962913595
 -0.11591636730640568
  0.05010215822262923
 -0.2906225309465301
  0.21227882904991238
  0.33064791463318
 -0.21223436401179294
 -0.05953418671160028
  0.041722864276791054
 -0.23964741768817518
  ⋮
 -0.03025592698041343
 -0.14730673499066896
  0.41710093813784727
  0.6067225540155883
 -0.24463441010926146
  0.5126830113748064
  0.6793566715082799
  0.4796710998477039
  1.01215804390836
  1.0087105538443921
  0.9872641653140869
  0.9887771559217287

@manyfeatures

manyfeatures commented Apr 13, 2021

What do you see when you print the parameters?

ps = Flux.params(dudt)

for p in ps
    println(length(p))
end

My code keeps S.b fixed.
Why do you even have 684 weights?
In your network there are 2*30 + 30 + 30*10 + 10 + 20 + 4 = 424 weights.
Full code I run:

using DiffEqFlux, OrdinaryDiffEq, Flux
using Flux
#Prepare true data
u0 = [2.;0.]
datasize = 30
tspan = (0.0f0,1.5f0)
function trueODEfunc(du,u,p,t)
    true_A = [-0.1 2.0; -2.0 -0.1]
    du .= ((u.^3)'true_A)'
end
t = range(tspan[1],tspan[2],length=datasize)
prob = ODEProblem(trueODEfunc,u0,tspan)
ode_data = Array(solve(prob,Tsit5(),saveat=t))

#build neural ode model
struct S
    a
    b
end
function (s::S)(x)
    transpose(transpose(s.a) * s.b) * x
end
S(m::Integer,n::Integer, init=Flux.glorot_uniform) =  S(init(n,m),ones(n, n))

Flux.@functor S
Flux.trainable(m::S) = (m.a,)


dudt = Chain(
             Dense(2,30,tanh),
             Dense(30,10),
             S(10,2)
             )
n_ode = NeuralODE(dudt,tspan,Tsit5(),saveat=t)


#prepare for the training
ps = Flux.params(dudt)

function predict_n_ode()
    n_ode(u0)
end
loss_n_ode() = Flux.mse(ode_data,predict_n_ode())
loss_n_ode()#I got 2.5051380830609693
data = Iterators.repeated((), 2)
opt = ADAM(0.1)

#train the model
Flux.train!(loss_n_ode, ps, data, opt)
# end

for p in n_ode.p
    println(p)
end
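
For reference, the 424 above breaks down as 2*30 + 30 = 90 for Dense(2,30), 30*10 + 10 = 310 for Dense(30,10), and 2*10 + 2*2 = 24 for S(10,2) (a plus b). A quick sanity check on the flattened vector (a hedged sketch; Flux.destructure collects every @functor field, including the fixed b):

flat, re = Flux.destructure(dudt)
length(flat)   # expected 424, since destructure also picks up the 2×2 S.b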

@xk-y
Author

xk-y commented Apr 13, 2021

What do you see when you print the parameters?

ps = Flux.params(dudt)

for p in ps
    println(length(p))
end

My code keeps S.b fixed.
Why do you even have 684 weights?
In your network there are 2*30 + 30 + 30*10 + 10 + 20 + 4 = 424 weights.
Full code I run:

using DiffEqFlux, OrdinaryDiffEq, Flux
using Flux
#Prepare true data
u0 = [2.;0.]
datasize = 30
tspan = (0.0f0,1.5f0)
function trueODEfunc(du,u,p,t)
    true_A = [-0.1 2.0; -2.0 -0.1]
    du .= ((u.^3)'true_A)'
end
t = range(tspan[1],tspan[2],length=datasize)
prob = ODEProblem(trueODEfunc,u0,tspan)
ode_data = Array(solve(prob,Tsit5(),saveat=t))

#build neural ode model
struct S
    a
    b
end
function (s::S)(x)
    transpose(transpose(s.a) * s.b) * x
end
S(m::Integer,n::Integer, init=Flux.glorot_uniform) =  S(init(n,m),ones(n, n))

Flux.@functor S
Flux.trainable(m::S) = (m.a,)


dudt = Chain(
             Dense(2,30,tanh),
             Dense(30,10),
             S(10,2)
             )
n_ode = NeuralODE(dudt,tspan,Tsit5(),saveat=t)


#prepare for the training
ps = Flux.params(dudt)

function predict_n_ode()
    n_ode(u0)
end
loss_n_ode() = Flux.mse(ode_data,predict_n_ode())
loss_n_ode()#I got 2.5051380830609693
data = Iterators.repeated((), 2)
opt = ADAM(0.1)

#train the model
Flux.train!(loss_n_ode, ps, data, opt)
# end

for p in n_ode.p
    println(p)
end

Yes, the parameters are as you said. But in that case it seems that all of the parameters are fixed, since the loss and p are the same before and after training.

@manyfeatures

manyfeatures commented Apr 13, 2021

Ok, this is weird. In the NeuralODE layer the parameters come back again after destructure:

_p, re = Flux.destructure(dudt) # part from NeuralODE Layer
_p1  = Flux.params(dudt)
println("Original network _p $(length(_p))") # 424 
println("Original network _p1 $(length(_p1))") # prints 5 arrays,  420 params in total

@ChrisRackauckas
Member

Is this a Flux.jl functor issue? Did something change with its interface? @DhairyaLGandhi

@DhairyaLGandhi
Member

Haven't seen anything to suggest that. You can also try with

@functor S (a,)

Please also see what you get from Flux.params(s)

@xk-y
Author

xk-y commented Apr 14, 2021

I changed the code from

Flux.@functor S
Flux.trainable(m::S) = (m.a,)

to:

Flux.@functor S (a,)

Then I got:

Flux.params(S)  # I got Params([])
Flux.train!(loss_n_ode, ps, data, opt)  # I got DimensionMismatch error
DimensionMismatch("array could not be broadcast to match destination")

Stacktrace:
 [1] check_broadcast_shape at .\broadcast.jl:520 [inlined]
 [2] check_broadcast_axes at .\broadcast.jl:523 [inlined]
 [3] instantiate at .\broadcast.jl:269 [inlined]
 [4] materialize! at .\broadcast.jl:848 [inlined]
 [5] materialize!(::SubArray{Float64,1,Array{Float64,1},Tuple{UnitRange{Int64}},true}, ::Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{1},Nothing,typeof(identity),Tuple{Array{Float64,1}}}) at .\broadcast.jl:845
 [6] _vecjacobian!(::SubArray{Float64,1,Array{Float64,1},Tuple{UnitRange{Int64}},true}, ::Array{Float64,1}, ::SubArray{Float64,1,Array{Float64,1},Tuple{UnitRange{Int64}},true}, ::Array{Float32,1}, ::Float32, ::DiffEqSensitivity.ODEInterpolatingAdjointSensitivityFunction{DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},DiffEqBase.DEStats},Nothing,ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(iden
tity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing}}, ::DiffEqSensitivity.ZygoteVJP, ::SubArray{Float64,1,Array{Float64,1},Tuple{UnitRange{Int64}},true}, ::Nothing) at C:\Users\Administrator\.julia\packages\DiffEqSensitivity\agdxc\src\derivative_wrappers.jl:295
 [7] _vecjacobian! at C:\Users\Administrator\.julia\packages\DiffEqSensitivity\agdxc\src\derivative_wrappers.jl:194 [inlined]
 [8] #vecjacobian!#20 at C:\Users\Administrator\.julia\packages\DiffEqSensitivity\agdxc\src\derivative_wrappers.jl:147 [inlined]
 [9] (::DiffEqSensitivity.ODEInterpolatingAdjointSensitivityFunction{DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},DiffEqBase.DEStats},Nothing,ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}}
,S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing}})(::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float32,1}, ::Float32) at C:\Users\Administrator\.julia\packages\DiffEqSensitivity\agdxc\src\interpolating_adjoint.jl:90
 [10] ODEFunction at C:\Users\Administrator\.julia\packages\SciMLBase\WHspn\src\scimlfunctions.jl:334 [inlined]
 [11] initialize!(::OrdinaryDiffEq.ODEIntegrator{Tsit5,true,Array{Float64,1},Nothing,Float32,Array{Float32,1},Float32,Float64,Float32,Array{Array{Float64,1},1},ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},true,Array{Float32,1},ODEFunction{true,DiffEqSensitivity.ODEInterpolatingAdjointSensitivityFunction{DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},DiffEqBase.DEStats},Nothing,ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.S
tandardODEProblem},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing}},LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Symbol,CallbackSet{Tuple{},Tuple{DiscreteCallback{DiffEqCallbacks.var"#61#64"{Array{Float32,1}},DiffEqCallbacks.var"#62#65"{DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},DiffEqCallbacks.var"#63#66"{typeof(DiffEqBase.INITIALIZE_DEFAULT),Bool,Array{Float32,1},DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},typeof(DiffEqBase.FINALIZE_DEFAULT)}}},Tuple{Symbol},NamedTuple{(:callback,),Tuple{CallbackSet{Tuple{},Tuple{DiscreteCallback{DiffEqCallbacks.var"#61#64"{Array{Float32,1}},DiffEqCallbacks.var"#62#65"{DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},DiffEqCallbacks.var"#63#66"{typeof(DiffEqBase.INITIALIZE_DEFAULT),Bool,Array{Float32,1},DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},typeof(DiffEqBase.FINALIZE_DEFAULT)}}}}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationDat
a{ODEFunction{true,DiffEqSensitivity.ODEInterpolatingAdjointSensitivityFunction{DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},DiffEqBase.DEStats},Nothing,ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{
Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing}},LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5Cache{Array{Float64,1},Array{Float64,1},Array{Float64,1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}}},DiffEqBase.DEStats},ODEFunction{true,DiffEqSensitivity.ODEInterpolatingAdjointSensitivityFunction{DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},DiffEqBase.DEStats},Nothing,ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Fl
oat32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing}},LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},OrdinaryDiffEq.Tsit5Cache{Array{Float64,1},Array{Float64,1},Array{Float64,1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},OrdinaryDiffEq.DEOptions{Float64,Float64,Float64,Float32,typeof(DiffEqBase.ODE_DEFAULT_NORM),typeof(LinearAlgebra.opnorm),Nothing,CallbackSet{Tuple{},Tuple{DiscreteCallback{DiffEqCallbacks.var"#61#64"{Array{Float32,1}},DiffEqCallbacks.var"#62#65"{DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},DiffEqCallbacks.var"#63#66"{typeof(DiffEqBase.INITIALIZE_DEFAULT),Bool,Array{Float32,1},DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},typeof(DiffEqBase.FINALIZE_DEFAULT)}}},typeof(DiffEqBase.ODE_DEFAULT_ISOUTOFDOMAIN),typeof(DiffEqBase.ODE_DEFAULT_PROG_MESSAGE),typeof(DiffEqBase.ODE_DEFAULT_UNSTABLE_CHECK),DataStructures.BinaryHeap{Float32,Base.Order.ForwardOrdering},DataStructures.BinaryHeap{Float32,Base.Order.ForwardOrdering},Nothing,Nothing,Int64,Array{Float32,1},Array{Float64,1},Tuple{}},Array{Float64,1},Float64,Nothing,OrdinaryDiffEq.DefaultInit}, ::OrdinaryDiffEq.Tsit5Cache{Array{Float64,1},Array{Float64,1},Array{Float64,1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}}) at C:\Users\Administrator\.julia\packages\OrdinaryDiffEq\5egkj\src\perform_step\low_order_rk_perform_step.jl:623
 [12] __init(::ODEProblem{Array{Float64,1},Tuple{Float32,Float32},true,Array{Float32,1},ODEFunction{true,DiffEqSensitivity.ODEInterpolatingAdjointSensitivityFunction{DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},DiffEqBase.DEStats},Nothing,ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(t
anh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing}},LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Symbol,CallbackSet{Tuple{},Tuple{DiscreteCallback{DiffEqCallbacks.var"#61#64"{Array{Float32,1}},DiffEqCallbacks.var"#62#65"{DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},DiffEqCallbacks.var"#63#66"{typeof(DiffEqBase.INITIALIZE_DEFAULT),Bool,Array{Float32,1},DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},typeof(DiffEqBase.FINALIZE_DEFAULT)}}},Tuple{Symbol},NamedTuple{(:callback,),Tuple{CallbackSet{Tuple{},Tuple{DiscreteCallback{DiffEqCallbacks.var"#61#64"{Array{Float32,1}},DiffEqCallbacks.var"#62#65"{DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},DiffEqCallbacks.var"#63#66"{typeof(DiffEqBase.INITIALIZE_DEFAULT),Bool,Array{Float32,1},DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},typeof(DiffEqBase.FINALIZE_DEFAULT)}}}}}},SciMLBase.StandardODEProblem}, ::Tsit5, ::Tuple{}, ::Tuple{}, ::Tuple{}, ::Type{Val{true}}; saveat::Array{Float64,1}, tstops::Array{Float32,1}, d_discontinuities::Tuple{}, save_idxs::Nothing, save_everystep::Bool, save_on::Bool, save_start::Bool, save_end::Nothing, 
callback::CallbackSet{Tuple{},Tuple{DiscreteCallback{DiffEqCallbacks.var"#61#64"{Array{Float32,1}},DiffEqCallbacks.var"#62#65"{DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},DiffEqCallbacks.var"#63#66"{typeof(DiffEqBase.INITIALIZE_DEFAULT),Bool,Array{Float32,1},DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},typeof(DiffEqBase.FINALIZE_DEFAULT)}}}, dense::Bool, calck::Bool, dt::Float32, dtmin::Nothing, dtmax::Float32, force_dtmin::Bool, adaptive::Bool, gamma::Rational{Int64}, abstol::Float64, reltol::Float64, qmin::Rational{Int64}, qmax::Int64, qsteady_min::Int64, qsteady_max::Int64, qoldinit::Rational{Int64}, fullnormalize::Bool, failfactor::Int64, beta1::Nothing, beta2::Nothing, maxiters::Int64, internalnorm::typeof(DiffEqBase.ODE_DEFAULT_NORM), internalopnorm::typeof(LinearAlgebra.opnorm), isoutofdomain::typeof(DiffEqBase.ODE_DEFAULT_ISOUTOFDOMAIN), unstable_check::typeof(DiffEqBase.ODE_DEFAULT_UNSTABLE_CHECK), verbose::Bool, timeseries_errors::Bool, dense_errors::Bool, advance_to_tstop::Bool, stop_at_next_tstop::Bool, initialize_save::Bool, progress::Bool, progress_steps::Int64, progress_name::String, progress_message::typeof(DiffEqBase.ODE_DEFAULT_PROG_MESSAGE), userdata::Nothing, allow_extrapolation::Bool, initialize_integrator::Bool, alias_u0::Bool, alias_du0::Bool, initializealg::OrdinaryDiffEq.DefaultInit, kwargs::Base.Iterators.Pairs{Symbol,DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool},Tuple{Symbol},NamedTuple{(:sense,),Tuple{DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool}}}}) at C:\Users\Administrator\.julia\packages\OrdinaryDiffEq\5egkj\src\solve.jl:433
 [13] #__solve#404 at C:\Users\Administrator\.julia\packages\OrdinaryDiffEq\5egkj\src\solve.jl:4 [inlined]
 [14] solve_call(::ODEProblem{Array{Float64,1},Tuple{Float32,Float32},true,Array{Float32,1},ODEFunction{true,DiffEqSensitivity.ODEInterpolatingAdjointSensitivityFunction{DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},DiffEqBase.DEStats},Nothing,ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{type
of(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing}},LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Symbol,CallbackSet{Tuple{},Tuple{DiscreteCallback{DiffEqCallbacks.var"#61#64"{Array{Float32,1}},DiffEqCallbacks.var"#62#65"{DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},DiffEqCallbacks.var"#63#66"{typeof(DiffEqBase.INITIALIZE_DEFAULT),Bool,Array{Float32,1},DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},typeof(DiffEqBase.FINALIZE_DEFAULT)}}},Tuple{Symbol},NamedTuple{(:callback,),Tuple{CallbackSet{Tuple{},Tuple{DiscreteCallback{DiffEqCallbacks.var"#61#64"{Array{Float32,1}},DiffEqCallbacks.var"#62#65"{DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},DiffEqCallbacks.var"#63#66"{typeof(DiffEqBase.INITIALIZE_DEFAULT),Bool,Array{Float32,1},DiffEqSensitivity.ReverseLossCallback{Array{Float64,1},Array{Float32,1},Array{Float64,1},Base.RefValue{Int64},LinearAlgebra.UniformScaling{Bool},DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon},DiffEqSensitivity.AdjointDiffCache{Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Base.OneTo{Int64},UnitRange{Int64},LinearAlgebra.UniformScaling{Bool}}}},typeof(DiffEqBase.FINALIZE_DEFAULT)}}}}}},SciMLBase.StandardODEProblem}, ::Tsit5; merge_callbacks::Bool, kwargs::Base.Iterators.Pairs{Symbol,Any,NTuple{7,Symbol},NamedTuple{(:save_everystep, :save_start, :saveat, :tstops, :abstol, :reltol, 
:sense),Tuple{Bool,Bool,Array{Float64,1},Array{Float32,1},Float64,Float64,DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool}}}}) at C:\Users\Administrator\.julia\packages\DiffEqBase\rN9Px\src\solve.jl:61
 [15] #solve_up#58 at C:\Users\Administrator\.julia\packages\DiffEqBase\rN9Px\src\solve.jl:82 [inlined]
 [16] #solve#57 at C:\Users\Administrator\.julia\packages\DiffEqBase\rN9Px\src\solve.jl:70 [inlined]
 [17] _adjoint_sensitivities(::ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},DiffEqBase.DEStats}, ::DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool}, ::Tsit5, ::DiffEqSensitivity.var"#df#146"{Array{Float64,2},Array{Float64,1},Colon}, ::Array{Float32,1}, ::Nothing; abstol::Float64, reltol::Float64, checkpoints::Array{Float32,1}, corfunc_analytical::Nothing, callback::Nothing, kwargs::Base.Iterators.Pairs{Symbol,DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool},Tuple{Symbol},NamedTuple{(:sense,),Tuple{DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool}}}}) at C:\Users\Administrator\.julia\packages\DiffEqSensitivity\agdxc\src\sensitivity_interface.jl:28
 [18] adjoint_sensitivities(::ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float32,1},Array{Array{Array{Float64,1},1},1},ODEProblem{Array{Float64,1},Tuple{Float32,Float32},false,Array{Float32,1},ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},SciMLBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,DiffEqFlux.var"#dudt_#87"{NeuralODE{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}},Array{Float32,1},Flux.var"#61#63"{Chain{Tuple{Dense{typeof(tanh),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}},S}}},Tuple{Float32,Float32},Tuple{Tsit5},Base.Iterators.Pairs{Symbol,StepRangeLen{Float32,Float64,Float64},Tuple{Symbol},NamedTuple{(:saveat,),Tuple{StepRangeLen{Float32,Float64,Float64}}}}}},LinearAlgebra.UniformScaling{Bool},Nothing,typeof(DiffEqFlux.basic_tgrad),Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,typeof(SciMLBase.DEFAULT_OBSERVED),Nothing},Array{Array{Float64,1},1},Array{Float32,1},Array{Array{Array{Float64,1},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float32}},DiffEqBase.DEStats}, ::Tsit5, ::Vararg{Any,N} where N; sensealg::DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool}, kwargs::Base.Iterators.Pairs{Symbol,Union{Nothing, DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool}},Tuple{Symbol,Symbol},NamedTuple{(:callback, :sense),Tuple{Nothing,DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool}}}}) at C:\Users\Administrator\.julia\packages\DiffEqSensitivity\agdxc\src\sensitivity_interface.jl:6
 [19] (::DiffEqSensitivity.var"#adjoint_sensitivity_backpass#145"{Base.Iterators.Pairs{Symbol,DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool},Tuple{Symbol},NamedTuple{(:sense,),Tuple{DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool}}}},Tsit5,DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},Array{Float32,1},Tuple{},NamedTuple{(:sense,),Tuple{DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool}}},Colon})(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\DiffEqSensitivity\agdxc\src\concrete_solve.jl:179
 [20] #263#back at C:\Users\Administrator\.julia\packages\ZygoteRules\OjfTt\src\adjoint.jl:65 [inlined]
 [21] #178 at C:\Users\Administrator\.julia\packages\Zygote\RxTZu\src\lib\lib.jl:194 [inlined]
 [22] (::Zygote.var"#1698#back#180"{Zygote.var"#178#179"{DiffEqBase.var"#263#back#74"{DiffEqSensitivity.var"#adjoint_sensitivity_backpass#145"{Base.Iterators.Pairs{Symbol,DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool},Tuple{Symbol},NamedTuple{(:sense,),Tuple{DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool}}}},Tsit5,DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},Bool,Bool},Array{Float64,1},Array{Float32,1},Tuple{},NamedTuple{(:sense,),Tuple{DiffEqSensitivity.InterpolatingAdjoint{0,true,Val{:central},DiffEqSensitivity.ZygoteVJP,Bool}}},Colon}},Tuple{NTuple{6,Nothing},Tuple{Nothing}}}})(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\ZygoteRules\OjfTt\src\adjoint.jl:59
 [23] #solve#57 at C:\Users\Administrator\.julia\packages\DiffEqBase\rN9Px\src\solve.jl:70 [inlined]
 [24] (::typeof((#solve#57)))(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\Zygote\RxTZu\src\compiler\interface2.jl:0
 [25] (::Zygote.var"#178#179"{typeof((#solve#57)),Tuple{NTuple{6,Nothing},Tuple{Nothing}}})(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\Zygote\RxTZu\src\lib\lib.jl:194
 [26] (::Zygote.var"#1698#back#180"{Zygote.var"#178#179"{typeof((#solve#57)),Tuple{NTuple{6,Nothing},Tuple{Nothing}}}})(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\ZygoteRules\OjfTt\src\adjoint.jl:59
 [27] (::typeof((solve##kw)))(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\Zygote\RxTZu\src\compiler\interface2.jl:0
 [28] (::Zygote.var"#178#179"{typeof((solve##kw)),Tuple{Tuple{Nothing,Nothing,Nothing},Tuple{Nothing}}})(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\Zygote\RxTZu\src\lib\lib.jl:194
 [29] #1698#back at C:\Users\Administrator\.julia\packages\ZygoteRules\OjfTt\src\adjoint.jl:59 [inlined]
 [30] NeuralODE at C:\Users\Administrator\.julia\packages\DiffEqFlux\woVeV\src\neural_de.jl:69 [inlined]
 [31] (::typeof((λ)))(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\Zygote\RxTZu\src\compiler\interface2.jl:0
 [32] NeuralODE at C:\Users\Administrator\.julia\packages\DiffEqFlux\woVeV\src\neural_de.jl:65 [inlined]
 [33] (::typeof((λ)))(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\Zygote\RxTZu\src\compiler\interface2.jl:0
 [34] predict_n_ode at .\In[2]:41 [inlined]
 [35] (::typeof((predict_n_ode)))(::Array{Float64,2}) at C:\Users\Administrator\.julia\packages\Zygote\RxTZu\src\compiler\interface2.jl:0
 [36] loss_n_ode at .\In[2]:43 [inlined]
 [37] (::typeof((loss_n_ode)))(::Float64) at C:\Users\Administrator\.julia\packages\Zygote\RxTZu\src\compiler\interface2.jl:0
 [38] #178 at C:\Users\Administrator\.julia\packages\Zygote\RxTZu\src\lib\lib.jl:194 [inlined]
 [39] #1698#back at C:\Users\Administrator\.julia\packages\ZygoteRules\OjfTt\src\adjoint.jl:59 [inlined]
 [40] #39 at C:\Users\Administrator\.julia\packages\Flux\qp1gc\src\optimise\train.jl:102 [inlined]
 [41] (::Zygote.var"#69#70"{Zygote.Params,Zygote.Context,typeof((#39))})(::Float64) at C:\Users\Administrator\.julia\packages\Zygote\RxTZu\src\compiler\interface.jl:252
 [42] gradient(::Function, ::Zygote.Params) at C:\Users\Administrator\.julia\packages\Zygote\RxTZu\src\compiler\interface.jl:59
 [43] macro expansion at C:\Users\Administrator\.julia\packages\Flux\qp1gc\src\optimise\train.jl:101 [inlined]
 [44] macro expansion at C:\Users\Administrator\.julia\packages\Juno\n6wyj\src\progress.jl:134 [inlined]
 [45] train!(::Function, ::Zygote.Params, ::Base.Iterators.Take{Base.Iterators.Repeated{Tuple{}}}, ::ADAM; cb::Flux.Optimise.var"#40#46") at C:\Users\Administrator\.julia\packages\Flux\qp1gc\src\optimise\train.jl:99
 [46] train!(::Function, ::Zygote.Params, ::Base.Iterators.Take{Base.Iterators.Repeated{Tuple{}}}, ::ADAM) at C:\Users\Administrator\.julia\packages\Flux\qp1gc\src\optimise\train.jl:97
 [47] top-level scope at In[4]:1
 [48] include_string(::Function, ::Module, ::String, ::String) at .\loading.jl:1091


xk-y commented Apr 17, 2021

Hi @DhairyaLGandhi, I am not quite sure whether this is a Flux error or a DiffEqFlux error. But I still get the DimensionMismatch error even when I try Flux.@functor S (a,) as you recommended.
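
For reference, this is the exact invocation as I understand the suggestion (a one-line sketch; it assumes the S struct from my original post is already defined):

Flux.@functor S (a,)   # declare only the field `a` as a trainable functor child; `b` stays fixed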


ctessum commented Apr 23, 2021

In case it's any help, I get the same error when I remove the final S layer from the neural network:

dudt = Chain(
             Dense(2,30,tanh),
             Dense(30,10),
             #S(10,2)
             )

However, when I additionally change the output dimension of the second Dense layer to the correct value:

dudt = Chain(
             Dense(2,30,tanh),
             Dense(30,2), # 2 instead of 10
             )

that fixes the problem.

So it seems like DiffEqFlux may somehow be leaving out the S layer on the backward pass, which doesn't happen on the forward pass, nor on the backward pass when using Flux by itself (i.e. we can train dudt on its own with no problem).
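
A quick forward-pass check along these lines (a minimal sketch, not part of the original report; it assumes the S layer definition and u0 from the original post are in scope):

dudt_with_S    = Chain(Dense(2,30,tanh), Dense(30,10), S(10,2))
dudt_without_S = Chain(Dense(2,30,tanh), Dense(30,10))
length(dudt_with_S(u0))     # 2  -- matches length(u0), so S is clearly applied going forward
length(dudt_without_S(u0))  # 10 -- would mismatch u0 immediately inside the ODE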

ChrisRackauckas (Member) commented

Are you sure DiffEqFlux is involved here at all, and that this isn't a property of differentiating this Chain with S in it? Can you try to isolate or narrow down the example a bit? I don't think it needs half of the complexities in the issue. I am a bit behind and trying to get back on track, so some help identifying whether this is just a Flux thing would be appreciated.
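
One way to check that outside the ODE solver entirely (a minimal sketch, assuming the S layer and the dudt Chain from the original post are defined, with Flux.@functor S (a,) applied):

using Zygote   # Flux's AD backend

ps = Flux.params(dudt)
gs = Zygote.gradient(() -> sum(dudt([2., 0.])), ps)   # no ODE solve involved
all(p -> gs[p] !== nothing, ps)   # if this differentiates fine, the problem is on the DiffEqFlux path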


ctessum commented Apr 30, 2021

Thanks for your help with this! Here is another try at a reproducer:

import Flux, DiffEqFlux, OrdinaryDiffEq
using Test

# Create a layer that is fully trainable
struct TrainAll
    a
    b
end
(s::TrainAll)(x) = transpose(transpose(s.a) * s.b) * x
TrainAll(m::Integer,n::Integer, init=Flux.glorot_uniform) =  TrainAll(init(n,m),ones(n, n))
Flux.@functor TrainAll

# Create a layer that has some data that is not trainable
struct TrainSome
    a
    b
end
(s::TrainSome)(x) = transpose(transpose(s.a) * s.b) * x
TrainSome(m::Integer,n::Integer, init=Flux.glorot_uniform) =  TrainSome(init(n,m),ones(n, n))
Flux.@functor TrainSome (a, ) # This line is the only difference between TrainSome and TrainAll

# Create chains that include the two layer types
dudt_all = Flux.Chain(Flux.Dense(2,10),TrainAll(10,2))
dudt_some = Flux.Chain(Flux.Dense(2,10),TrainSome(10,2))

# Create functions for loss and training
loss(nn) = sum(nn([2.;0.]))
function train(nn, f_chain, bfixed::Bool)
    chain = f_chain(nn)
    preloss = loss(nn)
    presuma = sum(chain[2].a)
    presumb = sum(chain[2].b)
    Flux.train!(() -> loss(nn), Flux.params(nn), Iterators.repeated((), 2), Flux.ADAM(0.1))
    # If training worked, the loss should be lower after training, and the weights of the
    # custom layer should be unchanged or changed depending on whether they are supposed
    # to be trainable.
    @test loss(nn) < preloss
    chain = f_chain(nn)
    @test sum(chain[2].a) != presuma
    if bfixed
        @test sum(chain[2].b) == presumb
    else
        @test sum(chain[2].b) != presumb
    end
end


# Try training the two chains by themselves. (They should both work.)
train(dudt_all, (nn) -> nn, false)
train(dudt_some, (nn) -> nn, true)


# Create neural ODEs using the two chains.
node_all = DiffEqFlux.NeuralODE(dudt_all,[0, 1.5],OrdinaryDiffEq.Tsit5(), saveat=[0,1.5])
node_some = DiffEqFlux.NeuralODE(dudt_some,[0, 1.5],OrdinaryDiffEq.Tsit5(), saveat=[0,1.5])

# Try to train the two neural ODEs.
# The first ODE runs okay but fails our test because the custom layer
# parameters don't update for some reason.
train(node_all, (nn) -> nn.model, false)
# The second ODE throws a dimension mismatch error somewhere in Flux.train!
train(node_some, (nn) -> nn.model, true)

So, as far as I can tell, what this code shows is that the custom layers, whether fully or partially trainable, both work in Flux by itself. However, when the same layers are used as part of a neural ODE, the fully trainable layer runs but its weights don't seem to get updated, and the partially trainable layer causes the dimension mismatch error.

Of course, it's also possible that there's an error somewhere in the code above so that it's not showing what I think it's showing.
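
One more diagnostic that might narrow this down (a sketch I have not run): compare what Flux.destructure flattens, which is the vector DiffEqFlux restructures inside the ODE, with what Flux.params collects, which is what Flux.train!/Zygote actually updates.

p_flat, re = Flux.destructure(dudt_some)          # flat parameter vector DiffEqFlux works with
n_train    = sum(length, Flux.params(dudt_some))  # number of parameters Flux.train! sees
length(p_flat), n_train                           # a disagreement here would line up with the symptoms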

ChrisRackauckas (Member) commented

I don't have any ideas. Maybe @DhairyaLGandhi knows something about how we're using functors.


ctessum commented May 14, 2021

For what it's worth, a workaround is to make the non-trainable parameter b a global variable, so this:

b = ones(2, 2)
struct TrainSome
    a
end
(s::TrainSome)(x) = transpose(transpose(s.a) * b) * x

instead of this:

struct TrainSome
    a
    b
end
Flux.@functor TrainSome (a, )
(s::TrainSome)(x) = transpose(transpose(s.a) * s.b) * x
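
Another possible workaround, if you would rather keep b inside the struct, is to make both fields functor children but override Flux.trainable so that only a is optimized (a sketch; I have not checked whether this avoids the NeuralODE error, since DiffEqFlux flattens parameters through destructure rather than params):

struct TrainSomeAlt
    a
    b
end
(s::TrainSomeAlt)(x) = transpose(transpose(s.a) * s.b) * x
Flux.@functor TrainSomeAlt                    # both fields are functor children
Flux.trainable(s::TrainSomeAlt) = (s.a,)      # but only `a` is collected by Flux.params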
