transform! not differentiable #1835

Closed
mschauer opened this issue Jun 2, 2022 · 6 comments
Comments

@mschauer commented Jun 2, 2022

julia> obj, init, trans = optim_objective(model, MAP(); constrained=false)
julia> ForwardDiff.gradient(trans, x0)

errors with:

ERROR: TypeError: in typeassert, expected Float64, got a value of type Dual{Nothing, Float64, 9}
Stacktrace:
  [1] setindex!(A::Vector{Float64}, x::Dual{ForwardDiff.Tag{Turing.ModeEstimation.ParameterTransform{DynamicPPL.TypedVarInfo{NamedTuple{(:α, :β), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:α, Setfield.IdentityLens}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:α, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:β, Setfield.IdentityLens}, Int64}, Vector{DiagNormal}, Vector{AbstractPPL.VarName{:β, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, constrained_space{true}}, Float64}, Float64, 9}, i1::Int64)
    @ Base ./array.jl:903
  [2] _unsafe_copyto!(dest::Vector{Float64}, doffs::Int64, src::Vector{Dual{ForwardDiff.Tag{Turing.ModeEstimation.ParameterTransform{DynamicPPL.TypedVarInfo{NamedTuple{(:α, :β), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:α, Setfield.IdentityLens}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:α, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:β, Setfield.IdentityLens}, Int64}, Vector{DiagNormal}, Vector{AbstractPPL.VarName{:β, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, constrained_space{true}}, Float64}, Float64, 9}}, soffs::Int64, n::Int64)
    @ Base ./array.jl:253
  [3] unsafe_copyto!
    @ ./array.jl:307 [inlined]
  [4] _copyto_impl!
    @ ./array.jl:331 [inlined]
  [5] copyto!
    @ ./array.jl:317 [inlined]
  [6] copyto!
    @ ./array.jl:343 [inlined]
  [7] copyto!
    @ ./broadcast.jl:954 [inlined]
  [8] copyto!
    @ ./broadcast.jl:913 [inlined]
  [9] materialize!
    @ ./broadcast.jl:871 [inlined]
 [10] materialize!
    @ ./broadcast.jl:868 [inlined]
 [11] macro expansion
    @ ~/.julia/packages/DynamicPPL/e6mZw/src/varinfo.jl:0 [inlined]
 [12] _setall!(metadata::NamedTuple{(:α, :β), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:α, Setfield.IdentityLens}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:α, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:β, Setfield.IdentityLens}, Int64}, Vector{DiagNormal}, Vector{AbstractPPL.VarName{:β, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, val::Vector{Dual{ForwardDiff.Tag{Turing.ModeEstimation.ParameterTransform{DynamicPPL.TypedVarInfo{NamedTuple{(:α, :β), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:α, Setfield.IdentityLens}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:α, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:β, Setfield.IdentityLens}, Int64}, Vector{DiagNormal}, Vector{AbstractPPL.VarName{:β, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, constrained_space{true}}, Float64}, Float64, 9}}, start::Int64) (repeats 2 times)
    @ DynamicPPL ~/.julia/packages/DynamicPPL/e6mZw/src/varinfo.jl:341
 [13] setall!
    @ ~/.julia/packages/DynamicPPL/e6mZw/src/varinfo.jl:340 [inlined]
 [14] setindex!
    @ ~/.julia/packages/DynamicPPL/e6mZw/src/varinfo.jl:970 [inlined]
 [15] transform!(p::Vector{Dual{ForwardDiff.Tag{Turing.ModeEstimation.ParameterTransform{DynamicPPL.TypedVarInfo{NamedTuple{(:α, :β), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:α, Setfield.IdentityLens}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:α, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:β, Setfield.IdentityLens}, Int64}, Vector{DiagNormal}, Vector{AbstractPPL.VarName{:β, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, constrained_space{true}}, Float64}, Float64, 9}}, vi::DynamicPPL.TypedVarInfo{NamedTuple{(:α, :β), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:α, Setfield.IdentityLens}, Int64}, Vector{Gamma{Float64}}, Vector{AbstractPPL.VarName{:α, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:β, Setfield.IdentityLens}, Int64}, Vector{DiagNormal}, Vector{AbstractPPL.VarName{:β, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, #unused#::constrained_space{true})
    @ Turing.ModeEstimation ~/.julia/packages/Turing/CVKOh/src/modes/ModeEstimation.jl:183
 [16] transform
    @ ~/.julia/packages/Turing/CVKOh/src/modes/ModeEstimation.jl:207 [inlined]
@devmotion (Member) commented:
It looks like the parameters are stored in a Vector{Float64}, which can't be updated with dual numbers. I think this could be solved by not using t.vi in

return transform(p, t.vi, t.space)

but instead creating a new VarInfo that takes the provided parameters p (and hence their element type) into account. Alternatively, a custom rule for ForwardDiff could be defined.
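To illustrate the diagnosis outside of Turing, here is a self-contained sketch in which a plain Float64 buffer plays the role of the storage in t.vi; the function names are made up for the example:

using ForwardDiff

# Fails: the buffer's element type is fixed to Float64, so ForwardDiff's
# dual numbers can't be written into it -- the same failure mode as above.
function f_buffered!(buf::Vector{Float64}, x)
    buf .= exp.(x)
    return sum(buf)
end

buf = zeros(3)
x0 = [0.1, 0.2, 0.3]
# ForwardDiff.gradient(x -> f_buffered!(buf, x), x0)  # errors: Duals can't be stored in the Float64 buffer

# Works: allocate storage whose element type follows the input, analogous to
# building a VarInfo from the provided parameters p rather than reusing t.vi.
function f_retyped(x)
    buf = similar(x)   # eltype(buf) == eltype(x), so dual numbers fit
    buf .= exp.(x)
    return sum(buf)
end

ForwardDiff.gradient(f_retyped, x0)  # == exp.(x0)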

@yebai (Member) commented Apr 2, 2024

@torfjelde is this fixed?

@yebai (Member) commented Jun 18, 2024

@mhauru, can you take a look at whether this is now fixed, please?

@mhauru (Member) commented Jun 19, 2024

The mode estimation interface changed in v0.33, so optim_objective, and the trans function it used to return, no longer exist. This is therefore "fixed" in the same sense that an arm amputation fixes wrist pain; the question is whether the new replacement arm has a better wrist. @mschauer, do you have more context for where this came up, so that I can work out what the equivalent would be with the new interface?

@mschauer (Author) commented:
If I remember correctly, it showed up when I was using Turing to define a model from which I needed the gradient of the log-likelihood, to be used in my own sampler.
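With the current interface, something along these lines seems to be the way to get those gradients, via the LogDensityProblems interface. This is a sketch assuming recent DynamicPPL and LogDensityProblemsAD versions; the model is a placeholder:

using Turing, DynamicPPL, LogDensityProblems, LogDensityProblemsAD, ForwardDiff, LinearAlgebra

@model function demo(y)   # stand-in model
    α ~ Gamma(2.0, 2.0)
    β ~ MvNormal(zeros(2), Diagonal(ones(2)))
    y ~ MvNormal(α .* β, Diagonal(ones(2)))
end

# Wrap the model as a log-density problem (log joint) and attach a ForwardDiff gradient.
ldf = DynamicPPL.LogDensityFunction(demo(randn(2)))
adldf = LogDensityProblemsAD.ADgradient(:ForwardDiff, ldf)

# With the default setup the parameters are in the model's own (constrained)
# space, so α must be positive; ordering follows the declaration order (assumption).
x = [1.0, 0.0, 0.0]
logp, grad = LogDensityProblems.logdensity_and_gradient(adldf, x)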

@yebai (Member) commented Jun 19, 2024

This is likely fixed; please reopen if a similar issue is found.

@yebai closed this as completed Jun 19, 2024