AutoGrad error in backprop when iterating dict #71
Here is another error, maybe related to mean/abs2/broadcast:
using AutoGrad, Knet
m = [rand(1,3), rand(1)]
x,y = rand(3,4),rand(1,4)
pred(m,x) = m[1]*x .+ m[2]
loss(m,x,y) = mean(abs2, pred(m,x) .- y)
∇ = grad(loss)
∇(m,x,y) # OK
loss(m,x,y) = mean(abs2, pred(m,x) .- y) + mean(mean.(abs2, m))
∇ = grad(loss)
∇(m,x,y) # Error
WARNING: abs2(x::AbstractArray{T}) where T <: Number is deprecated, use abs2.(x) instead.
Stacktrace:
[1] depwarn(::String, ::Symbol) at ./deprecated.jl:70
[2] abs2(::Array{Float64,2}) at ./deprecated.jl:57
[3] (::AutoGrad.##rfun#7#10{Base.#abs2})(::Array{Any,1}, ::Function, ::AutoGrad.Rec{Array{Float64,2}}, ::Vararg{AutoGrad.Rec{Array{Float64,2}},N} where N) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:124
[4] abs2(::AutoGrad.Rec{Array{Float64,2}}) at ./<missing>:0
[5] mean(::Base.#abs2, ::AutoGrad.Rec{Array{Array{Float64,N} where N,1}}) at ./statistics.jl:25
[6] loss(::AutoGrad.Rec{Array{Array{Float64,N} where N,1}}, ::Array{Float64,2}, ::Array{Float64,2}) at ./REPL[8]:1
[7] forward_pass(::Function, ::Tuple{Array{Array{Float64,N} where N,1},Array{Float64,2},Array{Float64,2}}, ::Array{Any,1}, ::Int64) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:88
[8] (::AutoGrad.##gradfun#1#3{#loss,Int64})(::Array{Any,1}, ::Function, ::Array{Array{Float64,N} where N,1}, ::Vararg{Any,N} where N) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:39
[9] (::AutoGrad.#gradfun#2)(::Array{Array{Float64,N} where N,1}, ::Vararg{Any,N} where N) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:39
[10] eval(::Module, ::Any) at ./boot.jl:235
[11] eval_user_input(::Any, ::Base.REPL.REPLBackend) at ./REPL.jl:66
[12] macro expansion at ./REPL.jl:97 [inlined]
[13] (::Base.REPL.##1#2{Base.REPL.REPLBackend})() at ./event.jl:73
while loading no file, in expression starting on line 0
ERROR: DimensionMismatch("dimensions must match")
Stacktrace:
[1] promote_shape(::Tuple{Base.OneTo{Int64},Base.OneTo{Int64}}, ::Tuple{Base.OneTo{Int64}}) at ./indices.jl:84
[2] +(::Array{Float64,2}, ::Array{Float64,1}) at ./arraymath.jl:38
[3] (::AutoGrad.##rfun#7#10{Base.#+})(::Array{Any,1}, ::Function, ::AutoGrad.Rec{Array{Float64,2}}, ::Vararg{Any,N} where N) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:124
[4] +(::AutoGrad.Rec{Array{Float64,2}}, ::AutoGrad.Rec{Array{Float64,1}}) at ./<missing>:0
[5] mean(::Base.#abs2, ::AutoGrad.Rec{Array{Array{Float64,N} where N,1}}) at ./statistics.jl:29
[6] loss(::AutoGrad.Rec{Array{Array{Float64,N} where N,1}}, ::Array{Float64,2}, ::Array{Float64,2}) at ./REPL[8]:1
[7] forward_pass(::Function, ::Tuple{Array{Array{Float64,N} where N,1},Array{Float64,2},Array{Float64,2}}, ::Array{Any,1}, ::Int64) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:88
[8] (::AutoGrad.##gradfun#1#3{#loss,Int64})(::Array{Any,1}, ::Function, ::Array{Array{Float64,N} where N,1}, ::Vararg{Any,N} where N) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:39
[9] (::AutoGrad.#gradfun#2)(::Array{Array{Float64,N} where N,1}, ::Vararg{Any,N} where N) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:39
loss(m,x,y) = mean(abs2, pred(m,x) .- y) + mean(sum.(abs2.(m)))
∇ = grad(loss)
∇(m,x,y) # Error
ERROR: MethodError: no method matching start(::AutoGrad.Broadcasted{AutoGrad.Rec{Array{Array{Float64,N} where N,1}}})
Closest candidates are:
start(::SimpleVector) at essentials.jl:258
start(::Base.MethodList) at reflection.jl:560
start(::ExponentialBackOff) at error.jl:107
...
Stacktrace:
[1] mapfoldl(::Base.#identity, ::Function, ::AutoGrad.Broadcasted{AutoGrad.Rec{Array{Array{Float64,N} where N,1}}}) at ./reduce.jl:67
[2] (::##1#2)(::AutoGrad.Broadcasted{AutoGrad.Rec{Array{Array{Float64,N} where N,1}}}) at ./<missing>:0
[3] broadcast(::Function, ::AutoGrad.Rec{Array{Array{Float64,N} where N,1}}) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/unfuse.jl:35
[4] loss(::AutoGrad.Rec{Array{Array{Float64,N} where N,1}}, ::Array{Float64,2}, ::Array{Float64,2}) at ./REPL[16]:1
[5] forward_pass(::Function, ::Tuple{Array{Array{Float64,N} where N,1},Array{Float64,2},Array{Float64,2}}, ::Array{Any,1}, ::Int64) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:88
[6] (::AutoGrad.##gradfun#1#3{#loss,Int64})(::Array{Any,1}, ::Function, ::Array{Array{Float64,N} where N,1}, ::Vararg{Any,N} where N) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:39
[7] (::AutoGrad.#gradfun#2)(::Array{Array{Float64,N} where N,1}, ::Vararg{Any,N} where N) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:39
The last 2 examples work for me on master in this form:
loss(m,x,y) = mean(abs2, pred(m,x) .- y) + mean(sum.(abs2, m))
loss(m,x,y) = mean(abs2, pred(m,x) .- y) + mean(mean.(abs2, m))
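For completeness, a minimal sketch that puts the pieces together (same m, x, y, and pred as in the original example; loss1/loss2 are just hypothetical names so both working forms can be defined side by side):

using AutoGrad, Knet
m = [rand(1,3), rand(1)]
x, y = rand(3,4), rand(1,4)
pred(m,x) = m[1]*x .+ m[2]
# the two regularized losses in the form reported to work on master:
loss1(m,x,y) = mean(abs2, pred(m,x) .- y) + mean(sum.(abs2, m))
loss2(m,x,y) = mean(abs2, pred(m,x) .- y) + mean(mean.(abs2, m))
grad(loss1)(m,x,y)
grad(loss2)(m,x,y)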
On Julia 0.7 and on the branch of PR #78, the first example now gives the following error:
julia> using AutoGrad, Knet, Statistics
julia> u = [rand(2,3), rand(2)]
2-element Array{Array{Float64,N} where N,1}:
[0.920865 0.820558 0.804323; 0.873658 0.993891 0.010203]
[0.362208, 0.69425]
julia> v = [rand(1,2), rand(1)]
2-element Array{Array{Float64,N} where N,1}:
[0.454112 0.564783]
[0.274256]
julia> m = Dict(:u=>u, :v=>v)
Dict{Symbol,Array{Array{Float64,N} where N,1}} with 2 entries:
:v => Array{Float64,N} where N[[0.454112 0.564783], [0.274256]]
:u => Array{Float64,N} where N[[0.920865 0.820558 0.804323; 0.873658 0.993891 0.010203], [0.362208, 0.69425]]
julia> x,y = rand(3,4),rand(1,4)
([0.916855 0.51435 0.784969 0.451743; 0.36581 0.904977 0.912647 0.385057; 0.828762 0.730146 0.321203 0.251844], [0.383026 0.280737 0.599814 0.245899])
julia> pred(m,x) = foldl((x,w)->w[1]*x .+ w[2], [m[:u],m[:v]], init=x)
pred (generic function with 1 method)
julia> l2(ws) = mean(mean.(abs2, ws))
l2 (generic function with 1 method)
julia> loss(m,x,y) = mean(abs2, pred(m,x) .- y) + mean(l2.(collect(values(m))))
loss (generic function with 1 method)
julia> loss(m,x,y)
3.84326612440536
julia> grad(loss)(m,x,y)
ERROR: MethodError: no method matching sum_outgrads(::Array{Array{Float64,N} where N,1}, ::Array{Any,1})
Closest candidates are:
sum_outgrads(::Nothing, ::Any) at /home/carlo/.julia/dev/AutoGrad/src/core.jl:491
sum_outgrads(::AbstractArray{T,N} where N, ::AbstractArray{T,N} where N) where T at /home/carlo/.julia/dev/AutoGrad/src/core.jl:478
sum_outgrads(::Rec, ::Any) at /home/carlo/.julia/dev/AutoGrad/src/core.jl:482
...
Stacktrace:
[1] sum_outgrads(::Dict{Symbol,Array{Array{Float64,N} where N,1}}, ::AutoGrad.UngetIndex) at /home/carlo/.julia/dev/AutoGrad/src/getindex.jl:92
[2] backward_pass(::Rec{Dict{Symbol,Array{Array{Float64,N} where N,1}}}, ::Rec{Float64}, ::Array{AutoGrad.Node,1}) at /home/carlo/.julia/dev/AutoGrad/src/core.jl:246
[3] (::getfield(AutoGrad, Symbol("##gradfun#1#2")){typeof(loss),Int64})(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Dict{Symbol,Array{Array{Float64,N} where N,1}}, ::Vararg{Any,N} where N) at /home/carlo/.julia/dev/AutoGrad/src/core.jl:40
[4] (::getfield(AutoGrad, Symbol("#gradfun#3")){getfield(AutoGrad, Symbol("##gradfun#1#2")){typeof(loss),Int64}})(::Dict{Symbol,Array{Array{Float64,N} where N,1}}, ::Vararg{Any,N} where N) at /home/carlo/.julia/dev/AutoGrad/src/core.jl:39
[5] top-level scope at none:0
The gradient of each of the two terms of the loss,
loss(m,x,y) = mean(abs2, pred(m,x) .- y)
loss(m,x,y) = mean(l2.(collect(values(m))))
is computed correctly; the problem arises when they are summed.
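For reference, here is how the two terms can be checked in isolation (a sketch using the same m, x, y, pred, and l2 as above; loss1/loss2 are hypothetical names for the two terms):

loss1(m,x,y) = mean(abs2, pred(m,x) .- y)       # data term
loss2(m,x,y) = mean(l2.(collect(values(m))))    # regularization term
grad(loss1)(m,x,y)    # OK, gradient w.r.t. the dict is computed correctly
grad(loss2)(m,x,y)    # OK, gradient w.r.t. the dict is computed correctly
# only their sum triggers the sum_outgrads MethodError shown above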
Relaxing the signature
sum_outgrads(a::AbstractArray{T},b::AbstractArray{T}) where T = ...
to
sum_outgrads(a::AbstractArray{T},b::AbstractArray) where T = ...
solves the problem:
julia> loss(m,x,y) = mean(abs2, pred(m,x) .- y) + mean(l2.(collect(values(m))))
loss (generic function with 1 method)
julia> grad(loss)(m,x,y)
Dict{Symbol,Array{Array{Float64,N} where N,1}} with 2 entries:
:v => Array{Float64,N} where N[[6.68047 6.28584], [5.20279]]
:u => Array{Float64,N} where N[[2.23497 2.9622 1.94912; 1.88972 2.05421 1.44816], [4.49943, 3.18227]]
But this is a bad hack. We have to understand why inference is failing and why we end up with an Array{Any} in this error:
ERROR: MethodError: no method matching sum_outgrads(::Array{Array{Float64,N} where N,1}, ::Array{Any,1})
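To illustrate the dispatch failure, here is a self-contained sketch; accum_strict and accum_relaxed are hypothetical stand-ins for the two signatures, not the real sum_outgrads implementation:

# stand-ins for the strict and relaxed signatures:
accum_strict(a::AbstractArray{T}, b::AbstractArray{T}) where T = a .+ b
accum_relaxed(a::AbstractArray{T}, b::AbstractArray) where T = a .+ b

g1 = [[0.1 0.2], [0.3]]       # Array{Array{Float64,N} where N,1}, like the first argument
g2 = Any[[1.0 2.0], [3.0]]    # Array{Any,1}, like the second argument in the error
# accum_strict(g1, g2)        # MethodError: the two element types do not share a single T
accum_relaxed(g1, g2)         # applicable once the second argument's element type is unconstrained

The open question is still why that second gradient piece ends up as an Array{Any,1} instead of an array of Float64 arrays.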
Fixed in latest master, please test and close.
I have another problem, here is the code:
And the problem is:
This seems to be a different issue. Reopening to investigate when I find time.
May be related to this closed issue: denizyuret/Knet.jl#109