problem with the unreliable approximation of Core.Compiler.return_type
#35800
Comments
Reduced:
So
I don't know what's going on yet, but I would guess not, since specialization is supposed to be orthogonal to inference; it only affects what we generate code for, not what we know about types.
I think a bigger problem is the context dependency of Core.Compiler.return_type. (I'm just mentioning it since, IIUC, Cthulhu.jl had a similar "bug" and at some point it was fixed.)
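To make that concrete, here is a minimal sketch of the query under discussion; the function `g` is invented for illustration:

```julia
# Minimal sketch (g is a hypothetical example, not from this thread).
g(x) = x + 1

# Ask the compiler for its return-type approximation of g(::Int).
# The issue is that this answer can depend on session state (what has
# already been inferred and cached), not only on the query itself.
Core.Compiler.return_type(g, Tuple{Int})  # usually Int, but only an approximation
```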
Yes, everybody agrees this is a bug.
Thanks. I was commenting because your comment #35800 (comment) seems to be focusing on a single call of Core.Compiler.return_type.
Another "context dependent" behavior in #35537 |
Another very strange inference problem has popped up in JuliaSIMD/LoopVectorization.jl#114.

In case it helps, that inference failure was solved by replacing this:

```julia
# Original pair: a @generated kernel plus a separate @inline wrapper.
# (llvmtype, Vec, and SVec come from the SIMD packages involved in that issue.)
@generated function vzero(::Type{Vec{W,T}}) where {W,T}
    typ = llvmtype(T)
    vtyp = "<$W x $typ>"
    instrs = """
    ret $vtyp zeroinitializer
    """
    quote
        $(Expr(:meta, :inline))
        Base.llvmcall($instrs, Vec{$W,$T}, Tuple{})
    end
end
@inline vzero(::Val{W}, ::Type{T}) where {W,T} = SVec(vzero(Vec{W,T}))
```

with

```julia
# Replacement: the SVec construction is folded into the @generated function,
# removing the intermediate vzero(Vec{W,T}) call.
@generated function vzero(::Val{W}, ::Type{T}) where {W,T}
    typ = llvmtype(T)
    vtyp = "<$W x $typ>"
    instrs = """
    ret $vtyp zeroinitializer
    """
    quote
        $(Expr(:meta, :inline))
        SVec(Base.llvmcall($instrs, Vec{$W,$T}, Tuple{}))
    end
end
```

EDIT:
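A plausible reading, though the thread does not confirm the mechanism: merging the two methods removes the intermediate vzero(Vec{W,T}) call, so the wrapper's result type no longer depends on inferring through that extra call.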
A bit more information: First this is called:
which eventually calls
which appears to be recursion, so it's widened to
which infers to
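As a generic, hypothetical illustration of that widening step (the function below is invented, not the actual call chain traced above):

```julia
# Hypothetical illustration (invented function, not the call chain above).
# The tuple type grows on every recursive call, so inference cannot follow
# the chain to a fixed point and has to widen the recursive edge; the exact
# widened result varies across Julia versions.
f(n, x) = n == 0 ? x : f(n - 1, (x, x))

Base.return_types(f, (Int, Int))  # widened, not a precise tuple type
```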
The order dependence happens because if we have already inferred e.g.
(cherry picked from commit 903542b)
This is a fairly deep issue, and there is no way to promise fixing it within the 1.6 timeframe.
JuliaLang/julia#35800, which can cause very slow compile times in Julia 1.6.
This seems to work properly on current master. If others agree, perhaps we can close.

Well, if we're "lucky" it might work, but the underlying issue is not fixed 😄
The root problem here is the unreliability of Core.Compiler.return_type.

For those who are interested in this issue, this reproducer (adapted from #35537) is nice to inspect:

```julia
julia> x = [ones(3) for _ in 1:2];

julia> s1(x) = x ./ sum(x);

julia> testf(x) = s1.(x);

julia> Base.return_types(testf, (typeof(x),)) # poor inference
1-element Vector{Any}:
 Any

julia> method = only(methods(Base.Broadcast.combine_eltypes))
combine_eltypes(f, args::Tuple) in Base.Broadcast at broadcast.jl:717

julia> mi = Core.Compiler.specialize_method(method, Tuple{typeof(Base.Broadcast.combine_eltypes),typeof(s1),Tuple{Vector{Vector{Float64}}}}, Core.svec())
MethodInstance for Base.Broadcast.combine_eltypes(::typeof(s1), ::Tuple{Vector{Vector{Float64}}})

julia> mi.cache # no cache yet
ERROR: UndefRefError: access to undefined reference
Stacktrace:
 [1] getproperty(x::Core.MethodInstance, f::Symbol)
   @ Base ./Base.jl:37
 [2] top-level scope
   @ REPL[16]:1

julia> testf(x); # actual execution, creates internal caches with precise type information

julia> s1(x) = x ./ sum(x); # redefining s1 invalidates the poorly inferred cache

julia> Base.return_types(testf, (typeof(x),)) # whoa!
1-element Vector{Any}:
 Vector{Vector{Float64}} (alias for Array{Array{Float64, 1}, 1})

julia> mi.cache # now we have a cache
Core.CodeInstance(MethodInstance for Base.Broadcast.combine_eltypes(::typeof(s1), ::Tuple{Vector{Vector{Float64}}}), Core.CodeInstance(MethodInstance for Base.Broadcast.combine_eltypes(::typeof(s1), ::Tuple{Vector{Vector{Float64}}}), #undef, 0x0000000000007f6d, 0x0000000000007f6e, Type{Vector{Float64}}, Vector{Float64}, nothing, 0x000000aa, 0x000000aa, nothing, false, false, Ptr{Nothing} @0x000000010165db10, Ptr{Nothing} @0x0000000000000000, 0x00), 0x0000000000007f6f, 0xffffffffffffffff, Type{Vector{Float64}}, Vector{Float64}, nothing, 0x000000aa, 0x000000aa, nothing, false, false, Ptr{Nothing} @0x000000010165db10, Ptr{Nothing} @0x0000000000000000, 0x00)

julia> mi.cache.rettype
Type{Vector{Float64}}
```
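In short: the same Base.return_types(testf, (typeof(x),)) query answers Any before testf(x) has actually run and the precise Vector{Vector{Float64}} afterwards, because execution fills internal caches that later queries consult. The approximation depends on session history, not just on the query.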
Hi! I found something strange (at least for me) and hope it helps towards finding the cause of the issue. I apologize if this turns out to be something trivial and not useful (this is the first time I have dug into the inference system). I get the following results for the reproducer only when using Julia 1.8.0-beta3 and loading DifferentialEquations.jl (which by chance I had loaded when I first ran this reproducer). If I don't load DifferentialEquations.jl, or if I use Julia 1.6.6 or 1.7.2, I get the same results @aviatesk showed.

```julia
julia> using DifferentialEquations

julia> x = [ones(3) for _ in 1:2];

julia> s1(x) = x ./ sum(x);

julia> testf(x) = s1.(x);

julia> Base.return_types(testf, (typeof(x),)) # in this case the inference seems to work
1-element Vector{Any}:
 Vector{Vector{Float64}} (alias for Array{Array{Float64, 1}, 1})

julia> method = only(methods(Base.Broadcast.combine_eltypes))
combine_eltypes(f, args::Tuple) in Base.Broadcast at broadcast.jl:717

julia> mi = Core.Compiler.specialize_method(method, Tuple{typeof(Base.Broadcast.combine_eltypes),typeof(s1),Tuple{Vector{Vector{Float64}}}}, Core.svec())
MethodInstance for Base.Broadcast.combine_eltypes(::typeof(s1), ::Tuple{Vector{Vector{Float64}}})

julia> mi.cache
Core.CodeInstance(MethodInstance for Base.Broadcast.combine_eltypes(::typeof(s1), ::Tuple{Vector{Vector{Float64}}}), #undef, 0x0000000000007f0e, 0xffffffffffffffff, Type{Vector{Float64}}, Vector{Float64}, nothing, 0x00000180, 0x00000180, nothing, false, false, Ptr{Nothing} @0x000000011027f900, Ptr{Nothing} @0x0000000000000000, 0x00)

julia> mi.cache.rettype
Type{Vector{Float64}}

julia> testf(x); # actual execution, creates internal caches with precise type information

julia> s1(x) = x ./ sum(x); # redefining s1 invalidates the previously inferred cache

julia> Base.return_types(testf, (typeof(x),))
1-element Vector{Any}:
 Vector{Vector{Float64}} (alias for Array{Array{Float64, 1}, 1})

julia> mi.cache # notice that this cache is different from the previous one
Core.CodeInstance(MethodInstance for Base.Broadcast.combine_eltypes(::typeof(s1), ::Tuple{Vector{Vector{Float64}}}), Core.CodeInstance(MethodInstance for Base.Broadcast.combine_eltypes(::typeof(s1), ::Tuple{Vector{Vector{Float64}}}), #undef, 0x0000000000007f0e, 0x0000000000007f0f, Type{Vector{Float64}}, Vector{Float64}, nothing, 0x00000180, 0x00000180, nothing, false, false, Ptr{Nothing} @0x000000011027f900, Ptr{Nothing} @0x0000000000000000, 0x00), 0x0000000000007f10, 0xffffffffffffffff, Type{Vector{Float64}}, Vector{Float64}, nothing, 0x00000180, 0x00000180, nothing, false, false, Ptr{Nothing} @0x000000011027f900, Ptr{Nothing} @0x0000000000000000, 0x00)

julia> mi.cache.rettype
Type{Vector{Float64}}

julia> VERSION
v"1.8.0-beta3"

(test_return_type) pkg> st
Status `~/test_return_type/Project.toml`
  [0c46a032] DifferentialEquations v7.1.0
```
Hi! I also tried most of the examples from this issue on Julia 1.8.0-rc1 and they seem to work well. I don't know if it is still just luck or if the bug has already been fixed. If not, what are the prospects for a fix? This bug might be preventing me from going ahead with some research (without having to go back to Python, which I'm trying to resist as hard as I can 😄).
Using Julia master (newer than 301db97), the following does not infer the correct type (see also the discussion in #34048):
Note that the third line works if you comment out the second line (start from a fresh Julia session), which makes testing this issue difficult.
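A rough, hypothetical stand-in for the kind of order dependence described above (invented names; not the snippet from the original report):

```julia
# Hypothetical stand-in (invented; not the snippet from the original report).
# Querying return_type first can cache an imprecise result that the later
# Base.return_types query then reuses, so the later answer depends on
# whether the earlier query ran in this session.
s(x) = x ./ sum(x)
g(x) = s.(x)
Core.Compiler.return_type(g, Tuple{Vector{Vector{Float64}}})  # comment this out in a fresh session...
Base.return_types(g, (Vector{Vector{Float64}},))              # ...and this can become precise
```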