In a similar vein to #35800, I ran into `emit_invoke` codegen depending on the state of the inference cache:
```llvm
; julia> @noinline child(@nospecialize(i)) = i

; julia> kernel(a) = (child(a); return)

; julia> code_llvm(kernel, Tuple{Int}; optimize=false)

;  @ REPL[2]:1 within `kernel'
define void @julia_kernel_94(i64) {
top:
  %1 = call %jl_value_t*** @julia.ptls_states()
  %2 = bitcast %jl_value_t*** %1 to %jl_value_t**
  %3 = getelementptr inbounds %jl_value_t*, %jl_value_t** %2, i64 4
  %4 = bitcast %jl_value_t** %3 to i64**
  %5 = load i64*, i64** %4
  %6 = call %jl_value_t* @jl_box_int64(i64 signext %0)
  %7 = call cc38 nonnull %jl_value_t* bitcast (%jl_value_t* (%jl_value_t*, %jl_value_t**, i32, %jl_value_t*)* @jl_invoke to %jl_value_t* (%jl_value_t*, %jl_value_t*, %jl_value_t*)*)(%jl_value_t* inttoptr (i64 139913193714448 to %jl_value_t*), %jl_value_t* inttoptr (i64 139913178153096 to %jl_value_t*), %jl_value_t* %6)
  ret void
}

; julia> code_llvm(child, Tuple{Any}; optimize=false); # populate the cache

; julia> code_llvm(kernel, Tuple{Int}; optimize=false)

;  @ REPL[2]:1 within `kernel'
define void @julia_kernel_159(i64) {
top:
  %1 = call %jl_value_t*** @julia.ptls_states()
  %2 = bitcast %jl_value_t*** %1 to %jl_value_t**
  %3 = getelementptr inbounds %jl_value_t*, %jl_value_t** %2, i64 4
  %4 = bitcast %jl_value_t** %3 to i64**
  %5 = load i64*, i64** %4
  %6 = call %jl_value_t* @jl_box_int64(i64 signext %0)
  %7 = call cc37 nonnull %jl_value_t* bitcast (%jl_value_t* (%jl_value_t*, %jl_value_t**, i32)* @j1_child_160 to %jl_value_t* (%jl_value_t*, %jl_value_t*)*)(%jl_value_t* inttoptr (i64 139913178153096 to %jl_value_t*), %jl_value_t* %6)
  ret void
}
```

Note the difference in the second `code_llvm` output: once the cache for `child(::Any)` is populated, the call to `child` is emitted as a direct call to the specialized entry point `j1_child_160` instead of going through the generic `jl_invoke` trampoline.
Looks like this was added in #25984. I'm not sure it's wanted / worth it?
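A minimal, self-contained way to observe the nondeterminism (a sketch assuming a fresh session; the `occursin` check on `jl_invoke` is my own illustration, not part of the original report):

```julia
using InteractiveUtils  # provides code_llvm outside the REPL

@noinline child(@nospecialize(i)) = i
kernel(a) = (child(a); return)

# Capture the unoptimized IR for `kernel` before `child` has been compiled.
before = sprint(io -> code_llvm(io, kernel, Tuple{Int}; optimize=false))

# Populate the cache by compiling `child(::Any)`.
code_llvm(devnull, child, Tuple{Any}; optimize=false)

# Capture the IR again; in the report above the call site changed from the
# generic `jl_invoke` trampoline to a direct specialized call.
after = sprint(io -> code_llvm(io, kernel, Tuple{Int}; optimize=false))

println(occursin("jl_invoke", before), " ", occursin("jl_invoke", after))
```

If codegen were independent of the inference cache, `before` and `after` would be identical up to gensym'd names.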