
[Julia] Interface Improvements #12

Merged: 11 commits merged into main from feature/11-julia-cleanup on Sep 13, 2022

Conversation

WardBrian (Collaborator)

Closes #11

@WardBrian requested a review from roualdes on September 9, 2022 19:10

@roualdes (Owner) left a comment

Overall this looks great. Just a few small suggestions and one main question about typing some arguments (I asked the question at one function, but the ideas will carry over to other functions).

What's going on with the edit on the python/test/test_bridgestan.py file?

julia/src/BridgeStan.jl: two resolved inline review comments

function param_constrain!(
    sm::StanModel,
    theta_unc,
    out::Vector{Float64};
@roualdes (Owner)

I honestly don't know what to type theta_unc and out as, or whether typing them at all is best.

The issue is that both theta_unc and out can be any type T such that T is a subtype of AbstractVector with Float64 elements and the elements are contiguous in memory. It's the contiguous-in-memory part that I don't know how to express as a type. The rest of it would be

function param_constrain!(sm::StanModel, theta_unc::T, out::T; ...) where {T <: AbstractVector{Float64}}
...
end

Thoughts?
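
For reference, a small sketch (not taken from the PR) of why the contiguity requirement is awkward to capture with the standard abstract types: DenseVector requires contiguous storage but excludes contiguous views, while AbstractVector{Float64} also admits non-contiguous views.

x = rand(10);
x isa DenseVector{Float64}                           # true: Vector is densely stored
view(x, 1:5) isa DenseVector{Float64}                # false, even though this view is contiguous
view(x, 1:5) isa AbstractVector{Float64}             # true
view(rand(3, 3), 1, :) isa AbstractVector{Float64}   # true, but this view is not contiguous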

@WardBrian (Collaborator, Author)

I'm not that familiar with Julia's more exotic types. Can you give an example of something that is <: AbstractVector{Float64}, has contiguous memory, but is not Vector{Float64}? I'd like to include a test with such a type if we expand this signature.

This is also the kind of thing we could change in a backwards-compatible way later on.

@roualdes (Owner)

Yep, you're right. Being stricter now, by forcing Vector{Float64}, will allow us to accept a broader set of types in the future in a backwards-compatible way, if we need/want to. Good point.

For now we still want to explicitly avoid something like

M = rand(3, 3);
@views M[1, :] # not contiguous

being passed in.

Will you add the type Vector{Float64} to the arguments theta_unc, theta, and q? I think all out arguments have it already.

Views, and (less important for our use case) ranges, are both <: AbstractVector{Float64}, have contiguous memory, and are not Vector{Float64}. Try

x = rand(10);
typeof(@views x[1:2]) <: AbstractVector{Float64}  # true
typeof(@views x[1:2])                             # SubArray{Float64, 1, Vector{Float64}, Tuple{UnitRange{Int64}}, true}
y = 1.0:10.0;
typeof(y) <: AbstractVector{Float64}              # true
typeof(y)                                         # StepRangeLen{Float64, ...}

And for future reference, here's a discussion of this issue with a temporary solution.
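
A minimal sketch, assumed rather than taken from that discussion, of what a runtime guard could look like: accept any Float64 vector, but reject anything that is not a unit-stride buffer before its pointer is handed to C. Here require_contiguous is a hypothetical helper, not part of BridgeStan.

# Hypothetical helper, sketch only: reject inputs whose elements are not contiguous in memory.
function require_contiguous(x::AbstractVector{Float64})
    x isa StridedVector{Float64} && strides(x)[1] == 1 && return x
    throw(ArgumentError("expected a contiguous vector of Float64"))
end

require_contiguous(rand(5))                  # ok: Vector{Float64}
require_contiguous(@views rand(5)[2:4])      # ok: contiguous view into a Vector
require_contiguous(@views rand(3, 3)[1, :])  # error: strided row view, stride 3
require_contiguous(1.0:5.0)                  # error: a range has no backing buffer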

@WardBrian (Collaborator, Author)

As was just revealed to me by a test that broke while fixing those types, it seems that Julia also does the right thing when a Matrix{Float64} is passed to a C call. So really, down the road, we don't even need AbstractVector, just AbstractArray and contiguous?
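
For illustration, a small sketch (not code from this PR) of why that works: a Matrix{Float64} is a single contiguous column-major buffer, so its pointer looks to a C function like a flat array of doubles, just like a Vector{Float64} of the same length.

M = [1.0 3.0; 2.0 4.0]
p = pointer(M)                              # Ptr{Float64} to the first element
unsafe_wrap(Vector{Float64}, p, length(M))  # [1.0, 2.0, 3.0, 4.0]: same memory, column-major order
                                            # (only valid while M is still referenced)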

julia/src/BridgeStan.jl: four more resolved inline review comments
@roualdes (Owner)

Any thoughts on the edit of the python/test/test_bridgestan.py file?

@WardBrian (Collaborator, Author)

test_bridgestan.py: While copying the tests over to Julia I noticed a copy/paste error in the test. The chunk I deleted is an exact duplicate of the chunk above it; see:

logdensity, grad = bridge.log_density_gradient(
    y_unc_arr, propto=False, jacobian=True
)
np.testing.assert_allclose(
    _logp(y_unc) + _propto_false(y_unc) + _jacobian_true(y_unc), logdensity
)
np.testing.assert_allclose(
    _grad_logp(y_unc) + _grad_propto_false(y_unc) + _grad_jacobian_true(y_unc),
    grad[0],
)
#
logdensity, grad = bridge.log_density_gradient(
    y_unc_arr, propto=False, jacobian=True
)
np.testing.assert_allclose(
    _logp(y_unc) + _propto_false(y_unc) + _jacobian_true(y_unc), logdensity
)
np.testing.assert_allclose(
    _grad_logp(y_unc) + _grad_propto_false(y_unc) + _grad_jacobian_true(y_unc),
    grad[0],
)

@roualdes merged commit b663d2c into main on Sep 13, 2022
@WardBrian deleted the feature/11-julia-cleanup branch on September 13, 2022 18:16
@WardBrian mentioned this pull request on Sep 13, 2022