[Julia] Interface Improvements #12
Conversation
Overall this looks great. Just a few small suggestions and one main question about typing some arguments (I asked the question at one function, but the ideas carry over to the other functions).

What's going on with the edit on the python/test/test_bridgestan.py file?
```julia
function param_constrain!(
    sm::StanModel,
    theta_unc,
    out::Vector{Float64};
```
I honestly don't know what to type `theta_unc` and `out` as, or if typing them at all is best. The issue is that both `theta_unc` and `out` can be any type `T` such that `T` is a subtype of `AbstractVector` with elements of `Float64`, and the elements are contiguous in memory. It's the contiguous-in-memory part that I don't know how to type. The rest of it would be

```julia
function param_constrain!(sm::StanModel, theta_unc::T, out::T; ...) where {T <: AbstractVector{Float64}}
    ...
end
```

Thoughts?
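For what it's worth, one way to spell "contiguously stored vector of `Float64`" today is `DenseVector{Float64}` (an alias for `DenseArray{Float64,1}`), which Julia reserves for arrays with contiguous storage. This is a sketch, not a proposal for the final signature (the function name is hypothetical), and note it is stricter than "contiguous": it also rejects views whose memory happens to be contiguous.

```julia
# Sketch: DenseVector{Float64} admits only contiguously stored vectors.
function param_constrain_sketch!(sm, theta_unc::DenseVector{Float64},
                                 out::DenseVector{Float64})
    # ... the ccall into the C API would go here ...
    return out
end

@assert Vector{Float64} <: DenseVector{Float64}          # plain vectors qualify
@assert !(typeof(1.0:10.0) <: DenseVector{Float64})      # ranges do not
x = rand(10)
@assert !(typeof(@view x[1:2]) <: DenseVector{Float64})  # nor do views, even contiguous ones
```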
I'm not that familiar with Julia's more exotic types. Can you give an example of something which is `<: AbstractVector{Float64}`, has contiguous memory, but is not `Vector{Float64}`? I'd like to include a test with that thing if we expand this type.

This is also the kind of thing we could change in a backwards-compatible way later on.
Yep, you're right. Being stricter now, by forcing `Vector{Float64}`, will allow us to accept a broader set of types in the future in a backwards-compatible way, if we need/want to. Good point.

For now we still want to explicitly avoid something like

```julia
M = rand(3, 3);
@views M[1, :]  # not contiguous: a row view strides across a column-major matrix
```

being passed in.

Will you add the type `Vector{Float64}` to the arguments `theta_unc`, `theta`, and `q`? I think all `out` arguments have it already.
Views, and (less important for our use case) ranges, are both `<: AbstractVector{Float64}`, have contiguous memory, and are not `Vector{Float64}`. Try

```julia
x = rand(10);
typeof(@views x[1:2]) <: AbstractVector{Float64}  # true
typeof(@views x[1:2])                             # a SubArray, not Vector{Float64}
y = 1.0:10.0;
typeof(y) <: AbstractVector{Float64}              # true
typeof(y)                                         # a StepRangeLen, not Vector{Float64}
```

And for future reference, here's a discussion of this issue with a temporary solution.
As was just revealed to me by a test that broke when fixing those types, it seems that Julia also does the right thing when a `Matrix{Float64}` is passed to a C call. So really, down the road, we don't even need `AbstractVector`, just `AbstractArray` and contiguous?
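That matches how Julia stores arrays: a `Matrix{Float64}` is column-major and contiguous, so a C function expecting a `double*` sees its elements in column order. A quick sketch of what the C side would observe:

```julia
M = [1.0 3.0;
     2.0 4.0]
# GC.@preserve keeps M alive while we inspect its raw memory, the way
# a ccall would see it:
GC.@preserve M begin
    v = unsafe_wrap(Array, pointer(M), length(M))
    @assert v == [1.0, 2.0, 3.0, 4.0]  # column-major order, contiguous
end
```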
Any thoughts on the edit of the python/test/test_bridgestan.py file?
test_bridgestan.py: While copying the tests over to Julia I noticed a copy/paste error in the test. The chunk I deleted is an exact duplicate of the chunk above it; see bridgestan/python/test/test_bridgestan.py, lines 348 to 367 in 8e2c286.
Closes #11