Move the content of ad.jl from Turing.jl to here #571
Conversation
Move ad.jl and ad_utils.jl from Turing.jl to here
src/logdensityfunction.jl (Outdated)

```julia
# AD related code
getADType(spl::Sampler) = getADType(spl.alg)
getADType(::SampleFromPrior) = ADTypes.AutoForwardDiff(; chunksize=0)
```
I know this is copied from Turing, but I'd like to mention anyway that this seems wrong - I think we should either remove it (is it actually needed?) or make it possible to adjust the AD type (similar to the HMC algorithms, it could be saved as a field of the struct).
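For concreteness, a minimal sketch of the field-based alternative the reviewer describes (`MySampler` and its `getADType` method are hypothetical illustrations, not DynamicPPL API):

```julia
using ADTypes

# Hypothetical algorithm struct that carries its own AD backend,
# the way Turing's HMC algorithms do, instead of hard-coding a default.
struct MySampler{AD<:ADTypes.AbstractADType}
    adtype::AD
end

getADType(spl::MySampler) = spl.adtype

# The caller chooses the backend explicitly at construction time.
spl = MySampler(ADTypes.AutoReverseDiff())
getADType(spl)  # returns the AutoReverseDiff instance
```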
I left these functions in Turing. In DynamicPPL we should probably only define two-arg ADgradient functions, so it's always required to say what ADType to use.
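As a rough sketch of what the two-arg-only convention looks like from user code (the demo model is illustrative, and the exact `ADgradient` dispatch for a `LogDensityFunction` depends on the package versions involved):

```julia
using DynamicPPL, Distributions
using ADTypes, LogDensityProblems, LogDensityProblemsAD
import ForwardDiff  # loads the ForwardDiff backend for LogDensityProblemsAD

@model function demo()
    x ~ Normal()
end

ℓ = DynamicPPL.LogDensityFunction(demo())

# The AD backend is always named explicitly; there is no one-arg
# ADgradient(ℓ) that silently falls back to ForwardDiff.
∇ℓ = LogDensityProblemsAD.ADgradient(ADTypes.AutoForwardDiff(), ℓ)
LogDensityProblems.logdensity_and_gradient(∇ℓ, [0.1])
```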
Move ad.jl and ad_utils.jl from Turing.jl to here
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Looks like the changes break something, will investigate later.
Pull Request Test Coverage Report for Build 7775567755
💛 - Coveralls
Codecov Report

Additional details and impacted files:

```
@@            Coverage Diff             @@
##           master     #571      +/-   ##
==========================================
+ Coverage   84.32%   84.37%   +0.05%
==========================================
  Files          26       28       +2
  Lines        3183     3207      +24
==========================================
+ Hits         2684     2706      +22
- Misses        499      501       +2
```

☔ View full report in Codecov by Sentry.
@yebai @torfjelde @devmotion tests are passing, another look?
Co-authored-by: Tor Erlend Fjelde <[email protected]>
Move ad.jl from Turing.jl to here
Okay, more errors, fixing...
The error with […]: when given this as the input, the gradients will be initialized as […].
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
I disabled the Zygote tests -- they just seem too temperamental, the tests with […]. Another look @torfjelde?
Starting to look real nice! A few more changes, but after that I think we'll be good to go :) Nice work!
Might be nice to have @devmotion have a look though, as he's been more involved in the transition to ADTypes.
Co-authored-by: Tor Erlend Fjelde <[email protected]>
@torfjelde thanks for the suggestions and the help in debugging earlier. I made the updates accordingly. @devmotion a quick look maybe?
Thanks @sunxd3 and @torfjelde!
Just a heads up wrt. these types of changes for the future: we usually make such releases breaking if we worry that they might have downstream effects (even when the change itself is not technically breaking), so as to avoid issues like TuringLang/Turing.jl#2173.
@torfjelde breaking for DynamicPPL or both?
Twin PR from Turing.jl.

Description:
- Make ADTypes and LogDensityProblemsAD direct deps, which should be fine given their small sizes
- Use ForwardDiff and ReverseDiff to automate testing of LogDensityProblemsAD.ADgradient(::ADType, ::LogDensityFunction)
- Test against TestUtils.DEMO_MODELS
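A hedged sketch of what such an automated check over the demo models could look like (the loop body is illustrative, not the PR's actual test code; the way the test point θ is drawn is an assumption):

```julia
using DynamicPPL, ADTypes, LogDensityProblems, LogDensityProblemsAD
import ForwardDiff  # loads the ForwardDiff backend for LogDensityProblemsAD
using DynamicPPL: TestUtils

for model in TestUtils.DEMO_MODELS
    ℓ = DynamicPPL.LogDensityFunction(model)
    ∇ℓ = LogDensityProblemsAD.ADgradient(ADTypes.AutoForwardDiff(), ℓ)
    θ = DynamicPPL.VarInfo(model)[:]  # a prior draw as the test point
    lp, grad = LogDensityProblems.logdensity_and_gradient(∇ℓ, θ)
    @assert length(grad) == length(θ)  # gradient matches parameter dimension
end
```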