Passing `x` as a keyword argument means that the gradient operator will be "prepared" for the specific type and size of the array `x`. This can speed up further evaluations on similar inputs, but will likely cause errors if the new inputs have a different type or size. With `AutoReverseDiff`, it can also yield incorrect results if the logdensity contains value-dependent control flow.
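For illustration, here is a minimal sketch of how that keyword might be used. The `ToyProblem` type and its methods are hypothetical, and the preparation behaviour of the `x` keyword is assumed to match the description above:

```julia
using LogDensityProblems, LogDensityProblemsAD, ADTypes
using ReverseDiff  # the backend package must be loaded

# A toy log-density problem (hypothetical example, not part of the package)
struct ToyProblem end
LogDensityProblems.logdensity(::ToyProblem, x) = -sum(abs2, x) / 2
LogDensityProblems.dimension(::ToyProblem) = 3
LogDensityProblems.capabilities(::Type{ToyProblem}) =
    LogDensityProblems.LogDensityOrder{0}()

ℓ = ToyProblem()

# Passing `x` prepares the gradient operator for a Vector{Float64} of length 3;
# later calls with inputs of a different type or size may error.
∇ℓ = ADgradient(AutoReverseDiff(), ℓ; x = zeros(3))

LogDensityProblems.logdensity_and_gradient(∇ℓ, randn(3))
```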
If you want to use another backend from [ADTypes.jl](https://github.com/SciML/ADTypes.jl) that is not in the list above, you need to load [DifferentiationInterface.jl](https://github.com/gdalle/DifferentiationInterface.jl) first.
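As a sketch of what that looks like in practice, assuming Mooncake.jl as the backend and that `AutoMooncake` is routed through the DifferentiationInterface extension:

```julia
# DifferentiationInterface must be loaded for backends without a dedicated
# extension; otherwise the ADgradient call below has no matching method.
using DifferentiationInterface
using Mooncake, ADTypes
using LogDensityProblemsAD

# ℓ is any LogDensityProblems-compatible object, e.g. the ToyProblem above
∇ℓ = ADgradient(AutoMooncake(; config = nothing), ℓ)
```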
Pfft, I was using `:Mooncake` instead of `AutoMooncake()`. 😓 Let me edit to make this better. Edited.
penelopeysm changed the title from "Unclear error message when DifferentiationInterface isn't loaded" to "Improve error message when DifferentiationInterface isn't loaded" on Nov 16, 2024.
Since #39 was merged, it's now possible to run e.g. `ADgradient(ADTypes.AutoMooncake(), ...)` using DifferentiationInterface. 🎉

This is documented in LogDensityProblemsAD.jl/ext/LogDensityProblemsADADTypesExt.jl (lines 25 to 29 in e3401f2), but I wonder if one could define a catch-all method, mirroring the one in LogDensityProblemsAD.jl/src/LogDensityProblemsAD.jl (lines 66 to 69 in e3401f2), which could mention that if you expect your AD backend to work via DI, you should try importing DI.
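For concreteness, a rough sketch of what such a catch-all might look like. This is purely illustrative: the method name mirrors the existing Symbol-based fallback, and the actual placement would have to respect the package's weak-dependency setup:

```julia
# Hypothetical fallback, hit only when no more specific ADgradient method exists
# for the given ADTypes backend (e.g. because DifferentiationInterface isn't loaded).
function ADgradient(ad::ADTypes.AbstractADType, ℓ; kwargs...)
    error("Don't know how to AD with $(typeof(ad)). ",
          "If you expected this backend to work via DifferentiationInterface, ",
          "try `import DifferentiationInterface` and call `ADgradient` again.")
end
```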