Extending autodiff compatibilities #132
The internal implementation of ForwardDiff's
I advise against integrating new functionality, as Optim is being rewritten (as NLSolvers.jl).
No one has needed it or asked for it. We did support ReverseDiff, but it was broken for a long time, so I took it out.
I guess we could use the Jacobian directly if the gradient were available.
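If I understand the idea correctly, it would look roughly like this (a minimal standalone sketch, not the NLSolversBase code; the gradient `g` is a hypothetical user-supplied function):

```julia
using ForwardDiff

# Hypothetical user-supplied gradient of some objective.
g(x) = [2x[1] + x[2], 2x[2] + x[1]]

# The Hessian is the Jacobian of the gradient, so an existing
# gradient can be differentiated once instead of differentiating
# the objective twice.
H = ForwardDiff.jacobian(g, [1.0, 2.0])
```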
Yes, but if you want to help out, I'm not yet at a point where I'm adding AD to NLSolvers.jl.
While considering how to extend the autodiff compatibilities, I came across SciML/GalacticOptim.jl, which already includes all of the backends I was planning to add, so I am no longer sure whether there would be a gain in adding them to NLSolvers.jl. Alternatively, one could try to reuse what was done in GalacticOptim.jl in upstream packages such as NLSolvers.
I was considering extending the autodiff compatibilities of NLSolversBase and submitting the result back as a PR. I wanted to include ReverseDiff.jl, Zygote.jl, and maybe also forward-over-reverse for the Hessian.
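For concreteness, the forward-over-reverse idea I have in mind is something like the sketch below. It assumes Zygote's reverse pass accepts ForwardDiff dual numbers for the function in question, and the objective `f` is just an example, not anything from NLSolversBase:

```julia
using ForwardDiff, Zygote

# Hypothetical objective.
f(x) = sum(abs2, x) + sin(x[1]) * x[2]

# Reverse-mode gradient via Zygote.
grad(x) = Zygote.gradient(f, x)[1]

x0 = [0.5, -1.0]

# Forward-over-reverse Hessian: forward-mode Jacobian of the
# reverse-mode gradient.
H = ForwardDiff.jacobian(grad, x0)
```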
Is there any reason this has not been done yet that I should be aware of before starting? I don't want to implement it only to discover there was a reason it was left out.
Also, is there a rationale for the fact that, in twicedifferentiable.jl, when the gradient is provided, the Hessian is computed as the Jacobian of the gradient in the finite-difference case, but as the Hessian of the function itself in the ForwardDiff case (lines 63 and 69)?
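To make the question concrete, the two code paths seem to correspond roughly to the following standalone calls (my own illustration with FiniteDiff.jl and ForwardDiff.jl, not the actual NLSolversBase source; `f` and `g` are hypothetical):

```julia
using FiniteDiff, ForwardDiff

# Hypothetical objective and user-supplied gradient.
f(x) = sum(abs2, x) + x[1] * x[2]
g(x) = [2x[1] + x[2], 2x[2] + x[1]]

x0 = [1.0, 2.0]

# Finite-difference path: Hessian approximated as the Jacobian
# of the supplied gradient.
H_finitediff = FiniteDiff.finite_difference_jacobian(g, x0)

# ForwardDiff path: Hessian taken from the objective itself,
# ignoring the supplied gradient.
H_forwarddiff = ForwardDiff.hessian(f, x0)
```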