
Paper Feedback: Analytic methods for derivative calculations #16

Closed
hayesall opened this issue Jul 13, 2021 · 2 comments
@hayesall

The "Statement of need" (lines 19-22) mentions that existing deep learning frameworks can speed up computation, but cannot compute the analytical derivatives needed for the trial-function method implemented by nnde.

[Screenshot of the paper's "Statement of need" section, showing the text summarized above.]

Can this point be clarified or expanded?

For example: PyTorch implements methods for automatic differentiation (autograd) and numerical methods to compute gradients. What is an example where these are insufficient?
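For context, automatic differentiation frameworks do produce derivative values that are exact to machine precision (not finite-difference approximations), by applying the chain rule to numeric values rather than symbolic expressions. Below is a minimal, illustrative sketch of forward-mode autodiff using dual numbers; it is not code from nnde, PyTorch, or JAX, and the `Dual`, `sin`, and `grad` names are hypothetical:

```python
import math

class Dual:
    """A number a + b*eps with eps**2 == 0; `deriv` carries the derivative."""
    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Sum rule: (u + v)' = u' + v'
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u * v)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__

def sin(x):
    # Chain rule: (sin u)' = cos(u) * u'
    return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

def grad(f, x):
    """Derivative of f at the point x, seeded with dx/dx = 1."""
    return f(Dual(x, 1.0)).deriv

# Example: f(x) = x**2 + sin(x), so f'(x) = 2*x + cos(x)
f = lambda x: x * x + sin(x)
print(grad(f, 1.5))  # equals 2*1.5 + cos(1.5) to machine precision
```

Since autograd-style systems already deliver exact derivatives in this sense (PyTorch's reverse-mode `torch.autograd.grad` can likewise differentiate a network output with respect to its inputs), it would help to know specifically where they fall short for nnde's trial functions.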

@taless474 commented Jul 28, 2021

Could you please explain what extra functionality nnde provides compared with this version of Autograd (not PyTorch's) and with JAX, considering the JOSS guidelines?

@elwinter (Owner)

We updated the paper to address this issue by explaining that the code was started before the widespread adoption of TensorFlow 2. We began converting the software to TensorFlow 2 several months ago, and early results are promising, so we wanted to document this stage of the project before further development.
