
Test AD on more algorithms #196

Merged 9 commits from adtests into SciML:master on Jan 3, 2024
Conversation

@lxvm (Collaborator) commented Nov 3, 2023

Goal is to address #150 and #57.
TODO:

  • Loop through all algorithms in AD tests
  • Add FastGaussQuadrature to AD tests
  • Fix any bugs that come up

@lxvm lxvm marked this pull request as ready for review January 3, 2024 10:06
@lxvm changed the title from "[WIP] Test AD on more algorithms" to "Test AD on more algorithms" on Jan 3, 2024
@lxvm (Collaborator, Author) commented Jan 3, 2024

@ChrisRackauckas This PR should be good to go! The test suite now exercises each algorithm in the library, for each problem type, on several kinds of inputs. I ran into some Zygote issues and opened upstream PRs to fix them, which would let us simplify the test suite; in the meantime I managed to write differentiable integrands.
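
For illustration, a minimal sketch of the test pattern, assuming the `IntegralProblem(f, domain, p)` form of the Integrals.jl interface; the algorithm list, integrand, and tolerances here are illustrative, not the actual test suite:

```julia
using Integrals, ForwardDiff, Zygote, Test

f(x, p) = p[1] * sin(p[2] * x)      # scalar parametric integrand
domain = (0.0, 1.0)

integral(p, alg) = solve(IntegralProblem(f, domain, p), alg;
                         reltol = 1e-9, abstol = 1e-9).u

for alg in (QuadGKJL(), HCubatureJL())   # loop over algorithms under test
    p = [2.0, 3.0]
    g_fwd = ForwardDiff.gradient(q -> integral(q, alg), p)
    g_rev = Zygote.gradient(q -> integral(q, alg), p)[1]
    @test g_fwd ≈ g_rev rtol = 1e-6      # forward and reverse mode agree
end
```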

Also, I didn't add the Monte Carlo algorithms to these tests because the randomness slows the convergence. Perhaps the primal and dual computations should be done in the same library call, so that the same quadrature points are used for both; I believe that is also the approach StochasticAD takes.
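
A toy illustration of why sharing the quadrature points matters (this is not library code, just a common-random-numbers demonstration):

```julia
using Random, Statistics

# Monte Carlo estimate of ∫₀¹ sin(px) dx with 10_000 uniform samples
mc_integral(p, rng) = mean(sin(p * x) for x in rand(rng, 10_000))

p, h = 2.0, 1e-4
# Independent seeds: the O(1/√N) sampling noise is amplified by 1/h,
# so this central difference is dominated by noise.
noisy = (mc_integral(p + h, Xoshiro(1)) - mc_integral(p - h, Xoshiro(2))) / (2h)
# Same seed on both sides (shared sample points): the noise cancels.
shared = (mc_integral(p + h, Xoshiro(1)) - mc_integral(p - h, Xoshiro(1))) / (2h)
# Exact derivative of (1 - cos(p))/p with respect to p, for comparison:
exact = (p * sin(p) - (1 - cos(p))) / p^2
```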

@ChrisRackauckas (Member) commented:
This looks great!

> Also, I didn't add the Monte Carlo algorithms to these tests because the randomness slows the convergence. Perhaps the primal and dual computations should be done in the same library call, so that the same quadrature points are used for both; I believe that is also the approach StochasticAD takes.

What's the issue though with CubaCuhre? I'm a bit surprised to see that one lumped in there.

Indeed, when the algorithm can be differentiated through, I think it is more stable to have it use the same points, so we should probably have a trait that effectively passes the duals along into those algorithms. Only the pure Julia algorithms qualify, though, which is a rather small subset that can opt in.
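
A hypothetical sketch of such a trait; the names here are illustrative, not an existing Integrals.jl API:

```julia
# Pure-Julia algorithms opt in to receiving Dual numbers directly, so the
# primal and dual evaluations share the same quadrature points.
abstract type AbstractAlg end
struct PureJuliaAlg <: AbstractAlg end   # e.g. a QuadGK-style solver
struct WrappedCAlg  <: AbstractAlg end   # e.g. a Cuba-style binding

passes_duals(::AbstractAlg)  = false     # conservative default: opt out
passes_duals(::PureJuliaAlg) = true      # pure Julia solvers can opt in

# Dispatch on the trait: push duals through the solver itself, or fall
# back to differentiating around the solver with a custom rule.
ad_strategy(alg) = passes_duals(alg) ? :dual_passthrough : :custom_rule

ad_strategy(PureJuliaAlg())  # => :dual_passthrough
ad_strategy(WrappedCAlg())   # => :custom_rule
```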

@ChrisRackauckas ChrisRackauckas merged commit 7f71009 into SciML:master Jan 3, 2024
6 of 8 checks passed
@lxvm lxvm deleted the adtests branch January 3, 2024 16:23
@lxvm (Collaborator, Author) commented Jan 3, 2024

I didn't pay attention to CubaCuhre specifically, but one of the Cuba algorithms failed the convergence test, which compares finite-difference gradients to ForwardDiff gradients and requires agreement to 3 digits. Thinking about it, finite differences on the MC algorithms are even worse, since each integral evaluation has its own random seed, and the same is true when comparing different ADs. Anyway, that can be addressed in a follow-up PR.
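
A sketch of the kind of check in question, assuming FiniteDiff.jl for the finite-difference side; the integrand and algorithm are illustrative, and the `1e-3` tolerance is the "3 digits" mentioned above:

```julia
using Integrals, ForwardDiff, FiniteDiff, Test

g(x, p) = sin(p[1] * x[1]) * sin(p[2] * x[2])   # 2-D parametric integrand
integral(p) = solve(IntegralProblem(g, ([0.0, 0.0], [1.0, 1.0]), p),
                    HCubatureJL(); reltol = 1e-9, abstol = 1e-9).u

p = [1.0, 2.0]
g_ad = ForwardDiff.gradient(integral, p)
g_fd = FiniteDiff.finite_difference_gradient(integral, p)
@test g_ad ≈ g_fd rtol = 1e-3   # deterministic rules pass; MC-based ones may not
```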

Regarding passing duals into the integrand in the forward pass: it is already implemented, and it assumes the library supports vector-valued integrands.
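
A conceptual sketch of that idea (simplified, not the actual implementation): a Dual-valued parameter turns a scalar integrand into a vector-valued one, so the value and its partials are integrated in a single quadrature:

```julia
using Integrals
using ForwardDiff: Dual, Tag, value, partials

f(x, p) = sin(p * x)                      # scalar integrand, scalar parameter
T = typeof(Tag(f, Float64))
pdual = Dual{T}(2.0, 1.0)                 # primal value 2.0, one seeded partial

# Stack the primal and the partial into a vector-valued integrand...
fvec(x, _) = (y = f(x, pdual); [value(y), partials(y)[1]])
prob = IntegralProblem(fvec, (0.0, 1.0))
sol = solve(prob, QuadGKJL())
# ...then reassemble a Dual from the integrated components:
# sol.u[1] is ∫f dx and sol.u[2] is d/dp ∫f dx, from the same quadrature.
dint = Dual{T}(sol.u[1], sol.u[2])
```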

@lxvm lxvm mentioned this pull request Jan 3, 2024
@lxvm lxvm mentioned this pull request Mar 2, 2024