Implement gradient for SVD #614
Conversation
**Codecov Report:** All modified and coverable lines are covered by tests ✅

```
@@            Coverage Diff             @@
##             main     #614      +/-   ##
==========================================
+ Coverage   80.78%   80.81%   +0.02%
==========================================
  Files         162      162
  Lines       46757    46800      +43
  Branches    11440    11449       +9
==========================================
+ Hits        37773    37820      +47
+ Misses       6735     6733       -2
+ Partials     2249     2247       -2
```
Yes, but what's the reason for
For fun reference: https://people.maths.ox.ac.uk/gilesm/files/NA-08-01.pdf
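The key identity from Giles' note for the singular-value part of the gradient is that, for simple singular values, dσᵢ = uᵢᵀ (dA) vᵢ. A minimal NumPy sketch (not the PR's actual implementation, just an illustrative finite-difference check of that identity):

```python
import numpy as np

# Check Giles' identity d(sigma_i) = u_i^T (dA) v_i by comparing the
# analytic directional derivative against finite differences along a
# random perturbation direction dA.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

eps = 1e-6
dA = rng.standard_normal(A.shape)
s_plus = np.linalg.svd(A + eps * dA, compute_uv=False)
numeric = (s_plus - s) / eps  # finite-difference derivative of each sigma_i
analytic = np.array([U[:, i] @ dA @ Vt[i, :] for i in range(len(s))])
print(np.max(np.abs(numeric - analytic)))
```

A random Gaussian matrix has distinct singular values almost surely, so the simple-singular-value assumption holds here.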
Force-pushed from 207f55e to bb9b02b
I'm very confident the gradients are implemented correctly, but I'm getting shape errors in the gradient check for the non-square case. As usual, there's also no complex support. I'm hoping to get a 2nd pair of eyes on it to see if I'm missing something obvious. I wonder if it's related to the use of shared variables.
That shouldn't be a problem. I'll try to have a look, but that utility takes a while to debug.
False alarm, I had a dumb mistake in the gradients. This should be good to go now. I still hate shared variables.
Force-pushed from 6802239 to 3f1f902
Force-pushed from f2e33dc to fb4e5b5
Force-pushed from fb4e5b5 to 06f4e4e
Force-pushed from 0651cc0 to 99a703c
Force-pushed from 99a703c to 196b5e4
Implement gradients for SVD
Description
I just copied the implementation from autograd here. The base case (`compute_uv=False`, `full_matrices=False`) works and the test passes.

Marked as a draft because I'm not sure how to handle multiple output gradients in a test. I'm not sure my copy/paste job is correct until I verify (there are some slight differences between our SVD and the reference I was following -- I think we return `V`, but they returned `V.conj().T`. This is a numpy vs matlab thing, I think).

Even if we compute all the matrices, we only get a gradient for the singular values. Is there a way to return `[NoGrad, GradientGraph, NoGrad]`?

Related Issue
- SVD #56

Checklist
Type of change
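The `V` vs `V.conj().T` convention gap mentioned in the description can be illustrated directly: NumPy's `svd` returns `Vh` (i.e. `V.conj().T`) as its third output, whereas MATLAB-style references work with `V` itself, so a gradient formula written in terms of `V` must transpose NumPy's output first. A small sketch of the two reconstruction conventions:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
# NumPy convention: third output is Vh, so A == U @ diag(s) @ Vh.
U, s, Vh = np.linalg.svd(A, full_matrices=False)
recon_numpy = U @ np.diag(s) @ Vh

# MATLAB-style convention: with V = Vh.conj().T, A == U @ diag(s) @ V'.
V = Vh.conj().T
recon_matlab = U @ np.diag(s) @ V.conj().T

print(np.allclose(A, recon_numpy), np.allclose(A, recon_matlab))
```

Mixing up which convention a reference formula assumes produces exactly the kind of transposed-shape errors seen in the non-square gradient check.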