Autograd tests for Transforms #1414
Comments
- @mthrok: @yoyololicon Thanks. Let me know if you need help there.
- I will be working on […]
- Working on […]
- I don't think […]
- Hi @yoyololicon […]
- I'm taking a look at […]
- I will try […]
- I'm going to take a stab at […]
- Now all the transforms with autograd support are properly tested. Thanks for the help!
Until recently, we have assumed that ops provided in torchaudio support autograd simply because they are implemented with PyTorch. However, this assumption was not always correct. For example, in #704 it was pointed out that `lfilter` does not support autograd, and this was resolved in #1310 with proper unit tests by a community contribution. Similarly, as part of #1337, I added autograd tests for some transforms in #1340. We would like to extend autograd testing to as many functionals and transforms as possible, and we would like to ask for your help.

Steps
NOTE: Please first try adding the test without providing `nondet_tol`, to see whether the transform's backward pass is deterministic. If you see a "Backward is not reentrant" error message, the backward pass is not deterministic. Please report it back here so that we can discuss how to handle it.

You can run the tests with `(cd test && pytest torchaudio_unittest/transforms/autograd_*_test.py)`.

cc @mthrok in the description (or a comment).

Note: We are not sure whether all the transforms actually support autograd. If you find a transform that does not support autograd, please report back in this thread.
For instructions on setting up the development environment, please refer to CONTRIBUTING.md.
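As a sketch of what such a test boils down to (this is an illustration using plain `torch`, not the actual torchaudio test-suite code; the `check_autograd` helper and the stand-in `spectrogram` function are hypothetical), the core of each test is a `torch.autograd.gradcheck` call in double precision, with `nondet_tol` forwarded as discussed above:

```python
import torch

# Hypothetical helper (not torchaudio test-suite code): run gradcheck on a
# callable in double precision. gradcheck raises on failure and returns True
# on success; nondet_tol is only needed for non-deterministic backward passes.
def check_autograd(fn, waveform, nondet_tol=0.0):
    x = waveform.to(torch.float64).requires_grad_(True)
    return torch.autograd.gradcheck(fn, (x,), nondet_tol=nondet_tol)

# Stand-in "transform": a power spectrogram built directly from torch.stft,
# standing in for something like torchaudio.transforms.Spectrogram.
def spectrogram(x):
    window = torch.hann_window(32, dtype=x.dtype)
    spec = torch.stft(x, n_fft=32, window=window, return_complex=True)
    return spec.real ** 2 + spec.imag ** 2

waveform = torch.rand(1, 128, dtype=torch.float64)
print(check_autograd(spectrogram, waveform))
```

Double precision matters here: with `float32` inputs the numerical Jacobian is too noisy and `gradcheck` reports spurious failures.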
Transforms

- `AmplitudeToDB`: Add autograd test for T.AmplitudeToDB #1447. It is used by `MFCC`, so it should support autograd, but having an independent test is nice.
- `Spectrogram`: Add autograd test for T.Spectrogram/T.MelSpectrogram #1340
- `GriffinLim`: Add autograd test for T.GriffinLim #1421
- `MelScale`: Add autograd test for T.MelScale #1467. It is used by `MelSpectrogram`, so it should support autograd, but having an independent test is nice.
- `InverseMelScale`: this transform most likely does not support autograd
- `MelSpectrogram`: Add autograd test for T.Spectrogram/T.MelSpectrogram #1340
- `MFCC`: Autograd tests for Transforms MFCC #1415
- `MuLawEncoding` / `MuLawDecoding`: `MuLaw`s are quantization operations, thus they do not support autograd
- `Resample`: Add Autograd test for T.Resample #1416
- `ComputeDeltas`: Add autograd test for T.ComputeDeltas #1422
- `TimeStretch`: Add autograd test to T.TimeStretch (and F.phase_vocoder) #1420. NOTE: `TimeStretch` requires a coordinate transform from cartesian to polar, which uses `atan2`, and `atan2` is not differentiable around zero.
- `Fade`: Add autograd test for T.Fade #1424
- `FrequencyMasking`: Add autograd tests for TimeMasking/FrequencyMasking #1498
- `TimeMasking`: Add autograd tests for TimeMasking/FrequencyMasking #1498
- `Vol`: Add autograd test for T.Vol #1460
- `SlidingWindowCmn`: Adding test for T.SlidingWindowCmn #1482
- `SpectralCentroid`: Add autograd test for T.SpectralCentroid #1425
- `Vad`: this transform most likely does not support autograd
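The `atan2` caveat mentioned for `TimeStretch` is easy to demonstrate with plain `torch` (an illustration, not torchaudio code): the analytic gradient of `atan2` blows up as its arguments approach the origin, which is why gradient checks near zero are unreliable there.

```python
import torch

# Illustration with plain torch (not torchaudio code): the gradient of
# atan2(y, x) is (-y, x) / (x**2 + y**2), so its magnitude grows like
# 1/scale as (x, y) approaches (0, 0).
for scale in (1.0, 1e-3, 1e-6):
    x = torch.tensor([scale], dtype=torch.float64, requires_grad=True)
    y = torch.tensor([scale], dtype=torch.float64, requires_grad=True)
    torch.atan2(y, x).sum().backward()
    print(f"scale={scale:g}  dx={x.grad.item():+.3e}  dy={y.grad.item():+.3e}")
# At scale s the gradients are -1/(2s) and +1/(2s), so they diverge near zero.
```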