Disable calling Tensor.requires_grad_() inside a functorch transform #849

Merged · 1 commit · Jun 3, 2022

Commits on Jun 1, 2022

  1. Disable calling Tensor.requires_grad_() inside a functorch transform

    Fixes #847
    
    We no longer allow users to call requires_grad_() inside a functorch
    transform. Calling requires_grad_() effectively requests another
    layer of autograd, but an in-place flag flip cannot deliver that:
    setting up a layer of autograd requires extra work (e.g. pushing
    autograd onto the DynamicLayerStack).
    
    Instead, when a user calls requires_grad_() (and similarly
    retain_grad()), we raise an informative error message (see the
    sketch after this commit message).
    
    This has the intended consequence of causing
    torch.autograd.functional.{jvp, vjp, jacobian} to error out when
    called inside a functorch transform. Users should use the functorch
    equivalents instead (see the second sketch at the end of this entry).
    
    Test Plan:
    - added tests
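
A minimal sketch of the behavior this commit introduces, assuming a functorch build that includes the change; the function and data are made-up illustrations, and the exact error text is not quoted from the source:

```python
import torch
from functorch import grad

def f(x):
    # t is a fresh leaf tensor; in eager mode this requires_grad_()
    # call would be legal, but inside grad() it now raises.
    t = torch.ones(3)
    t.requires_grad_()  # RuntimeError under a functorch transform
    return (x * t).sum()

x = torch.randn(3)
try:
    grad(f)(x)
except RuntimeError as err:
    print(err)
```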
    zou3519 committed Jun 1, 2022
    50e675f
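
And a hedged sketch of the suggested replacement: inside a transform, use functorch.jvp rather than torch.autograd.functional.jvp, which now errors because it calls requires_grad_() internally. The function and inputs below are illustrative assumptions:

```python
import torch
from functorch import vmap, jvp

def tangent_of_sin(x):
    # functorch.jvp composes with other functorch transforms such as
    # vmap; torch.autograd.functional.jvp would hit the new error here.
    _, tangent_out = jvp(torch.sin, (x,), (torch.ones_like(x),))
    return tangent_out

xs = torch.randn(4, 3)
print(vmap(tangent_of_sin)(xs))  # per-sample JVPs of sin, i.e. cos(xs)
```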