torch.autograd.grad #2015
Unanswered · kudoryutaro asked this question in Q&A
Replies: 1 comment
-
You could potentially use autograd as a PyTorch op in conjunction with TensorRT ops via Torch-TensorRT, but gradients will not be propagated through the TensorRT components. So in a use case like this, I'm not sure it would work, since you need the gradient information from the sum. @frank-wei do you have any ideas about how we could handle cases like this?
-
Can I use autograd.grad with torch-tensorrt?
We pass the coordinates of the atoms to the model, calculate the potential energy, and then differentiate the potential energy with respect to pos to calculate the force. In model.forward() we use torch.autograd.grad. Is it possible to convert this model with Torch-TensorRT?
Pseudo Code
class Model(torch.nn.Module):
    def forward(self, pos: torch.Tensor):
        potential_energy = torch.sum(pos)  # potential_energy is a scalar
        # autograd.grad returns a tuple with one gradient per input tensor;
        # pos must have requires_grad=True for this to work
        (force,) = torch.autograd.grad(potential_energy, pos)
        return potential_energy, force
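For reference, a minimal runnable sketch of this pattern in plain eager PyTorch (before any Torch-TensorRT conversion). The class name EnergyModel and the input shapes are illustrative, not from the thread; the energy function is just torch.sum, as in the pseudo code above:

```python
import torch

class EnergyModel(torch.nn.Module):  # illustrative name
    def forward(self, pos: torch.Tensor):
        # pos must require gradients so autograd.grad can differentiate
        # the scalar energy with respect to it
        potential_energy = torch.sum(pos)
        # create_graph=True keeps the graph in case higher-order
        # derivatives (e.g. a Hessian) are needed later
        (force,) = torch.autograd.grad(potential_energy, pos, create_graph=True)
        return potential_energy, force

pos = torch.randn(10, 3, requires_grad=True)  # e.g. 10 atoms in 3D
energy, force = EnergyModel()(pos)
# For energy = sum(pos), the gradient w.r.t. every coordinate is 1
print(torch.allclose(force, torch.ones_like(pos)))  # True
```

This works in eager mode because the full graph from pos to potential_energy is available; as the reply notes, once part of that graph is lowered to TensorRT, autograd can no longer trace through it.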