Recently we've discussed implementing automatic differentiation (AD) in FTorch using Torch's built-in autograd engine. The first step towards this would be to overload elementary operations for the Fortran `torch_tensor` class: assignment, addition, multiplication of two tensors, multiplication of a tensor by a scalar, etc.
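A rough sketch of what the overloading could look like on the Fortran side, assuming the existing `torch_tensor` derived type from the `ftorch` module; the module and procedure names below are placeholders, not confirmed FTorch API, and the bodies would ultimately need to dispatch into libtorch via the C bindings so that autograd records each operation:

```fortran
! Hypothetical sketch: operator overloading for torch_tensor.
! Procedure names are illustrative only.
module ftorch_operators_sketch
  use ftorch, only : torch_tensor   ! existing FTorch tensor type
  implicit none

  interface assignment(=)
    module procedure torch_tensor_assign
  end interface

  interface operator(+)
    module procedure torch_tensor_add
  end interface

  interface operator(*)
    module procedure torch_tensor_multiply
    module procedure torch_tensor_scalar_multiply
  end interface

contains

  subroutine torch_tensor_assign(output, input)
    type(torch_tensor), intent(out) :: output
    type(torch_tensor), intent(in)  :: input
    ! ... copy/alias the underlying Torch tensor via the C bindings ...
  end subroutine torch_tensor_assign

  function torch_tensor_add(a, b) result(c)
    type(torch_tensor), intent(in) :: a, b
    type(torch_tensor) :: c
    ! ... call libtorch's add so autograd records the operation ...
  end function torch_tensor_add

  function torch_tensor_multiply(a, b) result(c)
    type(torch_tensor), intent(in) :: a, b
    type(torch_tensor) :: c
    ! ... elementwise multiply via libtorch ...
  end function torch_tensor_multiply

  function torch_tensor_scalar_multiply(scalar, a) result(c)
    real, intent(in) :: scalar
    type(torch_tensor), intent(in) :: a
    type(torch_tensor) :: c
    ! ... scalar multiply via libtorch ...
  end function torch_tensor_scalar_multiply

end module ftorch_operators_sketch
```

With interfaces like these in place, user code could write `c = a + b` or `c = 2.0 * a` directly on `torch_tensor` objects, and the gradients would then be recoverable through autograd once a backward pass is exposed.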
Related to #111.