This repository has been archived by the owner on Sep 27, 2020. It is now read-only.
Thank you for your comment. I am new to PyTorch and your explanation is not very clear to me; could you elaborate? To my knowledge, nn.Parameters are included in model.parameters(), so they are updated with each call to optimizer.step() (what do you mean by saying they will never be updated?). I suppose Wci, Wcf, and Wco should also be trainable in the network.
- If you use plain Variables with the requires_grad flag, torch.save(model.state_dict()) will not actually save those parameters, even though they may be updated every epoch.
- However, if you declare an optimizer like Adam(model.parameters()), those Variables will never be updated, because they are not returned by model.parameters().
Is there any reason for initializing Wci, Wcf, and Wco in ConvLSTMCell as autograd Variables rather than nn.Parameters?
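The difference being debated can be sketched as follows (a minimal illustration, not the repository's actual ConvLSTMCell; note that in modern PyTorch, Variable has been merged into Tensor, so a plain tensor with requires_grad=True plays the role of the old autograd Variable):

```python
import torch
import torch.nn as nn

class CellWithParameter(nn.Module):
    def __init__(self):
        super().__init__()
        # Registered: appears in .parameters() (so the optimizer sees it)
        # and in the state_dict (so torch.save preserves it).
        self.Wci = nn.Parameter(torch.zeros(1, 3, 4, 4))

class CellWithPlainTensor(nn.Module):
    def __init__(self):
        super().__init__()
        # NOT registered: a plain tensor attribute, even with requires_grad=True,
        # is invisible to both .parameters() and .state_dict().
        self.Wci = torch.zeros(1, 3, 4, 4, requires_grad=True)

a = CellWithParameter()
b = CellWithPlainTensor()
print(len(list(a.parameters())), 'Wci' in a.state_dict())  # 1 True
print(len(list(b.parameters())), 'Wci' in b.state_dict())  # 0 False
```

This is why an optimizer constructed from model.parameters() never updates the plain-tensor version, and why torch.save(model.state_dict()) silently drops it.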