[BUG] TCN model cannot be saved when used with callbacks #2638
Thanks for raising this issue @MarcBresson. It indeed comes from the new parametrized weight norm, for which PyTorch doesn't support (pickle) serialization, but only in combination with callbacks (I'll explain further below). TL;DR: you can store the model if you remove the callbacks before calling `save()`.
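The underlying PyTorch limitation can be reproduced without Darts; a minimal sketch (the layer shapes are arbitrary):

```python
import torch
import torch.nn as nn
from torch.nn.utils.parametrizations import weight_norm

# The parametrization-based weight norm (unlike the deprecated hook-based
# torch.nn.utils.weight_norm) blocks pickling of the whole module.
layer = weight_norm(nn.Conv1d(3, 3, kernel_size=3))

torch.save(layer.state_dict(), "layer_sd.pt")  # works: the state_dict is picklable
torch.save(layer, "layer.pt")  # RuntimeError: Serialization of parametrized
                               # modules is only supported through state_dict()
```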
Explanation: to avoid these issues, we already prevent pickling the LightningModule itself when saving (it is stored separately as a Lightning checkpoint). Now (unfortunately), after training, the callbacks themselves also hold a reference to the LightningModule, so pickling the callbacks drags the parametrized module along with them. You can fix this issue by removing the callbacks before storing the model. Adding an option to strip the trainer parameters, training series, and other nonessential objects before saving has been in our backlog; I'll move it up the priority list.
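A minimal sketch of that workaround, assuming the callbacks were passed via `pl_trainer_kwargs` and therefore live in the model's `trainer_params` dict:

```python
# After fit(), the trained callbacks hold a reference to the parametrized
# LightningModule; drop them before pickling, then save as usual.
model.trainer_params["callbacks"] = []
model.save("tcn_model.pt")
```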
Ok, thank you. What is the role of the `trainer_params` attribute then? --EDIT-- To clarify: it seems like most of the info available in `trainer_params` is also available under the trainer itself.
There are three things:
Hope this clears things up.
Thank you very much, it is crystal clear. It seems like PyTorch Lightning checkpoints contain a lot of info, but it's probably not enough to recreate the Darts model on its own.
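For reference, the contents of a Lightning checkpoint can be inspected directly; this sketch assumes Darts' convention of writing the checkpoint next to the pickled model with a `.ckpt` suffix:

```python
import torch

# Top-level keys typically include 'state_dict', 'hyper_parameters',
# 'callbacks', 'optimizer_states', 'epoch', ...
ckpt = torch.load("tcn_model.pt.ckpt", map_location="cpu", weights_only=False)
print(list(ckpt.keys()))
```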
No worries :) Indeed, it's a great tool!
Describe the bug
For some reason, defining callbacks in `pl_trainer_kwargs` causes the `torch.save` call inside the `TCNModel.save()` method to serialize the LightningModule as well. Because the TCN model contains parametrized layers, torch raises the following error:
RuntimeError: Serialization of parametrized modules is only supported through state_dict().
The bug is not triggered when no custom callbacks are set.
To Reproduce
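A minimal reproduction consistent with the description above (the series, hyperparameters, and the choice of `EarlyStopping` as the callback are assumptions):

```python
import pandas as pd
from pytorch_lightning.callbacks import EarlyStopping
from darts import TimeSeries
from darts.models import TCNModel

# Any callback triggers the issue; EarlyStopping is just an example.
series = TimeSeries.from_series(pd.Series(range(100), dtype=float))

model = TCNModel(
    input_chunk_length=24,
    output_chunk_length=12,
    n_epochs=1,
    pl_trainer_kwargs={"callbacks": [EarlyStopping(monitor="train_loss")]},
)
model.fit(series)
model.save("tcn_model.pt")  # raises RuntimeError
```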
will output:

RuntimeError: Serialization of parametrized modules is only supported through state_dict().
System (please complete the following information):
Additional context
It likely comes from #2593.