
Warning during save_hyperparameter() gives misleading advice? #13615


The attributes that are not saved as hparams need to be passed explicitly. Since you are using the `load_from_checkpoint` API, you can use `model = MyModule.load_from_checkpoint(ckpt_path, model=model)`.

If you include the model in the hparams, your checkpoints will be unnecessarily big, which can create issues with large models.

By

> is already saved during checkpointing.

the warning means that the model weights are already stored in the checkpoint and are loaded through the PyTorch API, not as hparams.

Answer selected by hogru
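
For concreteness, here is a minimal sketch of the pattern the answer describes, assuming a `LightningModule` that wraps a pre-built `nn.Module`. The class name, the backbone, and the `lr` hyperparameter are illustrative, not taken from the discussion:

```python
# A minimal sketch, assuming pytorch_lightning is installed.
# `MyModule`, `lr`, and the placeholder backbone are hypothetical names.
import torch
from torch import nn
import pytorch_lightning as pl


class MyModule(pl.LightningModule):
    def __init__(self, model: nn.Module, lr: float = 1e-3):
        super().__init__()
        # Exclude `model` from hparams: its weights are already stored in the
        # checkpoint's state_dict, so saving it again would bloat the file.
        self.save_hyperparameters(ignore=['model'])
        self.model = model

    def forward(self, x):
        return self.model(x)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.hparams.lr)


# Because `model` is not in hparams, pass it explicitly when restoring.
# `lr` is restored from the checkpoint's hparams automatically, and the
# backbone's weights are loaded from the checkpoint's state_dict.
model = nn.Linear(32, 10)  # placeholder backbone
restored = MyModule.load_from_checkpoint("path/to.ckpt", model=model)
```

With `ignore=['model']` the backbone stays out of the saved hparams, while its weights remain in the checkpoint's `state_dict` as usual, which is exactly why the warning says the attribute "is already saved during checkpointing".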
Labels: lightningmodule (`pl.LightningModule`), pl (generic label for the PyTorch Lightning package)