MisconfigurationException "MultiStepLR" with torch 2.0 #15912
Comments
Thank you @teamclouday, this is something I bumped into as well. I'll look into it shortly.
Thank you for resolving this! Closing.
The same issue exists for CosineAnnealingLR: `lightning_lite.utilities.exceptions.MisconfigurationException: The provided lr scheduler CosineAnnealingLR doesn't follow PyTorch's LRScheduler API.`
@lantiga
@mviti Just seeing this now. If you're a Lightning Lite user, note that the project has evolved into Fabric since 1.9.0: https://pytorch-lightning.readthedocs.io/en/stable/fabric/fabric.html
@mactavish91 Just to understand the context: are you using transformers with the Lightning Trainer? Can you post minimal code to reproduce? The reason these issues happen is that PyTorch 2.0 changed the hierarchy of scheduler classes. They all used to derive from `_LRScheduler`; in 2.0 the built-in schedulers derive from the new public `LRScheduler` base class instead, and `_LRScheduler` is kept only as a subclass of it for backward compatibility.
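A quick sanity check makes the hierarchy change visible (a sketch, assuming torch 2.0 is installed):

```python
# Sketch: inspect the scheduler class hierarchy on torch 2.0.
from torch.optim.lr_scheduler import LRScheduler, MultiStepLR, _LRScheduler

# The private base class is now a subclass of the new public one ...
print(issubclass(_LRScheduler, LRScheduler))  # True
# ... and built-in schedulers derive from `LRScheduler` directly, so an
# `isinstance(scheduler, _LRScheduler)` check no longer matches them.
print(issubclass(MultiStepLR, _LRScheduler))  # False
print(issubclass(MultiStepLR, LRScheduler))   # True
```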
@lantiga Yes, the code is:
The error message is:
When I override `lr_scheduler_step` instead of configuring the scheduler in `configure_optimizers`, the lr remains unchanged at 0.0 and I don't know why.
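For reference, here is a minimal sketch of the hook the error message suggests overriding (the module, optimizer, and milestones are illustrative, not taken from this thread; the hook signature shown is the pytorch_lightning 1.8/1.9 one). If the override never calls `scheduler.step()`, the lr will stay frozen exactly as described.

```python
import torch
import pytorch_lightning as pl

class MyModule(pl.LightningModule):  # illustrative module, not from the thread
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[10, 20])
        return {"optimizer": optimizer, "lr_scheduler": scheduler}

    def lr_scheduler_step(self, scheduler, optimizer_idx, metric):
        # Mirror the default behavior: the scheduler must actually be
        # stepped here, otherwise the lr never changes.
        if metric is None:
            scheduler.step()
        else:
            scheduler.step(metric)
```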
I have the same issue with
Any further update on this issue? I am using
@mviti The same fix works for CosineAnnealingLR: modify the file optimizer.py in pytorch_lightning (my version is 1.8.1):
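The snippet itself didn't survive in the thread; what follows is a hypothetical reconstruction of the kind of edit being described (the actual pytorch_lightning 1.8.1 source may differ slightly):

```python
# Hypothetical reconstruction of the described edit; not the verbatim
# pytorch_lightning 1.8.1 source. The validation there is roughly:
#
#   if not isinstance(scheduler, LRSchedulerTypeTuple):
#       raise MisconfigurationException("The provided lr scheduler ...")
#
# Pointing the tuple at torch 2.0's public base class lets built-in
# schedulers such as CosineAnnealingLR pass the check again:
from torch.optim.lr_scheduler import LRScheduler, ReduceLROnPlateau

LRSchedulerTypeTuple = (LRScheduler, ReduceLROnPlateau)  # was: (_LRScheduler, ReduceLROnPlateau)
```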
I have the same problem with OneCycleLR. This is part of my code:
And when I run the training code block, I get the following error:
My PyTorch version can be seen here:
How can I fix this issue?
EDIT: I changed `LRSchedulerTypeTuple` in lightning/src/pytorch_lightning/utilities/types.py to `torch.optim.lr_scheduler.LRScheduler`, but I still get the same issue. What can I do?
Bug description
Got the following error after upgrading to torch 2.0:
`lightning_lite.utilities.exceptions.MisconfigurationException: The provided lr scheduler MultiStepLR doesn't follow PyTorch's LRScheduler API. You should override the LightningModule.lr_scheduler_step hook with your own logic if you are using a custom LR scheduler.`

How to reproduce the bug
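The report doesn't include a runnable snippet; a minimal reproduction consistent with it might look like this (model, data, and hyperparameters are illustrative):

```python
import torch
import pytorch_lightning as pl

class BoringModel(pl.LightningModule):  # illustrative repro model
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        return self(batch).sum()

    def configure_optimizers(self):
        optimizer = torch.optim.SGD(self.parameters(), lr=0.1)
        # With torch 2.0, this built-in scheduler no longer passes
        # Lightning's `_LRScheduler` isinstance check, triggering the
        # MisconfigurationException quoted above.
        scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[1])
        return {"optimizer": optimizer, "lr_scheduler": scheduler}

# A raw tensor works as a map-style dataset for this sketch.
train_data = torch.utils.data.DataLoader(torch.randn(64, 32), batch_size=8)
trainer = pl.Trainer(max_epochs=1)
trainer.fit(BoringModel(), train_data)
```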
Error messages and logs
Environment
More info
The error came from https://github.com/Lightning-AI/lightning/blob/6cc493360d9dfdd132665343d6611e66e9760885/src/pytorch_lightning/core/optimizer.py#L351

`LRSchedulerTypeTuple` is using `torch.optim.lr_scheduler._LRScheduler` to identify torch schedulers: https://github.com/Lightning-AI/lightning/blob/6cc493360d9dfdd132665343d6611e66e9760885/src/pytorch_lightning/utilities/types.py#L114

However, PyTorch has changed `_LRScheduler` to be a subclass of `LRScheduler` (link), and `MultiStepLR` is a subclass of `LRScheduler` as well (link). A fix could be to change `LRSchedulerTypeTuple` to use `torch.optim.lr_scheduler.LRScheduler`.
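A backward-compatible form of that suggestion might look like the sketch below (assuming, as in the linked file, that the tuple should also keep matching `ReduceLROnPlateau`):

```python
# Sketch of a version-compatible LRSchedulerTypeTuple: pick whichever
# base class the installed torch exposes, so schedulers are recognized
# on torch < 2.0 (_LRScheduler) and torch >= 2.0 (LRScheduler) alike.
import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

LRSchedulerType = getattr(
    torch.optim.lr_scheduler, "LRScheduler", torch.optim.lr_scheduler._LRScheduler
)
LRSchedulerTypeTuple = (LRSchedulerType, ReduceLROnPlateau)
```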
Updated by @akihironitta: This issue is related to #15894.