Remove optimizer for DeepSpeed inference #10640
Labels: feature, refactor, strategy: deepspeed, won't fix
Refactor/Feature
DeepSpeed recently merged support for making the optimizer optional for inference: microsoft/DeepSpeed#1514.
Currently our plugin always passes in an optimizer: https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/plugins/training_type/deepspeed.py#L573. This was required to partition the parameters appropriately, but it shouldn't be needed anymore :)
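A minimal sketch of what inference-only initialization could look like once the optimizer is optional; the `model` and `ds_config` values are hypothetical placeholders (the real plugin would use the LightningModule and the user-provided DeepSpeed config), and passing no optimizer is exactly the behavior microsoft/DeepSpeed#1514 enables:

```python
import deepspeed
import torch

# Hypothetical placeholders for the user's module and DeepSpeed config.
model = torch.nn.Linear(8, 2)
ds_config = {
    "train_micro_batch_size_per_gpu": 1,
    "zero_optimization": {"stage": 3},
}

# No `optimizer=...` argument: after microsoft/DeepSpeed#1514, ZeRO should
# still partition the parameters for inference without one.
model_engine, _, _, _ = deepspeed.initialize(model=model, config=ds_config)
model_engine.eval()  # run inference through the returned engine
```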
We should wait until a DeepSpeed release includes this change. We also need to be cautious in case the user runs an older DeepSpeed version that still requires an optimizer to be passed. @carmocca do you think we need a deprecation process for this?
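If we keep passing an optimizer on older DeepSpeed releases, a version guard along these lines might work; the cutoff version and the helper name below are assumptions, since the first release shipping microsoft/DeepSpeed#1514 isn't known yet:

```python
import deepspeed
from packaging.version import Version

# Assumed cutoff: replace "0.5.9" with the first DeepSpeed release that
# actually ships microsoft/DeepSpeed#1514.
_OPTIMIZER_IS_OPTIONAL = Version(deepspeed.__version__) >= Version("0.5.9")

def _initialize_kwargs(model, config, optimizer):
    # Hypothetical helper: only include the optimizer when the installed
    # DeepSpeed version still requires one for inference.
    kwargs = {"model": model, "config": config}
    if not _OPTIMIZER_IS_OPTIONAL:
        kwargs["optimizer"] = optimizer
    return kwargs
```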
Motivation
Cleaner, less hacky code.
cc @Borda @justusschock @awaelchli @akihironitta @SeanNaren