Deprecate the FairScale integration #16353
Conversation
⚡ Required checks status: All passing 🟢

Groups summary:
- 🟢 pytorch_lightning: Tests workflow
- 🟢 pytorch_lightning: Azure GPU
- 🟢 pytorch_lightning: Azure HPU
- 🟢 pytorch_lightning: Azure IPU
- 🟢 pytorch_lightning: Docs
- 🟢 mypy
- 🟢 install

Thank you for your contribution! 💜
There's an issue with standalone tests hanging, marking as a draft...
Pull request was converted to draft
Force-pushed b9599d1 to defaea4
The hang is caused by

```python
with pytest.deprecated_call():
    trainer = Trainer()
    trainer.fit()
```

when multiple processes are used: the deprecation message is only shown on rank 0, so the other ranks fail the `pytest.deprecated_call()` check.
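To illustrate why a rank-aware deprecation breaks this pattern, here is a minimal single-process sketch. The `emit_deprecation` helper is hypothetical, standing in for Lightning's rank-zero warning utilities; it only warns on rank 0, so a recorder on any other rank catches nothing, which is exactly the condition under which `pytest.deprecated_call()` fails on that rank.

```python
import warnings


def emit_deprecation(global_rank: int) -> None:
    # Hypothetical stand-in for a rank-zero-only deprecation:
    # only the process with global_rank == 0 emits the warning.
    if global_rank == 0:
        warnings.warn("FairScale has been deprecated", DeprecationWarning)


# Rank 0 emits the deprecation warning.
with warnings.catch_warnings(record=True) as caught_rank0:
    warnings.simplefilter("always")
    emit_deprecation(global_rank=0)

# Any other rank emits nothing, so pytest.deprecated_call() would
# raise "DID NOT WARN" there and the ranks desynchronize.
with warnings.catch_warnings(record=True) as caught_rank1:
    warnings.simplefilter("always")
    emit_deprecation(global_rank=1)

print(len(caught_rank0), len(caught_rank1))
```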
Force-pushed 996de14 to 48c789a
What does this PR do?
See title.
I haven't added deprecation tests as they are extensively covered in the sharded tests.
All deprecations use the exact same message, so there's no need to check whether it has already been shown (the `warnings` module suppresses duplicate messages by default).
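The duplicate suppression can be checked with a small standalone sketch, no Lightning involved: under the `"default"` filter action, a second identical warning raised from the same location is not shown again.

```python
import warnings


def deprecated_api() -> None:
    # Same message, same category, same source line every time.
    warnings.warn("this API is deprecated", DeprecationWarning)


with warnings.catch_warnings(record=True) as caught:
    # "default" prints the first occurrence of each unique warning
    # (keyed on message, category, and location) and drops repeats.
    warnings.simplefilter("default")
    deprecated_api()
    deprecated_api()

print(len(caught))
```

This is why emitting the deprecation from several code paths is safe: the user still sees it once.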
Does your PR introduce any breaking changes? If yes, please list them.
None
cc @Borda @tchaton @awaelchli