Add assertions to TPU accelerator for device availability #11799
Conversation
Force-pushed from 136d174 to 88e65e8
@@ -98,6 +98,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Added a `_Stateful` support for `LightningDataModule` ([#11637](https://github.com/PyTorchLightning/pytorch-lightning/pull/11637))

- Added checks to `TPUAccelerator.setup_environment` to assert device availability ([#11799](https://github.com/PyTorchLightning/pytorch-lightning/pull/11799))
Suggested change:
- - Added checks to `TPUAccelerator.setup_environment` to assert device availability ([#11799](https://github.com/PyTorchLightning/pytorch-lightning/pull/11799))
+ - Added checks to `TPUAccelerator` to assert TPU device availability ([#11799](https://github.com/PyTorchLightning/pytorch-lightning/pull/11799))
""" | ||
Raises: | ||
MisconfigurationException: | ||
If the TPU device is not available. |
Suggested change:
-         If the TPU device is not available.
+         If the `torch_xla` is not installed.
    MisconfigurationException:
        If the TPU device is not available.
"""
if not _XLA_AVAILABLE:
should we add another one for TPU_AVAILABLE?
#3104
LGTM !
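For context on the question above, here is a minimal sketch of how a second guard could sit next to the `_XLA_AVAILABLE` check inside `TPUAccelerator.setup_environment`. The `tpu_devices_detected` helper and the exact method body are assumptions for illustration only, not the code merged in this PR:

```python
# Sketch only: `tpu_devices_detected` is a hypothetical helper and this method
# body is an assumption for illustration, not the implementation in this PR.
from pytorch_lightning.utilities.exceptions import MisconfigurationException

_XLA_AVAILABLE = False  # stand-in for Lightning's import-time torch_xla check


def tpu_devices_detected() -> bool:
    """Hypothetical runtime probe answering the TPU_AVAILABLE question above."""
    return False


class TPUAccelerator:
    def setup_environment(self) -> None:
        # First guard: torch_xla must be importable at all.
        if not _XLA_AVAILABLE:
            raise MisconfigurationException(
                "The TPU Accelerator requires torch_xla and a TPU device to run."
            )
        # Second, stricter guard: the library is present but no devices exist.
        if not tpu_devices_detected():
            raise MisconfigurationException(
                "torch_xla is installed but no TPU devices were detected."
            )
```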
        If the TPU device is not available.
"""
if not _XLA_AVAILABLE:
    raise MisconfigurationException("The TPU Accelerator requires torch_xla and a TPU device to run.")
Suggested change:
-     raise MisconfigurationException("The TPU Accelerator requires torch_xla and a TPU device to run.")
+     raise MisconfigurationException("The TPU Accelerator requires the `torch_xla` library and `TPU devices` to run.")
Closing out in favor of the approach in #11797 |
What does this PR do?
Adds assertions to the TPU accelerator for device availability.
This aims to simplify the accelerator connector logic and the rewrite effort in #11448.
It moves the assertion logic from `Trainer` constructor checks to runtime, at the `trainer.fit`/`validate`/`test`/`predict` calls.
Related PR: #11797
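To make the "constructor checks moved to runtime" point concrete, here is a minimal sketch with simplified stand-ins for `Trainer` and `TPUAccelerator`; it shows where the availability assertion would now surface, and is not the actual Lightning control flow:

```python
# Sketch only: simplified stand-ins to show *when* the assertion runs,
# not the real pytorch_lightning internals.
class MisconfigurationException(Exception):
    pass


_XLA_AVAILABLE = False  # pretend torch_xla is not installed


class TPUAccelerator:
    def setup_environment(self) -> None:
        # Runtime assertion added by this PR (per the diff above).
        if not _XLA_AVAILABLE:
            raise MisconfigurationException(
                "The TPU Accelerator requires torch_xla and a TPU device to run."
            )


class Trainer:
    def __init__(self, accelerator: TPUAccelerator) -> None:
        # No device-availability check at construction time anymore.
        self.accelerator = accelerator

    def fit(self) -> None:
        # The assertion now fires at fit/validate/test/predict time.
        self.accelerator.setup_environment()


trainer = Trainer(accelerator=TPUAccelerator())  # succeeds even without a TPU
try:
    trainer.fit()  # raises on a machine without torch_xla / TPU devices
except MisconfigurationException as err:
    print(f"caught at runtime: {err}")
```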
Does your PR introduce any breaking changes? If yes, please list them.
Before submitting
PR review
Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the Review guidelines.
Did you have fun?
Make sure you had fun coding 🙃