
Add assertions to TPU accelerator for device availability #11799

Closed
ananthsub wants to merge 2 commits from the feat/tpu-validation branch

Conversation

@ananthsub (Contributor) commented on Feb 7, 2022

What does this PR do?

Add assertions to TPU accelerator for device availability
This aims to simplify the accelerator connector logic and the rewrite effort in #11448.

This moves the assertion logic from the Trainer constructor checks to runtime, at the trainer.fit/validate/test/predict calls (a rough sketch follows below).

Related PR: #11797
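
A minimal sketch of what such a runtime check could look like, reconstructed from the diff fragments quoted in the review threads below. The actual method signature and module layout in the PR may differ, and the local `_XLA_AVAILABLE` stand-in here is illustrative only:

```python
# Sketch only: reconstructed from the snippets quoted in the review threads below.
# In the PR this lives on TPUAccelerator in pytorch_lightning/accelerators/tpu.py;
# the signature and the local _XLA_AVAILABLE definition here are assumptions.
from pytorch_lightning.utilities.exceptions import MisconfigurationException

try:
    import torch_xla  # noqa: F401

    _XLA_AVAILABLE = True
except ImportError:
    _XLA_AVAILABLE = False


def setup_environment() -> None:
    """Validate the TPU environment before any fit/validate/test/predict run.

    Raises:
        MisconfigurationException:
            If the TPU device is not available.
    """
    if not _XLA_AVAILABLE:
        raise MisconfigurationException("The TPU Accelerator requires torch_xla and a TPU device to run.")
```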

Does your PR introduce any breaking changes? If yes, please list them.

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • [n/a] Did you list all the breaking changes introduced by this pull request?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

Anyone in the community is welcome to review the PR.
Before you start reviewing make sure you have read Review guidelines. In short, see the following bullet-list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

Make sure you had fun coding 🙃

Review thread on pytorch_lightning/accelerators/tpu.py (outdated; resolved)
@ananthsub force-pushed the feat/tpu-validation branch from 136d174 to 88e65e8 on February 8, 2022 at 06:37
@mergify (bot) removed the "has conflicts" label on Feb 8, 2022
@@ -98,6 +98,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Added a `_Stateful` support for `LightningDataModule` ([#11637](https://github.com/PyTorchLightning/pytorch-lightning/pull/11637))


- Added checks to `TPUAccelerator.setup_environment` to assert device availability ([#11799](https://github.com/PyTorchLightning/pytorch-lightning/pull/11799))
@rohitgr7 (Contributor) commented on Feb 8, 2022

Suggested change:
- Added checks to `TPUAccelerator.setup_environment` to assert device availability ([#11799](https://github.com/PyTorchLightning/pytorch-lightning/pull/11799))
- Added checks to `TPUAccelerator` to assert TPU device availability ([#11799](https://github.com/PyTorchLightning/pytorch-lightning/pull/11799))

"""
Raises:
MisconfigurationException:
If the TPU device is not available.
Contributor commented:

Suggested change:
If the TPU device is not available.
If the `torch_xla` is not installed.

MisconfigurationException:
If the TPU device is not available.
"""
if not _XLA_AVAILABLE:
Contributor commented:

should we add another one for TPU_AVAILABLE?
#3104
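
One way this reviewer's suggestion could be realized is sketched below. This is not part of the PR as shown, and the `_TPU_AVAILABLE` flag is assumed to come from Lightning's XLA device utilities:

```python
# Hypothetical follow-up to the reviewer's question: also verify that a TPU device
# is actually reachable, not only that torch_xla is importable.
# _TPU_AVAILABLE is an assumed import; this check is not part of the PR as shown.
from pytorch_lightning.utilities import _TPU_AVAILABLE
from pytorch_lightning.utilities.exceptions import MisconfigurationException

if not _TPU_AVAILABLE:
    raise MisconfigurationException("No TPU devices were found on this machine.")
```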

@mergify (bot) added the "ready" label (PRs ready to be merged) on Feb 8, 2022
@tchaton (Contributor) left a comment:

LGTM !

If the TPU device is not available.
"""
if not _XLA_AVAILABLE:
raise MisconfigurationException("The TPU Accelerator requires torch_xla and a TPU device to run.")
Contributor commented:

Suggested change:
raise MisconfigurationException("The TPU Accelerator requires torch_xla and a TPU device to run.")
raise MisconfigurationException("The TPU Accelerator requires the `torch_xla` library and `TPU devices` to run.")

@ananthsub (Contributor, Author) commented:

Closing out in favor of the approach in #11797

Labels
accelerator: tpu (Tensor Processing Unit) · ready (PRs ready to be merged)
Projects
None yet
Development

Successfully merging this pull request may close these issues.

5 participants