
Multi-GPU training in Vanilla PyTorch Tutorial #7893

Closed
puririshi98 opened this issue Aug 16, 2023 · 0 comments
@puririshi98 (Contributor):
No description provided.

puririshi98 added a commit that referenced this issue on Aug 16, 2023:

    just copypasted another tutorial to have a template, will fill out accordingly throughout the sprint

    addresses: #7893
@puririshi98 changed the title from "[MEDIUM] Multi-GPU training in Vanilla PyTorch: This tutorial should cover the basics of how we can leverage torch.nn.parallel.DistributedDataParallel for multi-GPU training in PyG. It should briefly go over the corresponding examples in PyG for distributed batching and distributed sampling." to "Multi-GPU training in Vanilla PyTorch" on Aug 16, 2023.
@puririshi98 then changed the title to "Multi-GPU training in Vanilla PyTorch Tutorial" on Aug 16, 2023.
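For context on what the original title describes, here is a minimal sketch of the kind of example the requested tutorial covers: one process per GPU, each wrapping a PyG model in torch.nn.parallel.DistributedDataParallel and training on its own shard of the dataset (distributed batching). The dataset (TUDataset/PROTEINS), model architecture, and hyperparameters below are illustrative assumptions, not taken from the issue or the referenced commits.

```python
# Hedged sketch of multi-GPU training in vanilla PyTorch with PyG.
# Dataset/model choices are assumptions for illustration only.
import os

import torch
import torch.distributed as dist
import torch.multiprocessing as mp
import torch.nn.functional as F
from torch.nn.parallel import DistributedDataParallel

from torch_geometric.datasets import TUDataset
from torch_geometric.loader import DataLoader
from torch_geometric.nn import GraphConv, global_mean_pool


class Net(torch.nn.Module):
    def __init__(self, in_channels, hidden_channels, out_channels):
        super().__init__()
        self.conv1 = GraphConv(in_channels, hidden_channels)
        self.conv2 = GraphConv(hidden_channels, hidden_channels)
        self.lin = torch.nn.Linear(hidden_channels, out_channels)

    def forward(self, x, edge_index, batch):
        x = self.conv1(x, edge_index).relu()
        x = self.conv2(x, edge_index).relu()
        x = global_mean_pool(x, batch)  # one embedding per graph in the batch
        return self.lin(x)


def run(rank, world_size, dataset):
    # One training process per GPU, synchronized over NCCL.
    os.environ['MASTER_ADDR'] = 'localhost'
    os.environ['MASTER_PORT'] = '12355'
    dist.init_process_group('nccl', rank=rank, world_size=world_size)

    # Distributed batching: each rank trains on a disjoint slice of the graphs.
    # drop_last helps keep the number of optimizer steps aligned across ranks,
    # since DDP expects every rank to run the same number of backward passes.
    train_loader = DataLoader(dataset[rank::world_size], batch_size=32,
                              shuffle=True, drop_last=True)

    model = Net(dataset.num_features, 64, dataset.num_classes).to(rank)
    # DDP broadcasts rank 0's initial weights at construction and
    # all-reduces gradients during backward.
    model = DistributedDataParallel(model, device_ids=[rank])
    optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

    for epoch in range(1, 11):
        model.train()
        total_loss = 0.0
        for data in train_loader:
            data = data.to(rank)
            optimizer.zero_grad()
            out = model(data.x, data.edge_index, data.batch)
            loss = F.cross_entropy(out, data.y)
            loss.backward()  # gradients are averaged across all ranks here
            optimizer.step()
            total_loss += float(loss)
        if rank == 0:
            print(f'Epoch {epoch:02d}, '
                  f'Loss: {total_loss / len(train_loader):.4f}')

    dist.destroy_process_group()


if __name__ == '__main__':
    dataset = TUDataset(root='data/TUDataset', name='PROTEINS').shuffle()
    world_size = torch.cuda.device_count()  # assumes at least one CUDA GPU
    mp.spawn(run, args=(world_size, dataset), nprocs=world_size, join=True)
```

For the distributed-sampling case on a single large graph, the DataLoader above would typically be swapped for torch_geometric.loader.NeighborLoader, with each rank passing its own slice of the training node indices as input_nodes so that neighbor sampling is sharded across GPUs the same way.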
rusty1s added a commit that referenced this issue on Sep 20, 2023:

    addresses: #7893

    ---------

    Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
    Co-authored-by: rusty1s <[email protected]>
@rusty1s closed this as completed on Sep 20, 2023.
JakubPietrakIntel pushed a commit that referenced this issue on Sep 27, 2023:

    addresses: #7893

    ---------

    Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
    Co-authored-by: rusty1s <[email protected]>