Profiler patches torch without reverting it #240

Open
awaelchli opened this issue Jul 17, 2024 · 1 comment
Labels
bug Something isn't working help wanted Extra attention is needed

Comments

@awaelchli
Contributor

🐛 Bug

LitData patches the worker loop in torch here

from torch.utils.data._utils import worker
worker._worker_loop = _ProfileWorkerLoop(self._loader._profile_batches, self._loader._profile_dir)

but the patch is never reverted. Subsequent runs with a dataloader will silently keep using this worker loop. This hack should be removed, or, if it stays, it needs a way to remain disabled and must be idempotent.

This could lead to hard-to-debug issues on the user's side.
On our side, it currently causes interference between tests in our test suite.
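One way to make such a patch reversible is to wrap it in a context manager that saves the original attribute and restores it on exit. The sketch below uses a stand-in namespace instead of the real `torch.utils.data._utils.worker` module, and `profiled_worker_loop` is a hypothetical helper name; the actual fix would live in LitData's profiler code.

```python
import contextlib
import types

# Stand-in for torch.utils.data._utils.worker (hypothetical; the real fix
# would patch the actual torch module).
worker = types.SimpleNamespace(_worker_loop=lambda: "original loop")

@contextlib.contextmanager
def profiled_worker_loop(profile_loop):
    """Temporarily replace the worker loop and restore it afterwards."""
    original = worker._worker_loop
    worker._worker_loop = profile_loop
    try:
        yield
    finally:
        # Always revert, even if profiling raises, so later dataloader
        # runs see the unpatched loop.
        worker._worker_loop = original

with profiled_worker_loop(lambda: "profiling loop"):
    assert worker._worker_loop() == "profiling loop"

# Outside the context, the original loop is back.
assert worker._worker_loop() == "original loop"
```

Restoring in a `finally` block also makes the patch idempotent across runs: each entry captures whatever loop is current and puts it back on exit.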

@awaelchli awaelchli added bug Something isn't working help wanted Extra attention is needed labels Jul 17, 2024
@tchaton
Collaborator

tchaton commented Jul 17, 2024

You are right!
