Fix NNCF compatibility with torch.compile #2665
Conversation
Codecov Report

Attention: Patch coverage is

Additional details and impacted files

```
@@           Coverage Diff            @@
##           develop    #2665    +/-  ##
============================================
- Coverage    46.85%   34.78%  -12.07%
============================================
  Files          493      495       +2
  Lines        45551    46007     +456
============================================
- Hits         21341    16003    -5338
- Misses       24210    30004    +5794
```

... and 113 files with indirect coverage changes

Flags with carried forward coverage won't be shown.
The torch nightly build 207 has failed.

@nikita-savelyevv, nice PR! One question: why was the context thread-local in the first place?

Thanks! That's a good question. @vshampor, could you please comment on that?

Torch nightly build 234 is green; the PR is ready for review.

@AlexanderDokuchaev @daniil-lyakhov @vshampor please take a look.

@AlexanderDokuchaev thank you for your suggestions! Applied all except one.
### Changes

- Add a condition to skip operator wrapping if torch is in the unpatched state
- Fix test from a previous PR #2665 (remove dependency of the test on `nncf.torch`)

### Reason for changes

Fix inference of compiled torch models when `nncf.torch` is imported.

### Related tickets

140265

### Tests

Added `test_operator_unpatching`
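The "skip operator wrapping when torch is unpatched" condition can be sketched without torch at all. The snippet below is a minimal, hypothetical illustration (none of these names come from the NNCF codebase): a wrapper records operator calls only while a patching flag is set, and otherwise falls straight through to the original callable, which is the behavior a tracer such as `torch.compile` needs to see.

```python
# Hypothetical sketch of the skip-wrapping condition described above.
_patching_enabled = True  # module-level flag toggled by patch/unpatch


def wrap_operator(op, trace_log):
    """Return a wrapped operator that records calls only while patched."""
    def wrapped(*args, **kwargs):
        if not _patching_enabled:
            # Unpatched state: fall through to the original operator so
            # external tracers see plain, unwrapped calls.
            return op(*args, **kwargs)
        trace_log.append(op.__name__)
        return op(*args, **kwargs)
    return wrapped


def add(a, b):
    return a + b


calls = []
wrapped_add = wrap_operator(add, calls)

wrapped_add(1, 2)          # recorded: patching is enabled
_patching_enabled = False
wrapped_add(3, 4)          # not recorded: wrapping is skipped
```

After the flag is cleared, `calls` still holds only the first invocation, while the wrapped operator keeps returning correct results.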
### Changes

Check whether the model is compiled based on the `"_torchdynamo_orig_callable"` property of the model's forward.

### Reason for changes

`torch.compile` can be applied not only to the model itself, but also to the `forward()` method alone. For example:

```python
model.forward = torch.compile(model.forward)
```

In this case the model itself doesn't change, and it won't be an instance of `torch._dynamo.OptimizedModule`.

### Related tickets

143796

### Tests

Added a test where `torch.compile` is applied this way. It does not fail without the fix, though, because the issue is sporadic.

### Relates to

#2665, #2719
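The attribute check described above can be sketched in a few lines. Per the PR, `torch.compile` stores the original callable on its wrapper under `_torchdynamo_orig_callable`; the `fake_compile` helper below is a stand-in that only simulates that behavior so the sketch runs without torch installed (the helper and `Model` class are hypothetical, not from the codebase).

```python
# Sketch of compiled-forward detection via "_torchdynamo_orig_callable".

def is_compiled_forward(forward):
    """True if forward looks like a torch.compile-wrapped callable."""
    return hasattr(forward, "_torchdynamo_orig_callable")


def fake_compile(fn):
    """Stand-in for torch.compile: wrap fn and keep the original on it."""
    def wrapper(*args, **kwargs):
        return fn(*args, **kwargs)
    wrapper._torchdynamo_orig_callable = fn
    return wrapper


class Model:
    def forward(self, x):
        return x * 2


model = Model()
before = is_compiled_forward(model.forward)   # False: plain method

model.forward = fake_compile(model.forward)   # compile forward() only
after = is_compiled_forward(model.forward)    # True: wrapper detected
```

This catches exactly the case the PR targets: the module instance is unchanged (it is not an `OptimizedModule`), but its `forward` carries the wrapper attribute.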
### Changes

- Fix `torch.compile()` call for vanilla PyTorch model
- Fix the case when `torch.compile()` is called for an NNCF-optimized model

### Reason for changes

PyTorch dynamo compilation conflicts with NNCF's patching of PyTorch. This results in errors during the compiled model's forward pass even if the model was not quantized, i.e. merely importing `nncf.torch` results in failure.

### Related tickets

140265

### Tests

Added test for `torch.compile` compatibility with `nncf`
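The conflict described above comes from import-time operator patching: a module attribute is swapped for a wrapper, so anything that later traces calls sees the wrapper rather than the original function. A torch-free sketch of that patch/restore pattern (all names here are hypothetical, patching `math.sqrt` purely for illustration):

```python
import math

# Hypothetical sketch of import-time operator patching: swap a module
# attribute for a wrapper, restore it later. Between patch() and
# unpatch(), any code inspecting math.sqrt sees the wrapper, not the
# original builtin -- the shape of the conflict described above.

_originals = {}


def patch():
    _originals["sqrt"] = math.sqrt

    def wrapped_sqrt(x):
        # a real patcher would do its bookkeeping here
        return _originals["sqrt"](x)

    math.sqrt = wrapped_sqrt


def unpatch():
    math.sqrt = _originals.pop("sqrt")


original = math.sqrt
patch()
patched_is_wrapper = math.sqrt is not original   # wrapper installed
value_while_patched = math.sqrt(9.0)             # still computes 3.0
unpatch()
restored = math.sqrt is original                 # original back in place
```

The fix in this PR amounts to making sure the unpatched state is honored, so a compiler tracing the model never encounters the wrappers.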