Update Torch to ONNX export in conformance #2269
Conversation
Codecov Report
Additional details and impacted files

@@            Coverage Diff             @@
##           develop    #2269      +/-   ##
===========================================
+ Coverage    90.72%   90.75%    +0.03%
===========================================
  Files          485      485
  Lines        43722    43715        -7
===========================================
+ Hits         39665    39673        +8
+ Misses        4057     4042       -15

Flags with carried forward coverage won't be shown.
Since both representations are possible, shouldn't we instead modify the BC/FBC algos, or NNCFGraph creation from ONNX models, to work correctly in both cases?
Sure
Conformance job 214. According to the results, the ONNX problems were fixed. Also, some problems on develop were encountered for Torch.
Yes, ONNX works, but don't forget to update the reference metrics. The TORCH backend fails due to a known issue (ticket 125357).
Done
LGTM
Changes
Do constant folding while exporting to ONNX from Torch
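A minimal sketch of what this change amounts to, assuming the standard torch.onnx.export API: pass do_constant_folding=True at export time so that, per the description above, the bias ends up as a Conv attribute instead of sitting in a separate BatchNorm node. The toy Conv+BatchNorm model and the output file name are illustrative, not the actual conformance-suite code.

```python
import torch
import torch.nn as nn

# Illustrative Conv + BatchNorm pair; without folding, the bias-like
# parameters live in the BatchNorm node of the exported graph.
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1, bias=False),
    nn.BatchNorm2d(8),
    nn.ReLU(),
).eval()

dummy_input = torch.randn(1, 3, 32, 32)

torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    do_constant_folding=True,  # fold constant subgraphs during export
    input_names=["input"],
    output_names=["output"],
)
```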
Reason for changes
Conformance test regression of ONNX after updating torch to 2.1.
Model graphs changed and now contain BatchNorm nodes. As a result, the bias is no longer stored as a Conv attribute but resides in the BatchNorm layer, so the FBC and BC algorithms are not applied to these biases.
Related tickets
125203
Tests
N/A