
Update Conditions of Getting min-max during TF MatMul Requantize #1660

Merged

merged 1 commit into master from zehao/matmul_req on Mar 12, 2024

Conversation

zehao-intel
Contributor

Type of Change

bug fix

Description

Update conditions of getting min-max during TF MatMul requantize to fix a bug found when quantizing urlnet.
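For illustration only, below is a minimal, hypothetical sketch of the kind of condition this fix concerns: when fusing a QuantizedMatMul + Requantize pattern, the pass should only take the requantization min/max when both inputs are constant nodes, and skip the fusion otherwise. The toy graph representation, node layout, and helper name are assumptions made for this sketch and do not reproduce the actual logic in neural_compressor/adaptor/tf_utils/graph_rewriter/int8/fuse_matmul_requantize.py.

```python
# Hypothetical, simplified sketch -- NOT the actual neural_compressor code.
# It illustrates checking where the requantization min/max values come from
# before fusing a QuantizedMatMul + Requantize pattern.

from typing import Dict, List, Optional, Tuple

# A toy node: name -> (op type, input names, optional constant value)
Node = Tuple[str, List[str], Optional[float]]


def get_requantize_min_max(
    graph: Dict[str, Node], requantize_name: str
) -> Optional[Tuple[float, float]]:
    """Return (min, max) for a Requantize node only when both range inputs
    are constants; otherwise return None so the caller skips the fusion."""
    op, inputs, _ = graph[requantize_name]
    if op != "Requantize" or len(inputs) < 5:
        return None

    # By convention in this toy graph, inputs[3] / inputs[4] hold the
    # requested output min / max tensors.
    min_op, _, min_val = graph[inputs[3]]
    max_op, _, max_val = graph[inputs[4]]

    # Condition: both min and max must be Const nodes with values; if either
    # is produced dynamically (e.g. by a RequantizationRange-style op), the
    # fused node cannot hard-code the output range.
    if min_op == "Const" and max_op == "Const":
        return (min_val, max_val)
    return None


if __name__ == "__main__":
    toy_graph: Dict[str, Node] = {
        "min_out": ("Const", [], -3.0),
        "max_out": ("Const", [], 3.0),
        "matmul": ("QuantizedMatMulWithBias", [], None),
        "requant": ("Requantize", ["matmul", "m_in", "M_in", "min_out", "max_out"], None),
    }
    print(get_requantize_min_max(toy_graph, "requant"))  # -> (-3.0, 3.0)
```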

How has this PR been tested?

UT

Dependency Change?

No


github-actions bot commented Mar 8, 2024

⚡ Required checks status: All passing 🟢

Groups summary

🟢 Code Scan Tests workflow
| Check ID | Status | Error details |
| --- | --- | --- |
| Code-Scan | success | |
| Code-Scan (Bandit Code Scan Bandit) | success | |
| Code-Scan (DocStyle Code Scan DocStyle) | success | |
| Code-Scan (Pylint Code Scan Pylint) | success | |

These checks are required after the changes to neural_compressor/adaptor/tf_utils/graph_rewriter/int8/fuse_matmul_requantize.py.

🟢 Model Tests workflow
| Check ID | Status | Error details |
| --- | --- | --- |
| Model-Test | success | |
| Model-Test (Generate Report GenerateReport) | success | |
| Model-Test (Run MXNet Model resnet50v1) | success | |
| Model-Test (Run ONNX Model resnet50-v1-12) | success | |
| Model-Test (Run PyTorch Model resnet18) | success | |
| Model-Test (Run PyTorch Model resnet18_fx) | success | |
| Model-Test (Run TensorFlow Model darknet19) | success | |
| Model-Test (Run TensorFlow Model densenet-121) | success | |
| Model-Test (Run TensorFlow Model inception_v1) | success | |
| Model-Test (Run TensorFlow Model resnet-101) | success | |
| Model-Test (Run TensorFlow Model resnet50v1.5) | success | |
| Model-Test (Run TensorFlow Model ssd_mobilenet_v1_ckpt) | success | |
| Model-Test (Run TensorFlow Model ssd_resnet50_v1) | success | |

These checks are required after the changes to neural_compressor/adaptor/tf_utils/graph_rewriter/int8/fuse_matmul_requantize.py.

🟢 Unit Tests basic workflow
| Check ID | Status | Error details |
| --- | --- | --- |
| UT-Basic | success | |
| UT-Basic (Coverage Compare CollectDatafiles) | success | |
| UT-Basic (Unit Test FWKs adaptor Test FWKs adaptor) | success | |
| UT-Basic (Unit Test FWKs adaptor baseline Test FWKs adaptor baseline) | success | |
| UT-Basic (Unit Test ITEX Test ITEX) | success | |
| UT-Basic (Unit Test ITEX baseline Test ITEX baseline) | success | |
| UT-Basic (Unit Test Pruning Test PyTorch Pruning) | success | |
| UT-Basic (Unit Test Pruning Test TensorFlow Pruning) | success | |
| UT-Basic (Unit Test Pruning baseline Test PyTorch Pruning baseline) | success | |
| UT-Basic (Unit Test Pruning baseline Test TensorFlow Pruning baseline) | success | |
| UT-Basic (Unit Test TF newAPI Test TF newAPI) | success | |
| UT-Basic (Unit Test TF newAPI baseline Test TF newAPI baseline) | success | |
| UT-Basic (Unit Test User facing API Test User facing API) | success | |
| UT-Basic (Unit Test User facing API baseline Test User facing API baseline) | success | |
| UT-Basic (Unit Test other basic case Test other basic case) | success | |
| UT-Basic (Unit Test other cases baseline Test other cases baseline) | success | |

UT-Basic coverage report

|  | Base coverage | PR coverage | Diff |
| --- | --- | --- | --- |
| Lines | 86.980% | 86.983% | 0.003% |
| Branches | 76.337% | 76.341% | 0.004% |

These checks are required after the changes to neural_compressor/adaptor/tf_utils/graph_rewriter/int8/fuse_matmul_requantize.py.

🟢 Unit Tests basic no coverage workflow
| Check ID | Status | Error details |
| --- | --- | --- |
| UT-Basic-No-Coverage | success | |
| UT-Basic-No-Coverage (Unit Test FWKs adaptor Test FWKs adaptor) | success | |
| UT-Basic-No-Coverage (Unit Test Pruning Test PyTorch Pruning) | success | |
| UT-Basic-No-Coverage (Unit Test Pruning Test TensorFlow Pruning) | success | |
| UT-Basic-No-Coverage (Unit Test User facing API Test User facing API) | success | |
| UT-Basic-No-Coverage (Unit Test other basic case Test other basic case) | success | |

These checks are required after the changes to neural_compressor/adaptor/tf_utils/graph_rewriter/int8/fuse_matmul_requantize.py.

🟢 Unit Tests ITREX workflow
| Check ID | Status | Error details |
| --- | --- | --- |
| UT-ITREX | success | |

These checks are required after the changes to neural_compressor/adaptor/tf_utils/graph_rewriter/int8/fuse_matmul_requantize.py.


Thank you for your contribution! 💜

Note
This comment is automatically generated and is updated every 180 seconds for up to 360 minutes. If you have any other questions, contact chensuyue or XuehaoSun for help.

@chensuyue chensuyue merged commit d07175c into master Mar 12, 2024
51 checks passed
@chensuyue chensuyue deleted the zehao/matmul_req branch March 12, 2024 06:56