
[LPT] MoveFakeQuantize #6723

Merged: 40 commits, Sep 15, 2021

Conversation

Contributor

@ndemashov ndemashov commented Jul 20, 2021

Details:

  • MoveFakeQuantizeTransformation
  • Functional Tests
  • Plugin Tests

Tickets:

  • 53934
  • Validation: 62106

@ndemashov ndemashov requested review from a team July 20, 2021 14:14
@openvino-pushbot openvino-pushbot added ExternalPR External contributor category: CPU OpenVINO CPU plugin category: IE Tests OpenVINO Test: plugins and common labels Jul 20, 2021
@ndemashov ndemashov requested reviews from several teams as a code owner July 29, 2021 12:17
@ilyachur
Contributor

Looks like you have a lot of redundant changes in this PR.

@eshoguli eshoguli changed the title MoveFakeQuantize [WIP] [LPT] MoveFakeQuantize Aug 3, 2021
@ndemashov ndemashov force-pushed the nd/lpt/move_fake_quantize branch from 8654a22 to dbb72f8 on August 3, 2021 10:20
@ndemashov ndemashov force-pushed the nd/lpt/move_fake_quantize branch from 35128ff to bdfed62 on August 4, 2021 23:34
@ndemashov ndemashov force-pushed the nd/lpt/move_fake_quantize branch from a81accf to 50d20ef on September 5, 2021 16:52
Contributor

@eshoguli eshoguli left a comment

please fix minor comments in D/DQ support PR


#include "shared_test_classes/base/low_precision_transformations/layer_transformation.hpp"
#include "lpt_ngraph_functions/common/fake_quantize_on_data.hpp"
#include "lpt_ngraph_functions/common/fake_quantize_on_weights.hpp"
Contributor

remove

Comment on lines +46 to +75
if (!fqOnData1.empty()) {
    if (operation == "relu") {
        auto relu1 = std::make_shared<ngraph::opset1::Relu>(input1->output(0));
        parent1 = makeFakeQuantize(relu1, inputPrecision, fqOnData1);
    } else {
        parent1 = makeFakeQuantize(input1, inputPrecision, fqOnData1);
    }
    parent1->set_friendly_name("concat_fq1");
    if (!convert1.empty()) {
        parent1 = std::make_shared<opset1::Convert>(parent1, convert1.outPrecision);
    }
    if (!dequantization1.empty()) {
        parent1 = makeDequantization(parent1, dequantization1);
    }
}
if (!fqOnData2.empty()) {
    if (operation == "relu") {
        auto relu2 = std::make_shared<ngraph::opset1::Relu>(input2->output(0));
        parent2 = makeFakeQuantize(relu2, inputPrecision, fqOnData2);
    } else {
        parent2 = makeFakeQuantize(input2, inputPrecision, fqOnData2);
    }
    parent2->set_friendly_name("concat_fq2");
    if (!convert2.empty()) {
        parent2 = std::make_shared<opset1::Convert>(parent2, convert2.outPrecision);
    }
    if (!dequantization2.empty()) {
        parent2 = makeDequantization(parent2, dequantization2);
    }
}
Contributor

minor: copy/paste

Comment on lines +85 to +95
if (!fqOnData3.empty()) {
    std::shared_ptr<Node> fq;
    if (operation == "relu") {
        auto relu = std::make_shared<ngraph::opset1::Relu>(concat->output(0));
        fq = makeFakeQuantize(relu, inputPrecision, fqOnData3);
    } else {
        fq = makeFakeQuantize(concat, inputPrecision, fqOnData3);
    }
    fq->set_friendly_name("fakeQuantizeAfter");
    parent = fq;
}
Contributor

minor:

if (operation == "relu") {
   ...
}

if (!fqOnData1.empty()) {
   ...
}

@eshoguli eshoguli merged commit 5b285ed into openvinotoolkit:master Sep 15, 2021
akuporos pushed a commit to akuporos/openvino that referenced this pull request Sep 29, 2021
* add move_fake_quantize_for_concat_transformation, mfk and mfk_function

* fix relu_transformation.cpp

* backup

* add change

* add cpu test

* [LPT] MoveFakeQuantizeTransformation: fixes

* get InferenceEngine::NotImplemented

* fix ieFuncTests

* try without new cpu_test

* fix cpuFuncTests and ieFuncTests

* fix tests

* fix lin

* add cpu test

* fix link and matcher in move_fake_quantize.cpp

* update matcher

* add gpu test

* naming fix

* move_fake_quantize.cpp add set_fr_name for new_concat

* naming new fq fix

* fix NetworkHelper::copyInfo naming

* concat.cpp naming fix

* gpu tests fix

* rm network_helper changes

* rm extra output

* resolve conversations

* resolve other conversations

* add multi inputs for concat

* fix lin

* fix move_fake_qunatize naming

* rm maxpool from mfk_function

* mkldnn update

* fix style

* rm extra change

* fix concat matcher

* rm mkldnn_plugin changes

* fix conversations

* fix interval

* fix and add isQuantizedStatic, add attribute and negative tests

* add negative plugin tests

* fix style:

Co-authored-by: Edward Shogulin <[email protected]>
@AnastasiaKazantaeva AnastasiaKazantaeva removed the ExternalPR External contributor label Jan 12, 2022
Labels
  • category: IE Tests OpenVINO Test: plugins and common
  • category: LP transformations OpenVINO Low Precision transformations
7 participants