
Use ONNX Frontend instead of ONNX Reader #7031

Merged · 40 commits · Aug 26, 2021
- 9ec27f5 (Aug 10, 2021): added get_name
- 67cfdd4 (Aug 11, 2021): add support to supported_impl
- cf64e13 (Aug 11, 2021): remove debug code
- 714d444 (Aug 12, 2021): review remarks
- 90cd973 (Aug 12, 2021): Merge remote-tracking branch 'upstream/master' into mbencer/LoadByModel
- d8a4e22 (Aug 12, 2021): Merge remote-tracking branch 'upstream/master' into mbencer/LoadByModel
- 2129dae (Aug 12, 2021): changed name to onnx_experimental
- aa5bb21 (Aug 12, 2021): Merge remote-tracking branch 'upstream/master' into mbencer/LoadByModel
- 9264675 (Aug 12, 2021): fixed test
- 3f5c55e (Aug 13, 2021): Merge remote-tracking branch 'upstream/master' into mbencer/LoadByModel
- 51b31c7 (Aug 13, 2021): revert onnx_experimental name
- f16c67a (Aug 16, 2021): integrate reader and fe api
- 1de22dd (Aug 17, 2021): add unit tests
- f4a4bab (Aug 17, 2021): removed prototxt from model_validator
- 4ca5317 (Aug 17, 2021): Merge remote-tracking branch 'upstream/master' into mbencer/LoadByModel
- 48ad76a (Aug 17, 2021): reader refactor
- adfd5ab (Aug 18, 2021): Merge remote-tracking branch 'upstream/master' into mbencer/LoadByModel
- c5b90ed (Aug 18, 2021): add supress
- b66e61c (Aug 18, 2021): Update inference-engine/src/readers/onnx_reader/ie_onnx_reader.cpp
- cad333b (Aug 18, 2021): fix segfaults
- 563afaa (Aug 19, 2021): removed onnx reader
- a8aabaf (Aug 19, 2021): handle istringstream
- 7eda63e (Aug 19, 2021): Merge remote-tracking branch 'upstream/master' into mbencer/LoadByModel
- 7e9c2bd (Aug 20, 2021): wstring support
- e8b8432 (Aug 20, 2021): removed saving path
- a7de4c9 (Aug 20, 2021): styles applied
- 8112b7d (Aug 20, 2021): changed name to onnx experimental
- b295719 (Aug 20, 2021): Apply suggestions from code review
- 3e80a35 (Aug 20, 2021): Merge branch 'mbencer/LoadByModel' of github.com:mbencer/openvino int…
- 422e757 (Aug 20, 2021): Merge remote-tracking branch 'upstream/master' into mbencer/LoadByModel
- 0f0717c (Aug 20, 2021): skip onnx_experimental frontend in mo.py
- 7f8c848 (Aug 23, 2021): add support of wstring paths
- bfa4267 (Aug 23, 2021): Merge remote-tracking branch 'upstream/master' into mbencer/LoadByModel
- ccfb492 (Aug 24, 2021): fix wstring ctor of InputModelONNX
- f78d3bc (Aug 24, 2021): added NGRAPH_SUPPRESS
- 2728690 (Aug 24, 2021): make one instance of manager
- a93a14b (Aug 24, 2021): change onnx_experimental name to onnx
- 44b7642 (Aug 25, 2021): creation frontend manager refactor
- 9aab1cb (Aug 25, 2021): Merge remote-tracking branch 'upstream/master' into mbencer/LoadByModel
- 3f4edbf (Aug 25, 2021): Merge remote-tracking branch 'upstream/master' into mbencer/LoadByModel
4 changes: 2 additions & 2 deletions cmake/developer_package/plugins/plugins.cmake
@@ -112,8 +112,8 @@ function(ie_add_plugin)
     if(TARGET inference_engine_ir_v7_reader)
         add_dependencies(${IE_PLUGIN_NAME} inference_engine_ir_v7_reader)
     endif()
-    if(TARGET inference_engine_onnx_reader)
-        add_dependencies(${IE_PLUGIN_NAME} inference_engine_onnx_reader)
+    if(TARGET onnx_ngraph_frontend)
+        add_dependencies(${IE_PLUGIN_NAME} onnx_ngraph_frontend)
     endif()

     # install rules
4 changes: 2 additions & 2 deletions docs/IE_DG/Deep_Learning_Inference_Engine_DevGuide.md
@@ -43,10 +43,10 @@ This library contains the classes to:
 Starting from 2020.4 release, Inference Engine introduced a concept of `CNNNetwork` reader plugins. Such plugins can be automatically dynamically loaded by Inference Engine in runtime depending on file format:
 * Linux* OS:
   - `libinference_engine_ir_reader.so` to read a network from IR
-  - `libinference_engine_onnx_reader.so` to read a network from ONNX model format
+  - `onnx_ngraph_frontend.so` to read a network from ONNX model format
 * Windows* OS:
   - `inference_engine_ir_reader.dll` to read a network from IR
-  - `inference_engine_onnx_reader.dll` to read a network from ONNX model format
+  - `onnx_ngraph_frontend.dll` to read a network from ONNX model format

 ### Device-Specific Plugin Libraries
4 changes: 2 additions & 2 deletions docs/IE_DG/inference_engine_intro.md
@@ -46,10 +46,10 @@ This library contains the classes to:
 Starting from 2020.4 release, Inference Engine introduced a concept of `CNNNetwork` reader plugins. Such plugins can be automatically dynamically loaded by Inference Engine in runtime depending on file format:
 * Unix* OS:
   - `libinference_engine_ir_reader.so` to read a network from IR
-  - `libinference_engine_onnx_reader.so` to read a network from ONNX model format
+  - `onnx_ngraph_frontend.so` to read a network from ONNX model format
 * Windows* OS:
   - `inference_engine_ir_reader.dll` to read a network from IR
-  - `inference_engine_onnx_reader.dll` to read a network from ONNX model format
+  - `onnx_ngraph_frontend.dll` to read a network from ONNX model format

 ### Device-specific Plugin Libraries ###
2 changes: 1 addition & 1 deletion inference-engine/src/CMakeLists.txt
@@ -53,5 +53,5 @@ add_custom_target(ie_libraries ALL
     inference_engine_lp_transformations inference_engine_snippets)

 if(NGRAPH_ONNX_FRONTEND_ENABLE)
-    add_dependencies(ie_libraries inference_engine_onnx_reader)
+    add_dependencies(ie_libraries onnx_ngraph_frontend)
 endif()
27 changes: 14 additions & 13 deletions inference-engine/src/inference_engine/src/ie_network_reader.cpp
@@ -115,14 +115,6 @@ void registerReaders() {
         return std::make_shared<Reader>(name, library_name);
     };

-    // try to load ONNX reader if library exists
-    auto onnxReader =
-        create_if_exists("ONNX", std::string("inference_engine_onnx_reader") + std::string(IE_BUILD_POSTFIX));
-    if (onnxReader) {
-        readers.emplace("onnx", onnxReader);
-        readers.emplace("prototxt", onnxReader);
-    }
-
     // try to load IR reader v10 if library exists
     auto irReaderv10 =
         create_if_exists("IRv10", std::string("inference_engine_ir_reader") + std::string(IE_BUILD_POSTFIX));

@@ -174,10 +166,6 @@ CNNNetwork details::ReadNetwork(const std::string& modelPath,
 #endif
     // Try to open model file
     std::ifstream modelStream(model_path, std::ios::binary);
-    // save path in extensible array of stream
-    // notice: lifetime of path pointed by pword(0) is limited by current scope
-    const std::string path_to_save_in_stream = modelPath;
-    modelStream.pword(0) = const_cast<char*>(path_to_save_in_stream.c_str());
     if (!modelStream.is_open())
         IE_THROW() << "Model file " << modelPath << " cannot be opened!";

@@ -259,7 +247,7 @@
     }
     if (inputModel) {
         auto ngFunc = FE->convert(inputModel);
-        return CNNNetwork(ngFunc);
+        return CNNNetwork(ngFunc, exts);

Review comment (Contributor): Why do we need this change?

Reply (Author): Consider this scenario:

  • add an extension
  • read an ONNX model
  • serialize it to XML
  • read the model again as XML

Passing exts is needed for that round trip. It is tested by the CustomOpUser_ONNXImporter unit test.

}
IE_THROW() << "Unknown model format! Cannot find reader for model format: " << fileExt
<< " and read the model: " << modelPath << ". Please check that reader library exists in your PATH.";
@@ -282,6 +270,19 @@ CNNNetwork details::ReadNetwork(const std::string& model,
             return reader->read(modelStream, exts);
         }
     }
+    // Try to load with FrontEndManager
+    // NOTE: weights argument is ignored
+    static ngraph::frontend::FrontEndManager manager;

Review comment (Contributor): Can we use one instance of FrontEndManager for the core object? Right now there are two static instances.

Reply (Author): fixed

+    ngraph::frontend::FrontEnd::Ptr FE;
+    ngraph::frontend::InputModel::Ptr inputModel;
+    FE = manager.load_by_model(&modelStream);
+    if (FE)
+        inputModel = FE->load(&modelStream);
+    if (inputModel) {
+        auto ngFunc = FE->convert(inputModel);
+        return CNNNetwork(ngFunc, exts);
+    }

IE_THROW() << "Unknown model format! Cannot find reader for the model and read it. Please check that reader "
"library exists in your PATH.";
}
4 changes: 0 additions & 4 deletions inference-engine/src/readers/CMakeLists.txt
@@ -17,7 +17,3 @@ add_cpplint_target(${TARGET_NAME}_cpplint FOR_SOURCES ${reader_api_hpp})

 add_subdirectory(ir_reader)
 add_subdirectory(ir_reader_v7)
-
-if(NGRAPH_ONNX_FRONTEND_ENABLE)
-    add_subdirectory(onnx_reader)
-endif()
39 changes: 0 additions & 39 deletions inference-engine/src/readers/onnx_reader/CMakeLists.txt (file deleted)
70 changes: 0 additions & 70 deletions inference-engine/src/readers/onnx_reader/ie_onnx_reader.cpp (file deleted)
45 changes: 0 additions & 45 deletions inference-engine/src/readers/onnx_reader/ie_onnx_reader.hpp (file deleted)