
Rework model loading in FE manager, implement PDPD probing #6358

Merged Jul 19, 2021 · 52 commits
Changes from 51 commits
236baa3
Rework model loading in FE manager, implement PDPD probing
mvafin May 18, 2021
6ea6cac
Fix build
mvafin Jun 24, 2021
e2aa495
Fix build
mvafin Jun 28, 2021
d68f9b4
Merge remote-tracking branch 'remotes/upstream/master'
mvafin Jun 30, 2021
5363588
Fix build
mvafin Jun 30, 2021
c067b50
Fix unicode
mvafin Jun 29, 2021
de82907
Fix merge issues
mvafin Jun 30, 2021
e57d40f
Fix codestyle
mvafin Jun 30, 2021
38b0503
Read frontends path from frontend_manager library location
mvafin Jun 29, 2021
5561471
Fix codestyle
mvafin Jul 1, 2021
79d3bcb
Fix FE dependency
mvafin Jul 1, 2021
7880e77
Fix dependencies
mvafin Jul 1, 2021
9dd11e2
Fix codestyle
mvafin Jul 1, 2021
9fd49cd
Check if model file exists
mvafin Jul 1, 2021
95f06e2
Revert adding model to lfs
mvafin Jul 1, 2021
835faae
Add test model
mvafin Jul 1, 2021
54db97e
Fix cmake dependencies
mvafin Jul 2, 2021
b375fe3
Apply review feedback
mvafin Jul 2, 2021
8b729dd
Merge remote-tracking branch 'remotes/upstream/master' into frontend/…
mvafin Jul 2, 2021
8783067
Revert pugixml
mvafin Jul 2, 2021
88ef12c
make getFrontendLibraryPath not public API
mvafin Jul 2, 2021
28fe13e
Fix codestyle
mvafin Jul 2, 2021
3ba2319
Apply fix from Ilya Lavrenov
mvafin Jul 2, 2021
4f51777
Add FE dependency in legacy tests
mvafin Jul 5, 2021
0d917d5
Merge remote-tracking branch 'remotes/upstream/master' into frontend/…
mvafin Jul 5, 2021
ddfc5d6
Remove not needed dependency
mvafin Jul 5, 2021
0d09e9a
Better support Unicode
mvafin Jul 5, 2021
bc57b9c
Fix build
mvafin Jul 5, 2021
8635500
Fix build
mvafin Jul 5, 2021
9701b19
Merge remote-tracking branch 'remotes/upstream/master' into frontend/…
mvafin Jul 5, 2021
32660f0
Merge remote-tracking branch 'remotes/upstream/master' into frontend/…
mvafin Jul 6, 2021
9957254
Merge remote-tracking branch 'remotes/upstream/master' into frontend/…
mvafin Jul 6, 2021
17bafbb
Fix build
mvafin Jul 7, 2021
2d18db4
Merge remote-tracking branch 'remotes/upstream/master' into frontend/…
mvafin Jul 7, 2021
544e9ff
Add dependency for deprecated tests
mvafin Jul 7, 2021
6317c59
Fix dependency
mvafin Jul 7, 2021
481e2ad
Fix typo
mvafin Jul 7, 2021
198d6eb
Revert adding FE dependency to IESharedTests
mvafin Jul 8, 2021
72c79aa
Remove relative paths from frontend unit tests
mvafin Jul 8, 2021
45ef110
Merge remote-tracking branch 'remotes/upstream/master' into frontend/…
mvafin Jul 8, 2021
f212a0a
Apply review feedback
mvafin Jul 9, 2021
a997f67
Fix typo
mvafin Jul 9, 2021
ceb8184
Return allow-undefined, since kmb dependencies fail to link
mvafin Jul 12, 2021
49bf172
Merge remote-tracking branch 'remotes/upstream/master' into frontend/…
mvafin Jul 12, 2021
29d2367
Fix merge conflict
mvafin Jul 12, 2021
ebf8b75
Compare functions in reader tests
mvafin Jul 13, 2021
03e31f3
Simplify code to load from variants
mvafin Jul 14, 2021
061ed5e
Fix codestyle
mvafin Jul 15, 2021
50b200f
Fix build
mvafin Jul 15, 2021
3012de8
Compare names in reader tests
mvafin Jul 16, 2021
8465af5
Fix wchar in variant
mvafin Jul 16, 2021
b2b11e7
Merge branch 'master' into frontend/reader_master
ilyachur Jul 19, 2021
4 changes: 4 additions & 0 deletions cmake/templates/InferenceEngineConfig.cmake.in
@@ -73,6 +73,10 @@ function(_ie_target_no_deprecation_error)
else()
set(flags "-Wno-error=deprecated-declarations")
endif()
if(CMAKE_CROSSCOMPILING)
set_target_properties(${ARGV} PROPERTIES
INTERFACE_LINK_OPTIONS "-Wl,--allow-shlib-undefined")
endif()

set_target_properties(${ARGV} PROPERTIES INTERFACE_COMPILE_OPTIONS ${flags})
endif()
5 changes: 3 additions & 2 deletions inference-engine/src/inference_engine/CMakeLists.txt
@@ -124,6 +124,7 @@ target_compile_definitions(${TARGET_NAME}_obj PRIVATE IMPLEMENT_INFERENCE_ENGINE

target_include_directories(${TARGET_NAME}_obj SYSTEM PRIVATE $<TARGET_PROPERTY:ngraph::ngraph,INTERFACE_INCLUDE_DIRECTORIES>
$<TARGET_PROPERTY:pugixml::static,INTERFACE_INCLUDE_DIRECTORIES>
$<TARGET_PROPERTY:ngraph::frontend_manager,INTERFACE_INCLUDE_DIRECTORIES>
$<TARGET_PROPERTY:xbyak,INTERFACE_INCLUDE_DIRECTORIES>)

target_include_directories(${TARGET_NAME}_obj PRIVATE "${CMAKE_CURRENT_SOURCE_DIR}"
@@ -160,7 +161,7 @@ if (TBBBIND_2_4_FOUND)
endif()

target_link_libraries(${TARGET_NAME} PRIVATE pugixml::static openvino::itt ${CMAKE_DL_LIBS} Threads::Threads
ngraph inference_engine_transformations)
ngraph ngraph::frontend_manager inference_engine_transformations)
Contributor:

Maybe we should link frontend_manager statically inside IE?

CC @ilyachur @slyalin

Contributor:

I don't have a strong preference here. But if a user wants to instantiate FEs directly, FEM will be used directly, which may result in two FEMs being linked (one statically and one dynamically). Is that a big deal?

Contributor:

Having both static and dynamic versions of FEM is not a problem: IE will not expose any FEM symbols, so there will be no conflicts.

I just wonder whether a 100 KB library couldn't be built directly into IE instead of adding one more dependency...

Contributor (author):

If that is considered important, let's do it as a separate PR.


target_include_directories(${TARGET_NAME} INTERFACE
$<BUILD_INTERFACE:${PUBLIC_HEADERS_DIR}>
@@ -200,7 +201,7 @@ if(WIN32)
set_target_properties(${TARGET_NAME}_s PROPERTIES COMPILE_PDB_NAME ${TARGET_NAME}_s)
endif()

target_link_libraries(${TARGET_NAME}_s PRIVATE openvino::itt ${CMAKE_DL_LIBS} ngraph
target_link_libraries(${TARGET_NAME}_s PRIVATE openvino::itt ${CMAKE_DL_LIBS} ngraph ngraph::frontend_manager
inference_engine_transformations pugixml::static)

target_compile_definitions(${TARGET_NAME}_s PUBLIC USE_STATIC_IE)
23 changes: 22 additions & 1 deletion inference-engine/src/inference_engine/ie_network_reader.cpp
@@ -9,6 +9,7 @@
#include <file_utils.h>
#include <ie_reader.hpp>
#include <ie_ir_version.hpp>
#include <frontend_manager/frontend_manager.hpp>

#include <fstream>
#include <istream>
@@ -226,6 +227,26 @@ CNNNetwork details::ReadNetwork(const std::string& modelPath, const std::string&
return reader->read(modelStream, exts);
}
}
// Try to load with FrontEndManager
static ngraph::frontend::FrontEndManager manager;
ngraph::frontend::FrontEnd::Ptr FE;
ngraph::frontend::InputModel::Ptr inputModel;
if (!binPath.empty()) {
#if defined(ENABLE_UNICODE_PATH_SUPPORT) && defined(_WIN32)
std::wstring weights_path = FileUtils::multiByteCharToWString(binPath.c_str());
#else
std::string weights_path = binPath;
#endif
FE = manager.load_by_model(model_path, weights_path);
if (FE) inputModel = FE->load(model_path, weights_path);
} else {
FE = manager.load_by_model(model_path);
if (FE) inputModel = FE->load(model_path);
}
if (inputModel) {
auto ngFunc = FE->convert(inputModel);
return CNNNetwork(ngFunc);
}
IE_THROW() << "Unknown model format! Cannot find reader for model format: " << fileExt << " and read the model: " << modelPath <<
". Please check that reader library exists in your PATH.";
}
@@ -248,4 +269,4 @@ CNNNetwork details::ReadNetwork(const std::string& model, const Blob::CPtr& weig
IE_THROW() << "Unknown model format! Cannot find reader for the model and read it. Please check that reader library exists in your PATH.";
}

} // namespace InferenceEngine
} // namespace InferenceEngine
@@ -55,6 +55,11 @@ if(NGRAPH_ONNX_IMPORT_ENABLE)
add_dependencies(${TARGET_NAME} inference_engine_onnx_reader)
endif()

if(NGRAPH_PDPD_FRONTEND_ENABLE)
target_compile_definitions(${TARGET_NAME} PRIVATE
PDPD_TEST_MODELS="${CMAKE_CURRENT_SOURCE_DIR}/pdpd_reader/models/")
endif()

ie_faster_build(${TARGET_NAME}
PCH PRIVATE "precomp.hpp"
)
Binary file not shown.
Binary file not shown.
@@ -0,0 +1,84 @@
// Copyright (C) 2018-2021 Intel Corporation
// SPDX-License-Identifier: Apache-2.0
//

#include <gtest/gtest.h>
#include <set>
#include <string>
#include <fstream>

#include <ie_blob.h>
#include <ie_core.hpp>
#include <file_utils.h>
#include <ngraph/ngraph.hpp>
#include <ngraph/opsets/opset8.hpp>
#include "common_test_utils/ngraph_test_utils.hpp"

TEST(PDPD_Reader_Tests, ImportBasicModelToCore) {
auto model = std::string(PDPD_TEST_MODELS) + "relu.pdmodel";
InferenceEngine::Core ie;
auto cnnNetwork = ie.ReadNetwork(model);
auto function = cnnNetwork.getFunction();

const auto inputType = ngraph::element::f32;
const auto inputShape = ngraph::Shape{ 3 };

const auto data = std::make_shared<ngraph::opset8::Parameter>(inputType, inputShape);
data->set_friendly_name("x");
data->output(0).get_tensor().add_names({ "x" });
const auto relu = std::make_shared<ngraph::opset8::Relu>(data->output(0));
relu->set_friendly_name("relu_0.tmp_0");
relu->output(0).get_tensor().add_names({ "relu_0.tmp_0" });
const auto scale = std::make_shared<ngraph::opset8::Constant>(ngraph::element::f32, ngraph::Shape{ 1 }, std::vector<float>{1});
const auto bias = std::make_shared<ngraph::opset8::Constant>(ngraph::element::f32, ngraph::Shape{ 1 }, std::vector<float>{0});
const auto node_multiply = std::make_shared<ngraph::opset8::Multiply>(relu->output(0), scale);
const auto node_add = std::make_shared<ngraph::opset8::Add>(node_multiply, bias);
node_add->set_friendly_name("save_infer_model/scale_0.tmp_1");
node_add->output(0).get_tensor().add_names({ "save_infer_model/scale_0.tmp_1" });
const auto result = std::make_shared<ngraph::opset8::Result>(node_add->output(0));
result->set_friendly_name("save_infer_model/scale_0.tmp_1/Result");
const auto reference = std::make_shared<ngraph::Function>(
ngraph::NodeVector{ result },
ngraph::ParameterVector{ data },
"RefPDPDFunction");
const FunctionsComparator func_comparator = FunctionsComparator::with_default().enable(FunctionsComparator::NAMES);
const FunctionsComparator::Result res = func_comparator(function, reference);
ASSERT_TRUE(res.valid);
}

#if defined(ENABLE_UNICODE_PATH_SUPPORT) && defined(_WIN32)
TEST(PDPD_Reader_Tests, ImportBasicModelToCoreWstring) {
std::string win_dir_path{ PDPD_TEST_MODELS };
std::replace(win_dir_path.begin(), win_dir_path.end(), '/', '\\');
const std::wstring unicode_win_dir_path = FileUtils::multiByteCharToWString(win_dir_path.c_str());
auto model = unicode_win_dir_path + L"ひらがな日本語.pdmodel";
InferenceEngine::Core ie;
auto cnnNetwork = ie.ReadNetwork(model);
auto function = cnnNetwork.getFunction();

const auto inputType = ngraph::element::f32;
const auto inputShape = ngraph::Shape{ 3 };

const auto data = std::make_shared<ngraph::opset8::Parameter>(inputType, inputShape);
data->set_friendly_name("x");
data->output(0).get_tensor().add_names({ "x" });
const auto relu = std::make_shared<ngraph::opset8::Relu>(data->output(0));
relu->set_friendly_name("relu_0.tmp_0");
relu->output(0).get_tensor().add_names({ "relu_0.tmp_0" });
const auto scale = std::make_shared<ngraph::opset8::Constant>(ngraph::element::f32, ngraph::Shape{ 1 }, std::vector<float>{1});
const auto bias = std::make_shared<ngraph::opset8::Constant>(ngraph::element::f32, ngraph::Shape{ 1 }, std::vector<float>{0});
const auto node_multiply = std::make_shared<ngraph::opset8::Multiply>(relu->output(0), scale);
const auto node_add = std::make_shared<ngraph::opset8::Add>(node_multiply, bias);
node_add->set_friendly_name("save_infer_model/scale_0.tmp_1");
node_add->output(0).get_tensor().add_names({ "save_infer_model/scale_0.tmp_1" });
const auto result = std::make_shared<ngraph::opset8::Result>(node_add->output(0));
result->set_friendly_name("save_infer_model/scale_0.tmp_1/Result");
const auto reference = std::make_shared<ngraph::Function>(
ngraph::NodeVector{ result },
ngraph::ParameterVector{ data },
"RefPDPDFunction");
const FunctionsComparator func_comparator = FunctionsComparator::with_default().enable(FunctionsComparator::NAMES);
const FunctionsComparator::Result res = func_comparator(function, reference);
ASSERT_TRUE(res.valid);
}
#endif
2 changes: 1 addition & 1 deletion model-optimizer/mo/moc_frontend/pipeline.py
@@ -24,7 +24,7 @@ def moc_pipeline(argv: argparse.Namespace):
str(fem.get_available_front_ends())))
log.debug('Initializing new FE for framework {}'.format(argv.framework))
fe = fem.load_by_framework(argv.framework)
input_model = fe.load_from_file(argv.input_model)
input_model = fe.load(argv.input_model)

user_shapes, outputs, freeze_placeholder = fe_user_data_repack(
input_model, argv.placeholder_shapes, argv.placeholder_data_types,
@@ -25,7 +25,7 @@ extern "C" MOCK_API void* GetFrontEndData()
{
FrontEndPluginInfo* res = new FrontEndPluginInfo();
res->m_name = "mock_mo_ngraph_frontend";
res->m_creator = [](FrontEndCapFlags flags) { return std::make_shared<FrontEndMockPy>(flags); };
res->m_creator = []() { return std::make_shared<FrontEndMockPy>(); };

return res;
}
@@ -292,11 +292,9 @@ class MOCK_API InputModelMockPy : public InputModel
/// was called with correct arguments during test execution
struct MOCK_API FeStat
{
FrontEndCapFlags m_load_flags;
std::vector<std::string> m_load_paths;
int m_convert_model = 0;
// Getters
FrontEndCapFlags load_flags() const { return m_load_flags; }
std::vector<std::string> load_paths() const { return m_load_paths; }
int convert_model() const { return m_convert_model; }
};
@@ -309,13 +307,8 @@ class MOCK_API FrontEndMockPy : public FrontEnd
static FeStat m_stat;

public:
FrontEndMockPy(FrontEndCapFlags flags) { m_stat.m_load_flags = flags; }
FrontEndMockPy() {}

InputModel::Ptr load_from_file(const std::string& path) const override
{
m_stat.m_load_paths.push_back(path);
return std::make_shared<InputModelMockPy>();
}

std::shared_ptr<ngraph::Function> convert(InputModel::Ptr model) const override
{
@@ -326,4 +319,15 @@
static FeStat get_stat() { return m_stat; }

static void clear_stat() { m_stat = {}; }

protected:
InputModel::Ptr load_impl(const std::vector<std::shared_ptr<Variant>>& params) const override
{
if (params.size() > 0 && is_type<VariantWrapper<std::string>>(params[0]))
{
auto path = as_type_ptr<VariantWrapper<std::string>>(params[0])->get();
m_stat.m_load_paths.push_back(path);
}
return std::make_shared<InputModelMockPy>();
}
};
@@ -17,7 +17,6 @@ static void register_mock_frontend_stat(py::module m)
m.def("clear_frontend_statistic", &FrontEndMockPy::clear_stat);

py::class_<FeStat> feStat(m, "FeStat", py::dynamic_attr());
feStat.def_property_readonly("load_flags", &FeStat::load_flags);
feStat.def_property_readonly("load_paths", &FeStat::load_paths);
feStat.def_property_readonly("convert_model", &FeStat::convert_model);
}
23 changes: 23 additions & 0 deletions ngraph/core/include/ngraph/variant.hpp
@@ -75,4 +75,27 @@ namespace ngraph
{
}
};

template <typename T>
inline std::shared_ptr<Variant> make_variant(const T& p)
{
return std::dynamic_pointer_cast<VariantImpl<T>>(std::make_shared<VariantWrapper<T>>(p));
}

template <size_t N>
inline std::shared_ptr<Variant> make_variant(const char (&s)[N])
{
return std::dynamic_pointer_cast<VariantImpl<std::string>>(
std::make_shared<VariantWrapper<std::string>>(s));
}

#if defined(ENABLE_UNICODE_PATH_SUPPORT) && defined(_WIN32)
template <size_t N>
inline std::shared_ptr<Variant> make_variant(const wchar_t (&s)[N])
{
return std::dynamic_pointer_cast<VariantImpl<std::wstring>>(
std::make_shared<VariantWrapper<std::wstring>>(s));
}
#endif

} // namespace ngraph
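The `make_variant` overloads added above exist so that string literals become `std::string` variants instead of decaying to pointer types. A simplified, self-contained sketch of the mechanism follows — `Variant`/`VariantWrapper` here are stand-in assumptions, not the real ngraph hierarchy:

```cpp
#include <memory>
#include <string>

// Simplified stand-ins for ngraph's Variant/VariantWrapper hierarchy.
struct Variant { virtual ~Variant() = default; };

template <typename T>
struct VariantWrapper : Variant {
    explicit VariantWrapper(T v) : value(std::move(v)) {}
    T value;
};

// Generic case: wrap the value as-is.
template <typename T>
std::shared_ptr<Variant> make_variant(const T& p) {
    return std::make_shared<VariantWrapper<T>>(p);
}

// String-literal overload: a char array must become std::string,
// otherwise T would deduce to char[N], which cannot be stored by value.
template <size_t N>
std::shared_ptr<Variant> make_variant(const char (&s)[N]) {
    return std::make_shared<VariantWrapper<std::string>>(std::string(s));
}
```

With this pair of overloads, `make_variant("model.pdmodel")` selects the char-array overload and yields a `VariantWrapper<std::string>`, which the frontend API can later recover with `dynamic_pointer_cast`. The `wchar_t` overload in the diff plays the same role for `std::wstring` paths on Windows.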
@@ -10,6 +10,7 @@
#include "frontend_manager_defs.hpp"
#include "input_model.hpp"
#include "ngraph/function.hpp"
#include "ngraph/variant.hpp"

namespace ngraph
{
@@ -26,43 +27,31 @@

virtual ~FrontEnd();

/// \brief Loads an input model by specified model file path
/// If model is stored in several files (e.g. model topology and model weights) -
/// frontend implementation is responsible to handle this case, generally frontend may
/// retrieve other file names from main file
/// \param path Main model file path
/// \return Loaded input model
virtual InputModel::Ptr load_from_file(const std::string& path) const;

/// \brief Loads an input model by specified number of model files
/// This shall be used for cases when client knows all model files (model, weights, etc)
/// \param paths Array of model files
/// \return Loaded input model
virtual InputModel::Ptr load_from_files(const std::vector<std::string>& paths) const;

/// \brief Loads an input model by already loaded memory buffer
/// Memory structure is frontend-defined and is not specified in generic API
/// \param model Model memory buffer
/// \return Loaded input model
virtual InputModel::Ptr load_from_memory(const void* model) const;

/// \brief Loads an input model from set of memory buffers
/// Memory structure is frontend-defined and is not specified in generic API
/// \param modelParts Array of model memory buffers
/// \return Loaded input model
virtual InputModel::Ptr
load_from_memory_fragments(const std::vector<const void*>& modelParts) const;

/// \brief Loads an input model by input stream representing main model file
/// \param stream Input stream of main model
/// \return Loaded input model
virtual InputModel::Ptr load_from_stream(std::istream& stream) const;

/// \brief Loads an input model by input streams representing all model files
/// \param streams Array of input streams for model
/// \return Loaded input model
virtual InputModel::Ptr
load_from_streams(const std::vector<std::istream*>& streams) const;
/// \brief Validates if FrontEnd can recognize model with parameters specified.
/// Same parameters should be used to load model.
/// \param vars Any number of parameters of any type. What kind of parameters
/// are accepted is determined by each FrontEnd individually, typically it is
/// std::string containing path to the model file. For more information please
/// refer to specific FrontEnd documentation.
/// \return true if model recognized, false - otherwise.
template <typename... Types>
inline bool supported(const Types&... vars) const
{
return supported_impl({make_variant(vars)...});
}

/// \brief Loads an input model by any specified arguments. Each FrontEnd separately
/// defines what arguments it can accept.
/// \param vars Any number of parameters of any type. What kind of parameters
/// are accepted is determined by each FrontEnd individually, typically it is
/// std::string containing path to the model file. For more information please
/// refer to specific FrontEnd documentation.
/// \return Loaded input model.
template <typename... Types>
inline InputModel::Ptr load(const Types&... vars) const
{
return load_impl({make_variant(vars)...});
}

/// \brief Completely convert and normalize entire function, throws if it is not
/// possible
@@ -95,8 +84,20 @@ namespace ngraph
/// \brief Runs normalization passes on function that was loaded with partial conversion
/// \param function partially converted nGraph function
virtual void normalize(std::shared_ptr<ngraph::Function> function) const;

protected:
virtual bool
supported_impl(const std::vector<std::shared_ptr<Variant>>& variants) const;
virtual InputModel::Ptr
load_impl(const std::vector<std::shared_ptr<Variant>>& variants) const;
};

template <>
inline bool FrontEnd::supported(const std::vector<std::shared_ptr<Variant>>& variants) const
{
return supported_impl(variants);
}

} // namespace frontend

} // namespace ngraph
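The variadic `supported()`/`load()` pair above packs arbitrary caller arguments into a uniform `std::vector<std::shared_ptr<Variant>>` before dispatching to the virtual `*_impl` hooks. A minimal compilable sketch of that pattern, using simplified stand-in types rather than the real ngraph classes:

```cpp
#include <memory>
#include <string>
#include <vector>

struct Variant { virtual ~Variant() = default; };

template <typename T>
struct VariantWrapper : Variant {
    explicit VariantWrapper(T v) : value(std::move(v)) {}
    T value;
};

template <typename T>
std::shared_ptr<Variant> make_variant(const T& p) {
    return std::make_shared<VariantWrapper<T>>(p);
}

template <size_t N>
std::shared_ptr<Variant> make_variant(const char (&s)[N]) {
    return std::make_shared<VariantWrapper<std::string>>(std::string(s));
}

class FrontEnd {
public:
    // Public variadic API: pack any arguments into a uniform variant list,
    // exactly as the template in the header does via {make_variant(vars)...}.
    template <typename... Types>
    bool supported(const Types&... vars) const {
        return supported_impl({make_variant(vars)...});
    }

protected:
    // Each frontend overrides this; this toy version accepts exactly one
    // string argument (a model path) and nothing else.
    virtual bool supported_impl(
        const std::vector<std::shared_ptr<Variant>>& v) const {
        return v.size() == 1 &&
               std::dynamic_pointer_cast<VariantWrapper<std::string>>(v[0]) != nullptr;
    }
};
```

The payoff of this design is that the public API stays fixed while each frontend decides at runtime which argument combinations it understands — a Paddle frontend might accept `(model_path)` or `(model_path, weights_path)`, and unrecognized combinations simply make `supported()` return false.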