
Commit 1a6aace

Merge tools and inference_engine/tools folders (openvinotoolkit#7359)

* Merge tools folders

* Fixed docs

* Moved deployment_manager

* Fixed path to benchmark_tool docs

* python_tools -> python_tools_benchmark
ilya-lavrenov authored and akuporos committed Sep 29, 2021
1 parent a29e04a commit 1a6aace
Showing 26 changed files with 37 additions and 38 deletions.
CODEOWNERS (1 change: 0 additions & 1 deletion)

@@ -44,7 +44,6 @@ azure-pipelines.yml @openvinotoolkit/openvino-admins
/inference-engine/tests/functional/plugin/myriad/ @openvinotoolkit/openvino-ie-vpu-maintainers @openvinotoolkit/openvino-ie-tests-maintainers
/inference-engine/tests/unit/vpu/ @openvinotoolkit/openvino-ie-vpu-maintainers @openvinotoolkit/openvino-ie-tests-maintainers
/inference-engine/tests/unit/engines/vpu/ @openvinotoolkit/openvino-ie-vpu-maintainers @openvinotoolkit/openvino-ie-tests-maintainers
-/inference-engine/tools/vpu/ @openvinotoolkit/openvino-ie-vpu-maintainers
/inference-engine/scripts/run_tests_myriad_multistick.sh @openvinotoolkit/openvino-ie-vpu-maintainers

# IE GNA:
docs/IE_DG/Intro_to_Performance.md (2 changes: 1 addition & 1 deletion)

@@ -34,7 +34,7 @@ Refer to the [Benchmark App](../../inference-engine/samples/benchmark_app/README
## Using Caching API for first inference latency optimization
Since the 2021.4 release, the Inference Engine provides the ability to enable internal caching of loaded networks.
This can significantly reduce load network latency for some devices at application startup.
-Internally caching uses plugin's Export/ImportNetwork flow, like it is done for [Compile tool](../../inference-engine/tools/compile_tool/README.md), using the regular ReadNetwork/LoadNetwork API.
+Internally caching uses plugin's Export/ImportNetwork flow, like it is done for [Compile tool](../../tools/compile_tool/README.md), using the regular ReadNetwork/LoadNetwork API.
Refer to the [Model Caching Overview](Model_caching_overview.md) for more detailed explanation.

## Using Async API
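The caching flow referenced by this doc change amounts to one extra configuration call before the usual ReadNetwork/LoadNetwork sequence. A minimal sketch, assuming a 2021.4-style Inference Engine application (the model path, device, and cache directory are placeholders for this sketch):

```cpp
#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;

    // Enable internal caching of loaded networks; "model_cache" is an
    // arbitrary writable directory chosen for this sketch.
    core.SetConfig({{CONFIG_KEY(CACHE_DIR), "model_cache"}});

    // The regular ReadNetwork/LoadNetwork flow stays unchanged. On the
    // first run the plugin exports the compiled network into the cache;
    // on later runs LoadNetwork imports it instead of recompiling.
    auto network = core.ReadNetwork("model.xml");
    auto execNetwork = core.LoadNetwork(network, "CPU");
    return 0;
}
```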
docs/IE_DG/Model_caching_overview.md (2 changes: 1 addition & 1 deletion)

@@ -20,7 +20,7 @@ As described in [Inference Engine Developer Guide](Deep_Learning_Inference_Engin

Step #5 can potentially perform several time-consuming device-specific optimizations and network compilations,
and such delays can lead to a poor user experience on application startup. To avoid this, some devices offer
-Import/Export network capability, and it is possible to either use [Compile tool](../../inference-engine/tools/compile_tool/README.md)
+Import/Export network capability, and it is possible to either use [Compile tool](../../tools/compile_tool/README.md)
or enable model caching to export the compiled network automatically. Reusing cached networks can significantly reduce load network time.


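The Import/Export capability that the Compile tool wraps can also be driven programmatically. A rough sketch of the manual flow, assuming a device with import/export support (the MYRIAD device and the file names are illustrative only):

```cpp
#include <fstream>
#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;

    // First run: perform the expensive compilation once, then export
    // the compiled network to a blob file.
    auto execNetwork = core.LoadNetwork(core.ReadNetwork("model.xml"), "MYRIAD");
    std::ofstream blobOut("model.blob", std::ios::binary);
    execNetwork.Export(blobOut);
    blobOut.close();

    // Later runs: import the precompiled blob directly, skipping the
    // read/compile steps that cause the startup delay described above.
    auto imported = core.ImportNetwork("model.blob", "MYRIAD");
    return 0;
}
```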
docs/IE_DG/Tools_Overview.md (4 changes: 2 additions & 2 deletions)

@@ -9,8 +9,8 @@ The OpenVINO™ toolkit installation includes the following tools:
|[Accuracy Checker Tool](@ref omz_tools_accuracy_checker) | `<INSTALL_DIR>/deployment_tools/tools/open_model_zoo/tools/accuracy_checker`|
|[Post-Training Optimization Tool](@ref pot_README) | `<INSTALL_DIR>/deployment_tools/tools/post_training_optimization_toolkit`|
|[Model Downloader](@ref omz_tools_downloader) | `<INSTALL_DIR>/deployment_tools/tools/model_downloader`|
-|[Cross Check Tool](../../inference-engine/tools/cross_check_tool/README.md) | `<INSTALL_DIR>/deployment_tools/tools/cross_check_tool`|
-|[Compile Tool](../../inference-engine/tools/compile_tool/README.md) | `<INSTALL_DIR>/deployment_tools/inference_engine/lib/intel64/`|
+|[Cross Check Tool](../../tools/cross_check_tool/README.md) | `<INSTALL_DIR>/deployment_tools/tools/cross_check_tool`|
+|[Compile Tool](../../tools/compile_tool/README.md) | `<INSTALL_DIR>/deployment_tools/inference_engine/lib/intel64/`|


## See Also
docs/index.md (8 changes: 4 additions & 4 deletions)

@@ -45,7 +45,7 @@ Useful documents for model optimization:
### Running and Tuning Inference
The other core component of OpenVINO™ is the [Inference Engine](IE_DG/Deep_Learning_Inference_Engine_DevGuide.md), which manages the loading and compiling of the optimized neural network model, runs inference operations on input data, and outputs the results. Inference Engine can execute synchronously or asynchronously, and its plugin architecture manages the appropriate compilations for execution on multiple Intel® devices, including both workhorse CPUs and specialized graphics and video processing platforms (see below, Packaging and Deployment).

-You can use OpenVINO™ Tuning Utilities with the Inference Engine to trial and test inference on your model. The Benchmark utility uses an input model to run iterative tests for throughput or latency measures, and the [Cross Check Utility](../inference-engine/tools/cross_check_tool/README.md) compares performance of differently configured inferences.
+You can use OpenVINO™ Tuning Utilities with the Inference Engine to trial and test inference on your model. The Benchmark utility uses an input model to run iterative tests for throughput or latency measures, and the [Cross Check Utility](../tools/cross_check_tool/README.md) compares performance of differently configured inferences.

For a full browser-based studio integrating these other key tuning utilities, try the [Deep Learning Workbench](@ref workbench_docs_Workbench_DG_Introduction).
![](img/OV-diagram-step3.png)
@@ -81,7 +81,7 @@ The Inference Engine's plug-in architecture can be extended to meet other specia
* [Deployment Manager Guide](./install_guides/deployment-manager-tool.md)


-## OpenVINO™ Toolkit Components
+## OpenVINO™ Toolkit Components

Intel® Distribution of OpenVINO™ toolkit includes the following components:

@@ -90,8 +90,8 @@ Intel® Distribution of OpenVINO™ toolkit includes the following components:
- [Inference Engine Samples](IE_DG/Samples_Overview.md): A set of simple console applications demonstrating how to use the Inference Engine in your applications.
- [Deep Learning Workbench](@ref workbench_docs_Workbench_DG_Introduction): A web-based graphical environment that allows you to easily use various sophisticated OpenVINO™ toolkit components.
- [Post-training Optimization Tool](@ref pot_README): A tool to calibrate a model and then execute it in the INT8 precision.
-- Additional Tools: A set of tools to work with your models including [Benchmark App](../inference-engine/tools/benchmark_tool/README.md), [Cross Check Tool](../inference-engine/tools/cross_check_tool/README.md), [Compile tool](../inference-engine/tools/compile_tool/README.md).
-- [Open Model Zoo](@ref omz_models_group_intel)
+- Additional Tools: A set of tools to work with your models including [Benchmark App](../tools/benchmark_tool/README.md), [Cross Check Tool](../tools/cross_check_tool/README.md), [Compile tool](../tools/compile_tool/README.md).
+- [Open Model Zoo](@ref omz_models_group_intel)
- [Demos](@ref omz_demos): Console applications that provide robust application templates to help you implement specific deep learning scenarios.
- Additional Tools: A set of tools to work with your models including [Accuracy Checker Utility](@ref omz_tools_accuracy_checker) and [Model Downloader](@ref omz_tools_downloader).
- [Documentation for Pretrained Models](@ref omz_models_group_intel): Documentation for pre-trained models that are available in the [Open Model Zoo repository](https://github.com/openvinotoolkit/open_model_zoo).
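The synchronous/asynchronous execution mentioned in the index overview above maps to two calls on an inference request. A minimal sketch (model path and device are placeholders; input setup and error handling are omitted):

```cpp
#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;
    auto execNetwork = core.LoadNetwork(core.ReadNetwork("model.xml"), "CPU");
    auto request = execNetwork.CreateInferRequest();

    // Synchronous API: Infer() blocks until the result is ready.
    request.Infer();

    // Asynchronous API: StartAsync() returns immediately, so the
    // application can do other work (or queue more requests) before
    // blocking on Wait().
    request.StartAsync();
    request.Wait(InferenceEngine::IInferRequest::WaitMode::RESULT_READY);
    return 0;
}
```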
inference-engine/CMakeLists.txt (1 change: 0 additions & 1 deletion)

@@ -12,7 +12,6 @@ if(ENABLE_PYTHON)
add_subdirectory(ie_bridges/python)
endif()

-add_subdirectory(tools)
add_subdirectory(samples)

openvino_developer_export_targets(COMPONENT openvino_common TARGETS format_reader ie_samples_utils)
inference-engine/tools/CMakeLists.txt (5 changes: 0 additions & 5 deletions)

This file was deleted.

inference-engine/tools/package_BOM.txt (6 changes: 0 additions & 6 deletions)

This file was deleted.

scripts/CMakeLists.txt (8 changes: 0 additions & 8 deletions)

@@ -47,14 +47,6 @@ if(UNIX)
COMPONENT install_dependencies)
endif()

-# install DeploymentManager
-
-ie_cpack_add_component(deployment_manager REQUIRED)
-install(DIRECTORY deployment_manager/
-        DESTINATION deployment_tools/tools/deployment_manager
-        COMPONENT deployment_manager
-        USE_SOURCE_PERMISSIONS)

# install files for demo

ie_cpack_add_component(demo_scripts DEPENDS core)
tools/CMakeLists.txt (38 changes: 29 additions & 9 deletions)

@@ -1,44 +1,64 @@
# Copyright (C) 2018-2021 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

cmake_minimum_required(VERSION 3.13)

-project(python_tools)
+project(OpenVINO_Tools DESCRIPTION "OpenVINO toolkit Development Tools")

+if(NOT DEFINED OpenVINO_SOURCE_DIR)
+    find_package(InferenceEngineDeveloperPackage QUIET)
+    set(python_tools_only ON)
+endif()

+# C++ tools

+if(NOT python_tools_only)
+    add_subdirectory(compile_tool)
+endif()

+# Python tools

+# install deployment_manager

+ie_cpack_add_component(deployment_manager REQUIRED)
+install(DIRECTORY deployment_manager/
+        DESTINATION deployment_tools/tools/deployment_manager
+        COMPONENT deployment_manager
+        USE_SOURCE_PERMISSIONS)

if(ENABLE_PYTHON)
    find_package(PythonInterp 3 REQUIRED)
    set(PYTHON_VERSION python${PYTHON_VERSION_MAJOR}.${PYTHON_VERSION_MINOR})

-    set(TARGET_NAME "python_tools")

    if(WIN32)
        set(PYTHON_BRIDGE_OUTPUT_DIRECTORY ${CMAKE_LIBRARY_OUTPUT_DIRECTORY}/$<CONFIG>/python_api/${PYTHON_VERSION}/openvino)
    else()
        set(PYTHON_BRIDGE_OUTPUT_DIRECTORY ${CMAKE_LIBRARY_OUTPUT_DIRECTORY}/python_api/${PYTHON_VERSION}/openvino)
    endif()

    # creates a copy inside bin directory for developers to have ability running python benchmark_app
-    add_custom_target(${TARGET_NAME} ALL
+    add_custom_target(python_tools_benchmark ALL
        COMMAND ${CMAKE_COMMAND} -E make_directory ${PYTHON_BRIDGE_OUTPUT_DIRECTORY}/tools
-        COMMAND ${CMAKE_COMMAND} -E copy_directory ${OpenVINO_SOURCE_DIR}/tools/benchmark_tool/openvino/tools/benchmark ${PYTHON_BRIDGE_OUTPUT_DIRECTORY}/tools/benchmark
+        COMMAND ${CMAKE_COMMAND} -E copy_directory ${CMAKE_CURRENT_SOURCE_DIR}/benchmark_tool/openvino/tools/benchmark
+                ${PYTHON_BRIDGE_OUTPUT_DIRECTORY}/tools/benchmark
    )

-    ie_cpack_add_component(python_tools_${PYTHON_VERSION})
+    ie_cpack_add_component(python_tools)

+    # install cross_check_tool tool
+    install(DIRECTORY cross_check_tool
+            DESTINATION deployment_tools/tools
+            COMPONENT python_tools)

    # install benchmark_app tool
    install(FILES benchmark_tool/benchmark_app.py
                  benchmark_tool/README.md
                  benchmark_tool/requirements.txt
            DESTINATION deployment_tools/tools/benchmark_tool
            COMPONENT python_tools)

-    install(DIRECTORY ../inference-engine/tools/cross_check_tool
-            DESTINATION deployment_tools/tools
-            COMPONENT python_tools)

    # install openvino/tools/benchmark as a python package
    install(DIRECTORY benchmark_tool/openvino/tools/benchmark
            DESTINATION python/${PYTHON_VERSION}/openvino/tools
            USE_SOURCE_PERMISSIONS
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
