BuildingForLinux
The software was validated on:
- Ubuntu* 18.04 (64-bit) with default GCC* 7.5.0
- Ubuntu* 20.04 (64-bit) with default GCC* 9.3.0
- CentOS* 7.6 (64-bit) with default GCC* 4.8.5

Software requirements:
- CMake* 3.13 or higher
- GCC* 4.8 or higher to build the Inference Engine
- Python 3.6 or higher for the Inference Engine Python API wrapper
- (Optional) Intel® Graphics Compute Runtime for OpenCL™ Driver package 19.41.14441
NOTE: Building samples and demos from the Intel® Distribution of OpenVINO™ toolkit package requires CMake* 3.10 or higher.
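Before starting the build, it can help to confirm that the installed toolchain meets these minimums. A minimal sketch, assuming a bash shell with GNU coreutils; the `version_ge` and `check` helpers are ours for illustration, not part of the toolkit:

```shell
#!/usr/bin/env bash
# Sketch: verify the toolchain meets the minimum versions listed above.
# `version_ge` relies on GNU coreutils' `sort -V` version ordering.
version_ge() {
  # Succeeds when version $1 >= version $2.
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

check() {  # check <name> <found-version> <minimum>
  version_ge "$2" "$3" && echo "$1 $2: OK" || echo "$1 $2: need $3 or higher"
}

if command -v cmake >/dev/null; then
  check CMake "$(cmake --version | grep -oE '[0-9]+(\.[0-9]+)+' | head -n1)" 3.13
fi
if command -v gcc >/dev/null; then
  check GCC "$(gcc -dumpversion)" 4.8
fi
if command -v python3 >/dev/null; then
  check Python "$(python3 --version 2>&1 | grep -oE '[0-9]+(\.[0-9]+)+')" 3.6
fi
```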
Build steps:

- Clone submodules:

  ```sh
  cd openvino
  git submodule update --init --recursive
  ```

- Install build dependencies using the `install_build_dependencies.sh` script in the project root folder:

  ```sh
  chmod +x install_build_dependencies.sh
  ./install_build_dependencies.sh
  ```
- By default, the build enables the Inference Engine GPU plugin to infer models on your Intel® Processor Graphics. This requires you to install the Intel® Graphics Compute Runtime for OpenCL™ Driver package 19.41.14441 before running the build. If you don't want to use the GPU plugin, use the `-DENABLE_CLDNN=OFF` CMake build option and skip the installation of the Intel® Graphics Compute Runtime for OpenCL™ Driver.
- Create a build folder:

  ```sh
  mkdir build && cd build
  ```
- The Inference Engine uses a CMake-based build system. In the created `build` directory, run `cmake` to fetch project dependencies and create Unix makefiles, then run `make` to build the project:

  ```sh
  cmake -DCMAKE_BUILD_TYPE=Release ..
  make --jobs=$(nproc --all)
  ```
You can use the following additional build options:
- The default build uses an internal JIT GEMM implementation.
- To switch to an OpenBLAS* implementation, use the `-DGEMM=OPENBLAS` option together with the `-DBLAS_INCLUDE_DIRS` and `-DBLAS_LIBRARIES` CMake options to specify paths to the OpenBLAS headers and library. For example, use the following options on CentOS*:

  ```sh
  -DGEMM=OPENBLAS -DBLAS_INCLUDE_DIRS=/usr/include/openblas -DBLAS_LIBRARIES=/usr/lib64/libopenblas.so.0
  ```

- To switch to the optimized MKL-ML* GEMM implementation, use the `-DGEMM=MKL` and `-DMKLROOT=<path_to_MKL>` CMake options to specify a path to the unpacked MKL-ML package with the `include` and `lib` folders. The MKL-ML* package can be downloaded from the Intel® MKL-DNN repository.
- Threading Building Blocks (TBB) is used by default. To build the Inference Engine with OpenMP* threading, set the `-DTHREADING=OMP` option.
- Required versions of the TBB and OpenCV packages are downloaded automatically by the CMake-based script. If you want the automatically downloaded packages but already have TBB or OpenCV packages configured in your environment, unset the `TBBROOT` and `OpenCV_DIR` environment variables before running `cmake`; otherwise the packages will not be downloaded, and the build may fail if the installed versions are incompatible.
- If the CMake-based build script cannot find and download an OpenCV package supported on your platform, or if you want to use a custom build of the OpenCV library, refer to the Use Custom OpenCV Builds section for details.
- To build the Python API wrapper:
  - Install all additional packages listed in the `/inference-engine/ie_bridges/python/requirements.txt` file:

    ```sh
    pip install -r requirements.txt
    ```

  - Use the `-DENABLE_PYTHON=ON` option. To specify an exact Python version, use the following options:

    ```sh
    -DPYTHON_EXECUTABLE=`which python3.7` \
    -DPYTHON_LIBRARY=/usr/lib/x86_64-linux-gnu/libpython3.7m.so \
    -DPYTHON_INCLUDE_DIR=/usr/include/python3.7
    ```
- To switch the CPU and GPU plugins off/on, use the `cmake` options `-DENABLE_MKL_DNN=ON/OFF` and `-DENABLE_CLDNN=ON/OFF`, respectively.
- nGraph-specific compilation options: `-DNGRAPH_ONNX_IMPORT_ENABLE=ON` enables building of the nGraph ONNX importer, and `-DNGRAPH_DEBUG_ENABLE=ON` enables additional debug prints.
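The options above can be combined into a single configure invocation. A minimal sketch, assuming a bash shell; the specific ON/OFF choices below are illustrative examples, not the toolkit's defaults, and the script only echoes the command so you can review it before running it from inside your build directory:

```shell
#!/usr/bin/env bash
# Illustrative combination of the build options discussed above.
# The values chosen here are example selections, not defaults.
CMAKE_OPTS=(
  -DCMAKE_BUILD_TYPE=Release
  -DENABLE_CLDNN=OFF      # build without the GPU plugin
  -DENABLE_MKL_DNN=ON     # keep the CPU plugin
  -DENABLE_PYTHON=ON      # build the Python API wrapper
  -DTHREADING=OMP         # OpenMP threading instead of the default TBB
)
# Print the resulting command line; replace `echo` with the real
# invocation once the options look right.
echo cmake "${CMAKE_OPTS[@]}" ..
```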
© Copyright 2018-2022, OpenVINO team