This toolkit allows developers to deploy pre-trained deep learning models through the high-level OpenVINO™ Runtime C++ and Python APIs integrated with application logic.
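A minimal sketch of that flow with the Python API is shown below. It assumes OpenVINO™ 2022.1 or later, a static-shaped IR model at the placeholder path `model.xml`, and a float32 input; adapt the path, device, and input preparation to your model.

```python
import numpy as np
from openvino.runtime import Core  # Python API of OpenVINO Runtime (2022.1+)

# Initialize the runtime and list the devices visible on this machine.
core = Core()
print("Available devices:", core.available_devices)

# Read an IR model (placeholder path) and compile it for the CPU plugin.
model = core.read_model("model.xml")
compiled_model = core.compile_model(model, device_name="CPU")

# Run inference on a dummy input shaped like the model's first input.
input_tensor = np.zeros(list(compiled_model.inputs[0].shape), dtype=np.float32)
results = compiled_model([input_tensor])
print("Output shape:", results[compiled_model.outputs[0]].shape)
```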
This open source version includes several components: Model Optimizer, OpenVINO™ Runtime, and the Post-Training Optimization Tool, as well as CPU, GPU, MYRIAD, multi-device, and heterogeneous plugins to accelerate deep learning inference on Intel® CPUs and Intel® Processor Graphics. It supports pre-trained models from the Open Model Zoo, along with 100+ open source and public models in popular formats such as TensorFlow, ONNX, PaddlePaddle, MXNet, Caffe, and Kaldi. The target plugin is selected with a device string at model compilation time, as in the sketch below.
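The snippet below is an illustrative sketch of device selection, not an exhaustive reference: `model.onnx` is a placeholder, and each device string requires the corresponding hardware and plugin to be available.

```python
from openvino.runtime import Core

core = Core()
# ONNX (among other supported formats) can be read directly, without a separate conversion step.
model = core.read_model("model.onnx")

# Single-device plugins.
cpu_model = core.compile_model(model, "CPU")
gpu_model = core.compile_model(model, "GPU")

# Multi-device plugin: distributes inference requests across the listed devices.
multi_model = core.compile_model(model, "MULTI:CPU,GPU")

# Heterogeneous plugin: runs what it can on the GPU and falls back to the CPU for the rest.
hetero_model = core.compile_model(model, "HETERO:GPU,CPU")
```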
OpenVINO™ Toolkit is licensed under the Apache License, Version 2.0. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.
- Docs: https://docs.openvino.ai/
- Wiki: https://github.com/openvinotoolkit/openvino/wiki
- Issue tracking: https://github.com/openvinotoolkit/openvino/issues
- Storage: https://storage.openvinotoolkit.org/
- Additional OpenVINO™ toolkit modules: https://github.com/openvinotoolkit/openvino_contrib
- Intel® Distribution of OpenVINO™ toolkit Product Page
- Intel® Distribution of OpenVINO™ toolkit Release Notes
Please report questions, issues and suggestions using:
- The `openvino` tag on StackOverflow*
- GitHub* Issues
- Forum
* Other names and brands may be claimed as the property of others.