diff --git a/docs/img/DeviceDriverVersion.PNG b/docs/_static/images/DeviceDriverVersion.PNG
similarity index 100%
rename from docs/img/DeviceDriverVersion.PNG
rename to docs/_static/images/DeviceDriverVersion.PNG
diff --git a/docs/img/DeviceManager.PNG b/docs/_static/images/DeviceManager.PNG
similarity index 100%
rename from docs/img/DeviceManager.PNG
rename to docs/_static/images/DeviceManager.PNG
diff --git a/docs/install_guides/configurations-for-intel-gpu.md b/docs/install_guides/configurations-for-intel-gpu.md
index 20345f1f23b564..db9cbcac07c16a 100644
--- a/docs/install_guides/configurations-for-intel-gpu.md
+++ b/docs/install_guides/configurations-for-intel-gpu.md
@@ -5,69 +5,82 @@
 
 .. _gpu guide:
 
-@endsphinxdirective
 
 In case if you are intended to use OpenVINO GPU plugin and offload network inference to Intel® graphics processor, the Intel Graphics Driver should be properly configured on your system.
 
 If it is already installed, and you want to keep it, you can skip the installation steps.
 
-## Linux
+Linux
+#####
 
-To install the latest available **Intel® Graphics Compute Runtime for OpenCL™** for your OS, see the [Install Guides](https://github.com/intel/compute-runtime/releases/latest).
+To install the latest available **Intel® Graphics Compute Runtime for OpenCL™** for your OS, see the `Install Guides <https://github.com/intel/compute-runtime/releases/latest>`__.
 
-> **NOTE**: If you use RedHat 8 OS please install OpenCL library as prerequisite via following command line:
-> ```sh rpm -ivh http://mirror.centos.org/centos/8-stream/AppStream/x86_64/os/Packages/ocl-icd-2.2.12-1.el8.x86_64.rpm```
+.. note::
+   If you use RedHat 8 OS, install the OpenCL library as a prerequisite via the following command:
+
+   .. code-block:: sh
+
+      rpm -ivh http://mirror.centos.org/centos/8-stream/AppStream/x86_64/os/Packages/ocl-icd-2.2.12-1.el8.x86_64.rpm
 
-> **NOTE**: For instructions specific to discrete graphics platforms, refer to [the dgpu guide](https://dgpu-docs.intel.com/installation-guides/index.html) (Intel® Arc™ A-Series Graphics, Intel® Data Center GPU Flex Series, Intel® Data Center GPU MAX Series, Intel® processor graphics Gen12, and Intel® Iris Xe MAX codename DG1).
+.. note::
+   For instructions specific to discrete graphics platforms, refer to `the dgpu guide <https://dgpu-docs.intel.com/installation-guides/index.html>`__ (Intel® Arc™ A-Series Graphics, Intel® Data Center GPU Flex Series, Intel® Data Center GPU MAX Series, Intel® processor graphics Gen12, and Intel® Iris Xe MAX codename DG1).
 
 You may consider installing one of the earlier versions of the driver, based on your particular setup needs.
 
-It is recommended that you refer to the [Intel® Graphics Compute Runtime Github page](https://github.com/intel/compute-runtime/) for instructions and recommendations on GPU driver installation specific to particular releases, including the list of supported hardware platforms.
-
+It is recommended that you refer to the `Intel® Graphics Compute Runtime Github page <https://github.com/intel/compute-runtime/>`__ for instructions and recommendations on GPU driver installation specific to particular releases, including the list of supported hardware platforms.
 
-@sphinxdirective
 
 .. _gpu guide windows:
 
-@endsphinxdirective
 
-## Windows
+Windows
+#######
 
-To install the Intel Graphics Driver for Windows on your hardware, please proceed with the [instruction](https://www.intel.com/content/www/us/en/support/articles/000005629/graphics.html).
+To install the Intel Graphics Driver for Windows on your hardware, please proceed with the `instruction <https://www.intel.com/content/www/us/en/support/articles/000005629/graphics.html>`__.
 
 To check if you have this driver installed:
 
 1. Type **device manager** in your **Search Windows** box and press Enter. The **Device Manager** opens.
 2. Click the drop-down arrow to view the **Display adapters**. You can see the adapter that is installed in your computer:
-![](../img/DeviceManager.PNG)
+
+   .. image:: _static/images/DeviceManager.PNG
+
 3. Right-click the adapter name and select **Properties**.
 4. Click the **Driver** tab to see the driver version.
-![](../img/DeviceDriverVersion.PNG)
+
+   .. image:: _static/images/DeviceDriverVersion.PNG
 
 You are done updating your device driver and are ready to use your GPU.
 
-## Additional info
+Additional info
+###############
 
 In the internal OpenVINO validation the following versions of Intel Graphics Driver were used:
 
-Operation System | Driver version
---- |-------------------------
-Ubuntu 20.04 | [22.35.24055](https://github.com/intel/compute-runtime/releases/tag/22.35.24055)
-Ubuntu 18.04 | [21.38.21026](https://github.com/intel/compute-runtime/releases/tag/21.38.21026)
-CentOS 7 | [19.41.14441](https://github.com/intel/compute-runtime/releases/tag/19.41.14441)
-RHEL 8 | [22.28.23726](https://github.com/intel/compute-runtime/releases/tag/22.28.23726)
++------------------+-------------------------------------------------------------------------------------+
+| Operating System | Driver version                                                                      |
++==================+=====================================================================================+
+| Ubuntu 20.04     | `22.35.24055 <https://github.com/intel/compute-runtime/releases/tag/22.35.24055>`__ |
++------------------+-------------------------------------------------------------------------------------+
+| Ubuntu 18.04     | `21.38.21026 <https://github.com/intel/compute-runtime/releases/tag/21.38.21026>`__ |
++------------------+-------------------------------------------------------------------------------------+
+| CentOS 7         | `19.41.14441 <https://github.com/intel/compute-runtime/releases/tag/19.41.14441>`__ |
++------------------+-------------------------------------------------------------------------------------+
+| RHEL 8           | `22.28.23726 <https://github.com/intel/compute-runtime/releases/tag/22.28.23726>`__ |
++------------------+-------------------------------------------------------------------------------------+
 
-## What’s Next?
+What’s Next?
+############
 
 You can try out the toolkit with:
 
 Developing in Python:
-  * [Start with tensorflow models with OpenVINO™](https://docs.openvino.ai/latest/notebooks/101-tensorflow-to-openvino-with-output.html)
-  * [Start with ONNX and PyTorch models with OpenVINO™](https://docs.openvino.ai/latest/notebooks/102-pytorch-onnx-to-openvino-with-output.html)
-  * [Start with PaddlePaddle models with OpenVINO™](https://docs.openvino.ai/latest/notebooks/103-paddle-onnx-to-openvino-classification-with-output.html)
+
+* `Start with TensorFlow models with OpenVINO™ <https://docs.openvino.ai/latest/notebooks/101-tensorflow-to-openvino-with-output.html>`__
+* `Start with ONNX and PyTorch models with OpenVINO™ <https://docs.openvino.ai/latest/notebooks/102-pytorch-onnx-to-openvino-with-output.html>`__
+* `Start with PaddlePaddle models with OpenVINO™ <https://docs.openvino.ai/latest/notebooks/103-paddle-onnx-to-openvino-classification-with-output.html>`__
 
 Developing in C++:
-  * [Image Classification Async C++ Sample](@ref openvino_inference_engine_samples_classification_sample_async_README)
-  * [Hello Classification C++ Sample](@ref openvino_inference_engine_samples_hello_classification_README)
-  * [Hello Reshape SSD C++ Sample](@ref openvino_inference_engine_samples_hello_reshape_ssd_README)
+
+* :doc:`Image Classification Async C++ Sample <openvino_inference_engine_samples_classification_sample_async_README>`
+* :doc:`Hello Classification C++ Sample <openvino_inference_engine_samples_hello_classification_README>`
+* :doc:`Hello Reshape SSD C++ Sample <openvino_inference_engine_samples_hello_reshape_ssd_README>`
+
+
+@endsphinxdirective
diff --git a/docs/install_guides/configurations-header.md b/docs/install_guides/configurations-header.md
index 9329534c07024a..3e287fd341177e 100644
--- a/docs/install_guides/configurations-header.md
+++ b/docs/install_guides/configurations-header.md
@@ -11,11 +11,11 @@ For GPU
 For GNA
 
-@endsphinxdirective
-
 After you have installed OpenVINO™ Runtime, you may also need do some additional configurations for your device to work with OpenVINO™. See the following pages:
 
-* [Configurations for GPU](configurations-for-intel-gpu.md)
-* [Configurations for GNA](configurations-for-intel-gna.md)
+* :doc:`Configurations for GPU <configurations-for-intel-gpu>`
+* :doc:`Configurations for GNA <configurations-for-intel-gna>`
+
+@endsphinxdirective
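The Linux section of the patched guide installs the Intel® Graphics Compute Runtime for OpenCL™. One quick way to confirm the installation took effect is to look for registered ICD files, since the OpenCL ICD loader on most Linux distributions discovers runtimes through `/etc/OpenCL/vendors/*.icd`. A minimal sketch, assuming that standard loader path (the exact `.icd` filename varies by distribution and is not specified in this guide):

```python
from pathlib import Path


def registered_icds(vendors_dir="/etc/OpenCL/vendors"):
    """Heuristic check: list OpenCL ICD registrations in the given directory.

    The compute runtime packages typically install an Intel *.icd entry here;
    an empty list suggests no OpenCL runtime is registered.
    """
    path = Path(vendors_dir)
    if not path.is_dir():
        # No ICD registry at all - the OpenCL loader has nothing to discover.
        return []
    return sorted(p.name for p in path.glob("*.icd"))


print(registered_icds())
```

On a correctly configured system the printed list typically includes an Intel entry; an empty list suggests the compute runtime did not register an ICD, in which case revisit the installation steps above.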
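After the driver is configured, the notebooks and samples listed under "What's Next?" assume OpenVINO can actually enumerate the GPU device. A hedged sanity check, assuming the `openvino` Python package (2022.x `openvino.runtime` API) is installed; the function name `gpu_status` is illustrative, not part of any API:

```python
def gpu_status():
    """Return a short status string describing GPU visibility to OpenVINO."""
    try:
        # 2022.x API location; assumes the `openvino` package is installed.
        from openvino.runtime import Core
    except ImportError:
        return "openvino-not-installed"
    core = Core()
    # `available_devices` lists plugin device names such as "CPU", "GPU", "GPU.1".
    if any(d == "GPU" or d.startswith("GPU.") for d in core.available_devices):
        return "gpu-ready"
    return "gpu-missing"


if __name__ == "__main__":
    print(gpu_status())
```

A `gpu-missing` result on a machine with Intel graphics usually points back at the driver configuration steps in this guide rather than at OpenVINO itself.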