
Commit

Merge upstream/master
Mikhail Treskin committed Sep 10, 2020
2 parents 7b06f04 + ef2581d commit f647cda
Showing 357 changed files with 16,549 additions and 7,425 deletions.
33 changes: 30 additions & 3 deletions .github/workflows/code_style.yml
@@ -31,10 +31,37 @@ jobs:
if: failure()
run: |
ngraph/maint/apply-code-format.sh
git diff >code_style_diff.patch
git diff >ngraph_code_style_diff.patch
- uses: actions/upload-artifact@v2
if: failure()
with:
name: code_style_diff
path: code_style_diff.patch
name: ngraph_code_style_diff
path: ngraph_code_style_diff.patch

Java:
runs-on: ubuntu-18.04
steps:
- uses: actions/checkout@v2
- uses: actions/setup-java@v1
with:
java-version: '11'

- name: Install dependencies
run: |
wget -nc https://github.com/google/google-java-format/releases/download/google-java-format-1.9/google-java-format-1.9-all-deps.jar
- name: Check code style
run: |
java -jar google-java-format-1.9-all-deps.jar --set-exit-if-changed -a -i $(find . -type f -name "*.java")
- name: Create code style diff
if: failure()
run: |
git diff >java_code_style_diff.patch
- uses: actions/upload-artifact@v2
if: failure()
with:
name: java_code_style_diff
path: java_code_style_diff.patch
3 changes: 3 additions & 0 deletions CODEOWNERS
@@ -13,6 +13,9 @@ azure-pipelines.yml @openvinotoolkit/openvino-admins
# QA Tests:
/tests/ @openvinotoolkit/openvino-tests-maintainers

# OpenVINO Scripts
/scripts/ @openvinotoolkit/openvino-admins @openvinotoolkit/openvino-scripts-maintainers

# IE Core:
/inference-engine/ @openvinotoolkit/openvino-ie-maintainers
/inference-engine/ie_bridges/python @openvinotoolkit/openvino-ie-python-api-maintainers
2 changes: 1 addition & 1 deletion docs/IE_DG/Integrate_with_customer_application_new_API.md
@@ -36,7 +36,7 @@ InferenceEngine::Core core;
```cpp
auto network = core.ReadNetwork("Model.xml");
```
**Or read the model from ONNX format** (.onnx and .prototxt are supported formats). You can find more information about the ONNX format support in the document [ONNX format support in the OpenVINO™](./ONNX_Supported_Ops.md).
**Or read the model from ONNX format** (.onnx and .prototxt are supported formats). You can find more information about the ONNX format support in the document [ONNX format support in the OpenVINO™](./ONNX_Support.md).
```cpp
auto network = core.ReadNetwork("model.onnx");
```
2 changes: 1 addition & 1 deletion docs/IE_DG/Introduction.md
@@ -117,7 +117,7 @@ Please refer to the [Overview of nGraph Flow](nGraph_Flow.md) describing the det

Inference Engine is a runtime that delivers a unified API to integrate the inference with application logic:

* Takes a model as an input. The model can be presented in [the native ONNX format](./ONNX_Supported_Ops.md) or in the specific form of [Intermediate Representation (IR)](../MO_DG/IR_and_opsets.md)
* Takes a model as an input. The model can be presented in [the native ONNX format](./ONNX_Support.md) or in the specific form of [Intermediate Representation (IR)](../MO_DG/IR_and_opsets.md)
produced by Model Optimizer.
* Optimizes inference execution for target hardware.
* Delivers inference solution with reduced footprint on embedded inference platforms.
2 changes: 1 addition & 1 deletion docs/IE_DG/Migration_CoreAPI.md
@@ -45,7 +45,7 @@ read networks using the Core class:
```cpp
CNNNetwork network = core.ReadNetwork(input_model);
```
The Core class also allows reading models from the ONNX format (more information is [here](./ONNX_Supported_Ops.md)):
The Core class also allows reading models from the ONNX format (more information is [here](./ONNX_Support.md)):
```cpp
CNNNetwork network = core.ReadNetwork("model.onnx");
```
19 changes: 19 additions & 0 deletions docs/IE_DG/ONNX_Support.md
@@ -0,0 +1,19 @@
# ONNX format support in OpenVINO™ {#openvino_docs_IE_DG_ONNX_Support}

Starting from the 2020.4 release, OpenVINO™ supports reading native ONNX models.
The `Core::ReadNetwork()` method provides a uniform way to read models in either the IR or ONNX format and is the recommended approach to reading models.
Example:

```cpp
InferenceEngine::Core core;
auto network = core.ReadNetwork("model.onnx");
```

OpenVINO™ doesn't provide a mechanism to specify pre-processing (such as mean value subtraction or reversing input channels) for the ONNX format.
If an ONNX model contains dynamic shapes for input, please use the `CNNNetwork::reshape` method for shape specialization.

Unsupported types of tensors:

* `string`,
* `complex64`,
* `complex128`.
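
The reshape guidance above can be sketched as follows. This is a minimal, hypothetical example, not part of the commit: the model file name and the assumption that the batch is the first dimension of every input are illustrative, and it uses the `CNNNetwork::getInputShapes()`/`reshape()` API from the Inference Engine of that era.

```cpp
#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;

    // Read an ONNX model directly; "model.onnx" is a placeholder path.
    InferenceEngine::CNNNetwork network = core.ReadNetwork("model.onnx");

    // Collect the current input shapes and pin each batch dimension to 1,
    // assuming the batch is the first dimension (an illustrative assumption).
    InferenceEngine::ICNNNetwork::InputShapes shapes = network.getInputShapes();
    for (auto& item : shapes) {
        item.second[0] = 1;
    }

    // Specialize any dynamic input shapes before compiling the network.
    network.reshape(shapes);
    return 0;
}
```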
215 changes: 0 additions & 215 deletions docs/IE_DG/ONNX_Supported_Ops.md

This file was deleted.

@@ -109,7 +109,6 @@ Framework-agnostic parameters:
--disable_gfusing Turn off fusing of grouped convolutions
--enable_concat_optimization
Turn on Concat optimization.
--move_to_preprocess Move mean values to IR preprocess section
--extensions EXTENSIONS
Directory or a comma separated list of directories
with extensions. To disable all extensions including
2 changes: 1 addition & 1 deletion docs/doxygen/ie_docs.xml
@@ -267,7 +267,7 @@
<tab type="user" title="Introduction to Performance Topics" url="@ref openvino_docs_IE_DG_Intro_to_Performance"/>
<tab type="user" title="Inference Engine Python* API Overview" url="@ref openvino_inference_engine_ie_bridges_python_docs_api_overview"/>
<tab type="user" title="Build a Model with nGraph" url="@ref openvino_docs_IE_DG_nGraphTutorial"/>
<tab type="user" title="Read an ONNX model" url="@ref openvino_docs_IE_DG_ONNX_Supported_Ops"/>
<tab type="user" title="Read an ONNX model" url="@ref openvino_docs_IE_DG_ONNX_Support"/>
<tab type="user" title="[DEPRECATED] Import an ONNX model" url="@ref openvino_docs_IE_DG_OnnxImporterTutorial"/>
<tab type="user" title="Graph Debug Capabilities" url="@ref openvino_docs_IE_DG_Graph_debug_capabilities"/>
<tab type="user" title="Using Dynamic Batching Feature" url="@ref openvino_docs_IE_DG_DynamicBatching"/>
3 changes: 0 additions & 3 deletions docs/ops/opset4.md
@@ -62,7 +62,6 @@ declared in `namespace opset4`.
* [GroupConvolution](convolution/GroupConvolution_1.md)
* [GroupConvolutionBackpropData](convolution/GroupConvolutionBackpropData_1.md)
* [GRUCell](sequence/GRUCell_3.md)
* [GRUSequence](sequence/GRUSequence_4.md)
* [HardSigmoid](activation/HardSigmoid_1.md)
* [HSwish](activation/HSwish_4.md)
* [Interpolate](image/Interpolate_4.md)
@@ -75,7 +74,6 @@ declared in `namespace opset4`.
* [LogicalXor](logical/LogicalXor_1.md)
* [LRN](normalization/LRN_1.md)
* [LSTMCell](sequence/LSTMCell_1.md)
* [LSTMSequence](sequence/LSTMSequence_1.md)
* [MatMul](matrix/MatMul_1.md)
* [MaxPool](pooling/MaxPool_1.md)
* [Maximum](arithmetic/Maximum_1.md)
@@ -117,7 +115,6 @@ declared in `namespace opset4`.
* [Reverse](movement/Reverse_1.md)
* [ReverseSequence](movement/ReverseSequence_1.md)
* [RNNCell](sequence/RNNCell_3.md)
* [RNNSequence](sequence/RNNSequence_4.md)
* [ROIAlign](detection/ROIAlign_3.md)
* [ROIPooling](detection/ROIPooling_1.md)
* [ScatterElementsUpdate](movement/ScatterElementsUpdate_3.md)
