Issue description
OpenVINO fails to convert the ONNX model and throws "Trilu second input must be a scalar".
However, the given ONNX model is valid and executes inference correctly.
We therefore hope OpenVINO can support it rather than crash.
Step-by-step reproduction
import openvino as ov
onnx_model_path = 'Trilu.onnx'
ov_model = ov.convert_model(onnx_model_path)
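As a sanity check that the model itself is runnable (a minimal sketch, assuming onnxruntime is installed; input names, shapes, and dtypes are taken from the session rather than hard-coded):

import numpy as np
import onnxruntime as ort

# Run the same model with ONNX Runtime to confirm it executes inference.
sess = ort.InferenceSession('Trilu.onnx', providers=['CPUExecutionProvider'])
feeds = {}
for inp in sess.get_inputs():
    # Replace any dynamic dimension with 1 for this smoke test.
    shape = [d if isinstance(d, int) else 1 for d in inp.shape]
    dtype = np.int64 if 'int64' in inp.type else np.float32
    feeds[inp.name] = np.random.rand(*shape).astype(dtype)
outputs = sess.run(None, feeds)
print([o.shape for o in outputs])  # expected to complete without error, per the report above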
Relevant log output
Traceback (most recent call last):
File "test.py", line 15, in<module>
ov_model = ov.convert_model(onnx_model_path)
File "C:\software\conda\envs\torch\lib\site-packages\openvino\tools\ovc\convert.py", line 101, in convert_model
ov_model, _ = _convert(cli_parser, params, True)
File "C:\software\conda\envs\torch\lib\site-packages\openvino\tools\ovc\convert_impl.py", line 524, in _convert
raise e
File "C:\software\conda\envs\torch\lib\site-packages\openvino\tools\ovc\convert_impl.py", line 476, in _convert
ov_model = driver(argv, {"conversion_parameters": non_default_params})
File "C:\software\conda\envs\torch\lib\site-packages\openvino\tools\ovc\convert_impl.py", line 226, in driver
ov_model = moc_emit_ir(prepare_ir(argv), argv)
File "C:\software\conda\envs\torch\lib\site-packages\openvino\tools\ovc\convert_impl.py", line 172, in prepare_ir
ov_model = moc_pipeline(argv, moc_front_end)
File "C:\software\conda\envs\torch\lib\site-packages\openvino\tools\ovc\moc_frontend\pipeline.py", line 247, in moc_pipeline
ov_model = moc_front_end.convert(input_model)
File "C:\software\conda\envs\torch\lib\site-packages\openvino\frontend\frontend.py", line 18, in convert
converted_model = super().convert(model)
RuntimeError: Check 'error_message.empty()' failed at src\frontends\onnx\frontend\src\frontend.cpp:124:
Errors during ONNX translation:
Check '(inputs[1].get_partial_shape().compatible({}))' failed at src\frontends\onnx\frontend\src\op\trilu.cpp:30:
While validating ONNX node '<Node(Trilu): y>':
Trilu second input must be a scalar
Issue submission checklist
I'm reporting an issue. It's not a question.
I checked the problem with the documentation, FAQ, open issues, Stack Overflow, etc., and have not found a solution.
There is reproducer code and related data files such as images, videos, models, etc.
According to the Trilu specification (https://github.com/onnx/onnx/blob/main/docs/Changelog.md#Trilu-14), the second input is a 0-D tensor (i.e., a scalar). I think this is a bug either in the software that created such an ONNX model (the PyTorch ONNX exporter, for example) or in the ONNX specification; I am leaning towards the first.
@gkrivor, anyway, let us fix it on the OV side, since the fix should be easy.
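Until the frontend accepts a 1-element 'k', a possible user-side workaround is to rewrite the model so that 'k' is a true rank-0 tensor. This is only a hedged sketch: it assumes 'k' is stored as an initializer of shape [1], which the report does not confirm.

import onnx
from onnx import numpy_helper

# Load the failing model and locate Trilu nodes with an explicit 'k' input.
model = onnx.load('Trilu.onnx')
graph = model.graph
initializers = {init.name: init for init in graph.initializer}

for node in graph.node:
    if node.op_type != 'Trilu' or len(node.input) < 2:
        continue
    k_init = initializers.get(node.input[1])
    # If 'k' is a 1-element 1-D initializer, replace it with a rank-0 tensor.
    if k_init is not None and list(k_init.dims) == [1]:
        scalar_k = numpy_helper.from_array(
            numpy_helper.to_array(k_init).reshape(()), name=k_init.name)
        graph.initializer.remove(k_init)
        graph.initializer.extend([scalar_k])

onnx.save(model, 'Trilu_scalar_k.onnx')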
OpenVINO Version
openvino-nightly 2023.2.0.dev20231101
Operating System
Ubuntu 18.04 (LTS)
Device used for inference
CPU
Framework
ONNX
Model used
https://github.com/jikechao/onnx_models/blob/main/Trilu.onnx