
Model optimizer from ONNX error: old and new shapes do not match #133

Closed
wuhy08 opened this issue Apr 25, 2019 · 14 comments
wuhy08 commented Apr 25, 2019

Hi,

I am trying to convert my ONNX model using mo.py and it throws the error shown below.

My onnx model is attached here.

Not sure if it is due to the Slice op.

PS: There were some issues with Upsample (#93 (comment)). I fixed those by manually editing the node in the onnx model.

[ 2019-04-25 16:05:32,070 ] [ DEBUG ] [ main:133 ]  Namespace(batch=None, counts=None, data_type='float', disable_fusing=False, disable_gfusing=False, disable_nhwc_to_nchw=False, disable_omitting_optional=False, disable_resnet_optimization=False, enable_concat_optimization=False, enable_flattening_nested_params=False, extensions='/opt/intel/openvino_2019.1.094/deployment_tools/model_optimizer/extensions', finegrain_fusing=None, framework='onnx', freeze_placeholder_with_value=None, generate_deprecated_IR_V2=False, input=None, input_checkpoint=None, input_meta_graph=None, input_model='/home/wd-ai/Downloads/yolov3_manual_np_tile.onnx', input_model_is_text=False, input_proto=None, input_shape=None, input_symbol=None, k='/opt/intel/openvino/deployment_tools/model_optimizer/extensions/front/caffe/CustomLayersMapping.xml', keep_shape_ops=False, legacy_mxnet_model=False, log_level='DEBUG', mean_file=None, mean_file_offsets=None, mean_values=(), model_name=None, move_to_preprocess=False, nd_prefix_name=None, output=None, output_dir='/home/wd-ai/openvino_models/yolov3/.', pretrained_model_name=None, remove_output_softmax=False, reverse_input_channels=False, save_params_from_nd=False, saved_model_dir=None, saved_model_tags=None, scale=None, scale_values=(), silent=False, tensorboard_logdir=None, tensorflow_custom_layer_libraries=None, tensorflow_custom_operations_config_update=None, tensorflow_object_detection_api_pipeline_config=None, tensorflow_operation_patterns=None, tensorflow_subgraph_patterns=None, tensorflow_use_custom_operations_config=None, version=False)
[ 2019-04-25 16:05:32,070 ] [ DEBUG ] [ main:134 ]  Model Optimizer started
[ 2019-04-25 16:05:32,070 ] [ DEBUG ] [ main:148 ]  Output model name would be yolov3_manual_np_tile{.xml, .bin}
Model Optimizer arguments:
Common parameters:
	- Path to the Input Model: 	/home/wd-ai/Downloads/yolov3_manual_np_tile.onnx
	- Path for generated IR: 	/home/wd-ai/openvino_models/yolov3/.
	- IR output name: 	yolov3_manual_np_tile
	- Log level: 	DEBUG
	- Batch: 	Not specified, inherited from the model
	- Input layers: 	Not specified, inherited from the model
	- Output layers: 	Not specified, inherited from the model
	- Input shapes: 	Not specified, inherited from the model
	- Mean values: 	Not specified
	- Scale values: 	Not specified
	- Scale factor: 	Not specified
	- Precision of IR: 	FP32
	- Enable fusing: 	True
	- Enable grouped convolutions fusing: 	True
	- Move mean values to preprocess section: 	False
	- Reverse input channels: 	False
ONNX specific parameters:
Model Optimizer version: 	2019.1.0-341-gc9b66a2
[ 2019-04-25 16:05:32,158 ] [ DEBUG ] [ main:234 ]  Placeholder shapes : None
[ INFO ]  Importing extensions from: /opt/intel/openvino_2019.1.094/deployment_tools/model_optimizer/mo
[ WARNING ]  Skipped <class 'extensions.middle.AddQuantizeFuse.AddQuantizeFuse'> registration because it was already registered or it was disabled. 
[ INFO ]  New subclass: <class 'extensions.middle.EltwiseInputReshape.Eltwise1DInputReshape'>
[ INFO ]  New subclass: <class 'extensions.middle.EltwiseInputReshape.EltwiseInputReshape'>
[ WARNING ]  Skipped <class 'extensions.middle.EltwiseInputNormalization.EltwiseInputNormalize'> registration because it was already registered or it was disabled. 
[ WARNING ]  Skipped <class 'extensions.middle.MulQuantizeFuse.MulQuantizeFuse'> registration because it was already registered or it was disabled. 
[ INFO ]  New subclass: <class 'mo.front.common.replacement.FrontReplacementSubgraph'>
[ INFO ]  New subclass: <class 'mo.front.common.replacement.FrontReplacementOp'>
[ INFO ]  Registered a new subclass with key: UnknownOp

...
...
...
[ 2019-04-25 15:56:20,897 ] [ DEBUG ] [ class_registration:164 ]  Run replacer <class 'extensions.middle.PixelLinkReshape.PixelLinkReshape'>
[ 2019-04-25 15:56:21,006 ] [ DEBUG ] [ class_registration:164 ]  Run replacer <class 'extensions.middle.TF_lstm_cell_to_generic.TensorFlowLSTMtoGeneric'>
[ 2019-04-25 15:56:21,027 ] [ DEBUG ] [ class_registration:164 ]  Run replacer <class 'extensions.middle.Reduce.ReduceReplacer'>
[ 2019-04-25 15:56:21,051 ] [ DEBUG ] [ class_registration:164 ]  Run replacer <class 'extensions.middle.SliceConverter.ConvertSlice'>
[ 2019-04-25 15:56:21,061 ] [ DEBUG ] [ op:204 ]  Start running infer function for individual op node with attributes: {'op': 'Const', 'precision': 'FP32', 'type': 'Const', 'data_type': None, '_in_ports': set(), 'value': '[0 0 0 0 4]', 'infer': <function tf_const_infer at 0x7f3cb8f68e18>, 'name': 'begin', 'force_precision': 'I32', '_out_ports': {0}, 'shape': array([5]), 'IE': [('layer', [('id', <function Op.substitute_ie_attrs.<locals>.<lambda> at 0x7f3cb80a8bf8>), 'name', 'precision', 'type'], [('data', [], []), '@ports', '@consts'])], 'out_ports_count': 1, 'kind': 'op', 'shape_attrs': ['pad', 'stride', 'shape', 'output_shape', 'window'], 'dim_attrs': ['batch_dims', 'channel_dims', 'axis', 'spatial_dims']}
[ 2019-04-25 15:56:21,061 ] [ DEBUG ] [ op:217 ]  Finished running infer function, data nodes attributes: {'name': '6187', 'dim_attrs': ['batch_dims', 'channel_dims', 'axis', 'spatial_dims'], '_out_ports': set(), 'shape': array([5]), 'precision': 'FP32', 'value': '[0 0 0 0 4]', 'kind': 'data', 'data_type': None, '_in_ports': set(), 'shape_attrs': ['pad', 'stride', 'shape', 'output_shape', 'window'], 'infer': None}
[ 2019-04-25 15:56:21,062 ] [ DEBUG ] [ op:204 ]  Start running infer function for individual op node with attributes: {'op': 'Const', 'precision': 'FP32', 'type': 'Const', 'data_type': None, '_in_ports': set(), 'value': '[0 0 0 0 0]', 'infer': <function tf_const_infer at 0x7f3cb8f68e18>, 'name': 'end', 'force_precision': 'I32', '_out_ports': {0}, 'shape': array([5]), 'IE': [('layer', [('id', <function Op.substitute_ie_attrs.<locals>.<lambda> at 0x7f3cb80a8b70>), 'name', 'precision', 'type'], [('data', [], []), '@ports', '@consts'])], 'out_ports_count': 1, 'kind': 'op', 'shape_attrs': ['pad', 'stride', 'shape', 'output_shape', 'window'], 'dim_attrs': ['batch_dims', 'channel_dims', 'axis', 'spatial_dims']}
[ 2019-04-25 15:56:21,062 ] [ DEBUG ] [ op:217 ]  Finished running infer function, data nodes attributes: {'name': '6189', 'dim_attrs': ['batch_dims', 'channel_dims', 'axis', 'spatial_dims'], '_out_ports': set(), 'shape': array([5]), 'precision': 'FP32', 'value': '[0 0 0 0 0]', 'kind': 'data', 'data_type': None, '_in_ports': set(), 'shape_attrs': ['pad', 'stride', 'shape', 'output_shape', 'window'], 'infer': None}
[ 2019-04-25 15:56:21,064 ] [ DEBUG ] [ op:204 ]  Start running infer function for individual op node with attributes: {'end_mask': array([0, 0, 0, 0, 0], dtype=int32), 'out_ports_count': 1, 'precision': 'FP32', 'type': 'StridedSlice', 'in_ports_count': 4, 'shape_attrs': ['pad', 'stride', 'shape', 'output_shape', 'window'], 'new_axis_mask': array([0, 0, 0, 0, 0], dtype=int32), 'dim_attrs': ['batch_dims', 'channel_dims', 'axis', 'spatial_dims'], 'infer': <function StridedSlice.infer at 0x7f3cb88b07b8>, 'name': '6190', 'ellipsis_mask': array([0, 0, 0, 0, 0], dtype=int32), '_out_ports': {0}, 'shrink_axis_mask': array([0, 0, 0, 0, 0], dtype=int32), 'IE': [('layer', [('id', <function Op.substitute_ie_attrs.<locals>.<lambda> at 0x7f3cb80a8c80>), 'name', 'precision', 'type'], [('data', [('new_axis_mask', <function StridedSlice.backend_attrs.<locals>.convert.<locals>.<lambda> at 0x7f3cb809b730>), ('shrink_axis_mask', <function StridedSlice.backend_attrs.<locals>.convert.<locals>.<lambda> at 0x7f3cb809b400>), ('ellipsis_mask', <function StridedSlice.backend_attrs.<locals>.convert.<locals>.<lambda> at 0x7f3cb809b1e0>), ('begin_mask', <function StridedSlice.backend_attrs.<locals>.convert.<locals>.<lambda> at 0x7f3cb809b0d0>), ('end_mask', <function StridedSlice.backend_attrs.<locals>.convert.<locals>.<lambda> at 0x7f3cb809b950>)], []), '@ports', '@consts'])], '_in_ports': {0, 1, 2, 3}, 'op': 'StridedSlice', 'kind': 'op', 'begin_mask': array([0, 0, 0, 0, 1], dtype=int32)}
[ ERROR ]  -------------------------------------------------
[ ERROR ]  ----------------- INTERNAL ERROR ----------------
[ ERROR ]  Unexpected exception happened.
[ ERROR ]  Please contact Model Optimizer developers and forward the following information:
[ ERROR ]  Exception occurred during running replacer "REPLACEMENT_ID (<class 'extensions.middle.SliceConverter.ConvertSlice'>)": After re-inference of 6190 node, old and new shapes do not match. Old shapes: [array([ 1,  3, 19, 19,  1])], new shapes: [array([ 1,  3, 19, 19, 81])].
[ ERROR ]  Traceback (most recent call last):
  File "/opt/intel/openvino_2019.1.094/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 167, in apply_replacements
    replacer.find_and_replace_pattern(graph)
  File "/opt/intel/openvino_2019.1.094/deployment_tools/model_optimizer/mo/utils/replacement_pattern.py", line 29, in find_and_replace_pattern
    apply_pattern(graph, **self.pattern(), action=self.replace_pattern)  # pylint: disable=no-member
  File "/opt/intel/openvino_2019.1.094/deployment_tools/model_optimizer/mo/middle/pattern_match.py", line 95, in apply_pattern
    action(graph, match)
  File "/opt/intel/openvino_2019.1.094/deployment_tools/model_optimizer/extensions/middle/SliceConverter.py", line 100, in replace_pattern
    ss.create_node_with_data(inputs=[input, begin_node, end_node], data_nodes=[output_data])
  File "/opt/intel/openvino_2019.1.094/deployment_tools/model_optimizer/mo/ops/op.py", line 213, in create_node_with_data
    [data_node.shape for data_node in data_nodes])
AssertionError: After re-inference of 6190 node, old and new shapes do not match. Old shapes: [array([ 1,  3, 19, 19,  1])], new shapes: [array([ 1,  3, 19, 19, 81])].

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/intel/openvino_2019.1.094/deployment_tools/model_optimizer/mo/main.py", line 312, in main
    return driver(argv)
  File "/opt/intel/openvino_2019.1.094/deployment_tools/model_optimizer/mo/main.py", line 289, in driver
    ret_res = mo_onnx.driver(argv, argv.input_model, model_name, argv.output_dir)
  File "/opt/intel/openvino_2019.1.094/deployment_tools/model_optimizer/mo/pipeline/onnx.py", line 87, in driver
    class_registration.apply_replacements(graph, class_registration.ClassType.MIDDLE_REPLACER)
  File "/opt/intel/openvino_2019.1.094/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 190, in apply_replacements
    )) from err
Exception: Exception occurred during running replacer "REPLACEMENT_ID (<class 'extensions.middle.SliceConverter.ConvertSlice'>)": After re-inference of 6190 node, old and new shapes do not match. Old shapes: [array([ 1,  3, 19, 19,  1])], new shapes: [array([ 1,  3, 19, 19, 81])].

[ ERROR ]  ---------------- END OF BUG REPORT --------------
[ ERROR ]  -------------------------------------------------
wuhy08 commented Apr 26, 2019

After some investigation, it seems the error pops up at this line... to be continued

https://github.com/opencv/dldt/blob/b235c73481a6f133e1d182d91e5e2c29e8a460e5/model-optimizer/extensions/middle/SliceConverter.py#L100

@shubha-ramani

Dearest @wuhy08

Thanks for attaching your onnx model. I downloaded it myself and reproduced your error on OpenVINO 2019 R1.0.1 (a new version has just been released). Yes, I remember the Upsample issue you mention, but I didn't see it in this new release, so it's been fixed.

I will debug your issue and post results here.

Thanks for your patience and thank you for using OpenVino !

Shubha

@shubha-ramani

Dearest @wuhy08

Where did you get this model? Or did you build it yourself?

Thanks,

Shubha


wuhy08 commented Apr 26, 2019

Hi @shubha-ramani
I built the model on my own. I manually changed the Upsample op to the ONNX opset-7 form using the ONNX helper, so mo.py wouldn't complain when it met Upsample. But it complains about Slice.
Best,

Haoyu

@shubha-ramani

Dear @wuhy08, ok ok. So the Upsample thing was not magically fixed yet (you fixed it). Got it. Does this model actually work? I mean, does it successfully perform inference outside of OpenVINO?

I'm still investigating...thanks for your patience !

Shubha


wuhy08 commented Apr 26, 2019

Hi @shubha-ramani. The model was converted from PyTorch. Inference in PyTorch works as expected. I haven't tested whether it works as an ONNX model. I see that in the ONNX file some Slice ops have the attribute "ends = 2^63-1", since I used t[begin:] in PyTorch to do the slicing. Not sure if that causes the problem. I am modifying the model now. I will let you know the result soon.
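For context: 2^63-1 is INT64_MAX, the sentinel the exporter writes for an open-ended slice like t[begin:]. A converter needs to clamp it to the dimension size before inferring the output shape. A minimal sketch of that normalization (my own illustration, not the Model Optimizer code):

```python
INT64_MAX = 2**63 - 1  # sentinel exporters write for "slice to the end"

def slice_extent(begin: int, end: int, dim: int) -> int:
    """Length of the 1-D slice [begin:end) over a dimension of size dim,
    clamping the INT64_MAX sentinel and resolving negative indices."""
    if begin < 0:
        begin += dim
    if end >= INT64_MAX:
        end = dim
    elif end < 0:
        end += dim
    end = min(end, dim)
    return max(end - begin, 0)

# An 85-channel YOLO head sliced as t[..., 4:] keeps 85 - 4 = 81 channels,
# matching the "new shape" [1, 3, 19, 19, 81] in the log above.
assert slice_extent(4, INT64_MAX, 85) == 81
```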


wuhy08 commented Apr 26, 2019

@shubha-ramani

I modified the model and it gave out the same error. For your reference, I have posted my modified model here.

Note the differences in some of the Slice ops. (I used onnx.helper.printable_graph).

BTW, if you use onnx.checker.check_model, it will throw an error, due to my manual change of the Upsample op.


wuhy08 commented Apr 26, 2019

@shubha-ramani
I think I have found the bug. The error happens when the SliceConverter fails to parse the slice end. I have submitted a PR: see #136
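In other words, the shape recorded on the output data node disagrees with what the op's infer function recomputes. The failing assertion boils down to something like this (a pure-Python mirror of the check, for illustration only):

```python
import numpy as np

def check_reinferred_shapes(old_shapes, new_shapes, node_name):
    """Mirror of the consistency check that fires in create_node_with_data:
    after re-inference, each output shape must equal the recorded one."""
    for old, new in zip(old_shapes, new_shapes):
        if not np.array_equal(old, new):
            raise AssertionError(
                f"After re-inference of {node_name} node, old and new shapes do "
                f"not match. Old shapes: {old_shapes}, new shapes: {new_shapes}.")

# With the shapes from the log above, the check fires:
# check_reinferred_shapes([np.array([1, 3, 19, 19, 1])],
#                         [np.array([1, 3, 19, 19, 81])], '6190')
```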

@shubha-ramani

Dear @wuhy08
Wow! Congratulations! I was going to work on it later this evening, but you beat me to it. We appreciate it. I will contact the developer responsible for this area of the code immediately. Thanks for using OpenVINO!

Shubha

@shubha-ramani

Dear @wuhy08
The fix is already in the master branch and will be available with the 2019 R2 release. So your PR will likely be ignored, since the developers already know about this bug.

Thanks for using OpenVino !

Shubha


wuhy08 commented Apr 28, 2019

@shubha-ramani

Glad to hear that. I am gonna close the issue and revoke access to my files now. Have a great Sunday!

wuhy08 closed this as completed Apr 28, 2019
@Fighting-JJ

Thanks to you, @wuhy08 @shubha-ramani.
I met the same problem while converting ONNX to an IR model, with the error:

Exception: Exception occurred during running replacer "REPLACEMENT_ID (<class 'extensions.middle.SliceConverter.ConvertSlice'>)"

There are two things I want to know:
1. What did you do to get the IR model converted correctly? @wuhy08
2. When will the 2019 R2 version be released? @shubha-ramani

@Fighting-JJ

I have found the changed code according to #136. Thanks a lot, awesome!

@shubha-ramani

Dearest @Fighting-JJ
Awesome indeed! Glad you found the changed code in the pull request.

Thanks,

Shubha

eshoguli pushed a commit to eshoguli/openvino that referenced this issue Jun 1, 2021
riverlijunjie added a commit that referenced this issue Jun 6, 2022
rengolin added a commit to rengolin/openvino that referenced this issue Jul 19, 2024