
Add similarity check to ONNX model conversions #2915

Merged
merged 2 commits on May 30, 2024

Conversation

RunDevelopment
Member

@RunDevelopment RunDevelopment commented May 28, 2024

Closes #2585.

This adds a new check to ensure that ONNX models converted from PyTorch models produce the same outputs. The new check is optional and off by default, because it requires an ONNX runtime.

I also cleaned up some other code along the way.

To the reviewer: please test this. I don't really do ONNX conversions, so I just tested it on a few models and it seemed to work.
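The core idea of the verification can be sketched as follows. This is a minimal, hypothetical illustration using NumPy only, not chaiNNer's actual implementation; the function name `outputs_match` and the tolerance values are assumptions for the sketch:

```python
import numpy as np

def outputs_match(pytorch_out: np.ndarray, onnx_out: np.ndarray, fp16: bool) -> bool:
    """Compare the PyTorch and ONNX outputs within a tolerance.

    The tolerance for fp16 must be looser, since half precision only
    carries ~3 decimal digits. These thresholds are illustrative, not
    the values used by chaiNNer.
    """
    atol = 1e-2 if fp16 else 1e-4
    return bool(np.allclose(pytorch_out.astype(np.float32),
                            onnx_out.astype(np.float32), atol=atol))

# Identical tensors pass; a large perturbation fails.
a = np.random.rand(1, 3, 8, 8).astype(np.float32)
print(outputs_match(a, a, fp16=False))        # True
print(outputs_match(a, a + 1.0, fp16=False))  # False
```

In the real node, `pytorch_out` would come from running the model in PyTorch and `onnx_out` from an `onnxruntime.InferenceSession` fed the same input.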

@joeyballentine
Member

Errors when running in fp16 mode:


Full stack trace:


An error occurred in a Convert To ONNX node:

[ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Unexpected input data type. Actual: (tensor(float)) , expected: (tensor(float16))

Input values:
• PyTorch Model: Value of type 'spandrel.__helpers.model_descriptor.ImageModelDescriptor'
• Data Type: fp16
• Opset: 17
• Verify: Yes

Stack Trace:
Traceback (most recent call last):
  File "C:\Users\joeyj\Documents\Git\chaiNNer\backend\src\process.py", line 174, in run_node
    raw_output = node.run(context, *enforced_inputs)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\joeyj\Documents\Git\chaiNNer\backend\src\packages\chaiNNer_pytorch\pytorch\utility\convert_to_onnx.py", line 113, in convert_to_onnx_node
    verify_models(model, onnx_model_bytes, fp16)
  File "C:\Users\joeyj\Documents\Git\chaiNNer\backend\src\packages\chaiNNer_pytorch\pytorch\utility\convert_to_onnx.py", line 132, in verify_models
    onnx_out = session.run(
               ^^^^^^^^^^^^
  File "C:\Users\joeyj\AppData\Local\Programs\Python\Python311\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 220, in run
    return self._sess.run(output_names, input_feed, run_options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Unexpected input data type. Actual: (tensor(float)) , expected: (tensor(float16))
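The error indicates that a float32 array was fed to an ONNX session whose input was exported as float16. The likely fix is to cast the input to the dtype the session expects before calling `session.run`. A minimal sketch, assuming a hypothetical helper `prepare_input` (not the actual code in `convert_to_onnx.py`):

```python
import numpy as np

def prepare_input(img: np.ndarray, fp16: bool) -> np.ndarray:
    """Cast the input tensor to the dtype the exported ONNX model expects.

    An fp16 export declares its input as tensor(float16), so feeding
    float32 data raises INVALID_ARGUMENT, as in the traceback above.
    """
    return img.astype(np.float16 if fp16 else np.float32)

x = np.random.rand(1, 3, 8, 8)  # float64 by default
print(prepare_input(x, fp16=True).dtype)   # float16
print(prepare_input(x, fp16=False).dtype)  # float32
```

With a live session, the expected element type could instead be read from `session.get_inputs()[0].type` rather than tracked via a flag.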

@RunDevelopment
Member Author

Okay, I think it should work now.

@joeyballentine
Member

(screenshot: the new check reporting an error)

Looks like it works properly when there's a mismatch.

@joeyballentine joeyballentine merged commit e7cf591 into chaiNNer-org:main May 30, 2024
14 checks passed
@RunDevelopment RunDevelopment deleted the onnx-infer-test branch May 30, 2024 12:01
Successfully merging this pull request may close these issues.

Perform a similarity check + inference test after converting to onnx