Perform a similarity check + inference test after converting to onnx #2585

Closed
joeyballentine opened this issue Feb 15, 2024 · 2 comments · Fixed by #2915

@joeyballentine
Member

The main con of this would be that ONNX inference would then be required for conversion. Maybe it could just be an optional thing if onnx is installed.

Why is this necessary? According to musl:

[Check] Whether the converted ONNX model's outputs are relatively close to the outputs of PyTorch inference. This is not strictly necessary, but the documentation recommends it (np.testing.assert_allclose). It obviously won't change the model, but I found some situations where the converted ONNX gave around a 20% difference between the PyTorch and ONNX outputs, which should not be acceptable. So if this is not tested with assert_allclose, the outputs could be bad while the model still exports just fine.

@Kim2091
Collaborator

Kim2091 commented Apr 21, 2024

Is there any update on this? It'd be great to have. Some users exclusively use neosr's onnx conversion script now, as it verifies the conversion, whereas chaiNNer does not.

@RunDevelopment
Member

https://github.com/muslll/neosr/blob/d871e468fa9e82dc874f65b2289ad5726a5ab3b9/convert.py#L61

This is neosr's conversion btw. Should be easy to implement a similar check.
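
For reference, a minimal sketch of what such a check could look like, assuming onnxruntime is treated as an optional dependency (the function and variable names are placeholders, not chaiNNer's actual API; the tolerances are the ones commonly used with np.testing.assert_allclose in ONNX export examples):

```python
import numpy as np
import torch


def verify_onnx_export(model: torch.nn.Module, onnx_path: str, example_input: torch.Tensor) -> None:
    """Compare PyTorch and ONNX Runtime outputs on the same input."""
    try:
        import onnxruntime as ort  # optional dependency: skip the check if missing
    except ImportError:
        return

    # Reference output from the original PyTorch model.
    model.eval()
    with torch.no_grad():
        torch_out = model(example_input).cpu().numpy()

    # Output from the exported ONNX model via ONNX Runtime.
    session = ort.InferenceSession(onnx_path, providers=["CPUExecutionProvider"])
    input_name = session.get_inputs()[0].name
    onnx_out = session.run(None, {input_name: example_input.cpu().numpy()})[0]

    # Raises an AssertionError if the outputs diverge beyond the tolerances.
    np.testing.assert_allclose(torch_out, onnx_out, rtol=1e-03, atol=1e-05)
```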
