
Support multiple batches for exporting ONNX with pre-processing #320

Merged
merged 5 commits into main from onnx-export-batch
Feb 16, 2022

Conversation

@zhiqwang (Owner) commented Feb 16, 2022

The batch-size argument is only used for models that include pre-processing. To export a multi-batch ONNX model, specify the batch size at export time, and make sure the number of input images matches that batch size at inference time.
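For illustration, here is a minimal sketch of the workflow this enables. It assumes yolort's `export_onnx` helper with a `batch_size` parameter as this PR describes; the exact module path, checkpoint path, and input shape below are assumptions, not taken from this PR:

```python
import numpy as np
import onnxruntime as ort

# Assumed helper; the exact module path and signature are not confirmed by this PR.
from yolort.runtime.ort_helper import export_onnx

# Export an ONNX model that bundles pre-processing, fixed to a batch size of 2.
export_onnx(
    onnx_path="yolov5s.onnx",
    checkpoint_path="yolov5s.pt",  # hypothetical checkpoint path
    batch_size=2,                  # the parameter this PR is about
)

# At inference time, the number of input images must equal the exported batch size.
session = ort.InferenceSession("yolov5s.onnx")
inputs = np.random.rand(2, 3, 640, 640).astype(np.float32)  # 2 images == batch_size
outputs = session.run(None, {session.get_inputs()[0].name: inputs})
```

Feeding a different number of images than the exported batch size would fail, since the pre-processing is traced for a fixed batch.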

@zhiqwang added the enhancement (New feature or request) label Feb 16, 2022
@zhiqwang added the deployment (Inference acceleration for production) label Feb 16, 2022
@codecov bot commented Feb 16, 2022

Codecov Report

Merging #320 (2e0ffe3) into main (61a6e62) will increase coverage by 0.27%.
The diff coverage is 100.00%.


@@            Coverage Diff             @@
##             main     #320      +/-   ##
==========================================
+ Coverage   95.10%   95.38%   +0.27%     
==========================================
  Files          11       11              
  Lines         736      759      +23     
==========================================
+ Hits          700      724      +24     
+ Misses         36       35       -1     
Flag        Coverage Δ
unittests   95.38% <100.00%> (+0.27%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files             Coverage Δ
test/test_runtime_ort.py   98.61% <100.00%> (+2.69%) ⬆️


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 61a6e62...2e0ffe3.

@zhiqwang merged commit 724f60e into main Feb 16, 2022
@zhiqwang deleted the onnx-export-batch branch February 16, 2022 05:55