[C++] How to predict fast? #2118
Your model must support batching inherently, and then you can feed it a batched input. For example, if your model's graph input (taking the image case) has shape [1, 3, 224, 224], it can only take one 3-channel image of height and width 224. But if the graph input is ["some_string", 3, 224, 224], i.e. the batch dimension is symbolic rather than fixed, it can handle a batched input of any size. Hope this helps.
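A minimal sketch of what feeding such a batched input looks like with the ONNX Runtime C++ API, assuming a model whose first input dimension is symbolic; the file path and the tensor names "input"/"output" are placeholders, not taken from this thread:

```cpp
#include <onnxruntime_cxx_api.h>
#include <cstdint>
#include <vector>

int main() {
  Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "batched");
  Ort::SessionOptions opts;
  // Hypothetical model path; on Windows the constructor takes a wide string.
  Ort::Session session(env, "model_with_dynamic_batch.onnx", opts);

  // With a symbolic batch dimension, any batch size works at runtime.
  const int64_t batch = 8;
  std::vector<int64_t> shape{batch, 3, 224, 224};
  std::vector<float> data(batch * 3 * 224 * 224, 0.0f);  // fill with real pixels

  Ort::MemoryInfo mem =
      Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);
  Ort::Value input = Ort::Value::CreateTensor<float>(
      mem, data.data(), data.size(), shape.data(), shape.size());

  // Hypothetical graph node names; query the session for the real ones.
  const char* in_names[] = {"input"};
  const char* out_names[] = {"output"};
  auto outputs = session.Run(Ort::RunOptions{nullptr},
                             in_names, &input, 1, out_names, 1);
  // outputs[0] now holds the results for all `batch` images from one Run call.
}
```

One Run call over a batch usually amortizes per-call overhead better than looping over single images, which is the point of the advice above.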
[C++] After `session_` loads `aa_batch.onnx`:
Can inference also be sped up by simply changing the model to a dynamic input, without modifying the output node and without batching the input data? Hope to hear from you.
This is really slow. Is there some method to make this faster (using C++)?
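Not an answer from the maintainers, but a minimal sketch of ONNX Runtime session options that commonly reduce latency, assuming CPU execution; the thread count here is an arbitrary example, not a recommendation from this thread:

```cpp
#include <onnxruntime_cxx_api.h>

Ort::SessionOptions MakeFastOptions() {
  Ort::SessionOptions opts;
  // Enable all graph-level optimizations (constant folding, node fusions, ...).
  opts.SetGraphOptimizationLevel(GraphOptimizationLevel::ORT_ENABLE_ALL);
  // Use several threads for intra-op parallelism; 4 is an arbitrary example
  // and should be tuned to the machine and model.
  opts.SetIntraOpNumThreads(4);
  return opts;
}
```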