Example of batch inference #1944
I am trying to find an example of performing batched inference. Currently, I am trying shufflenet (in the onnx model zoo), but I get the following. Any help is greatly appreciated.
Comments
The model doesn't support batch inference. Please try another model. You may pick one from: https://github.com/mlperf/inference/tree/master/v0.5/classification_and_detection
Hi @abduld, expanding on the answer: if you open up the model using Netron (or any other tool) and examine the graph inputs, a model that natively supports batching usually has symbolic dimensions in the graph input shape descriptions (for example, the batch dimension shown as a symbolic name rather than a number). A model that doesn't usually has the input shape fixed, like [1, 224, 224, 3]. Closing this issue as this is not an issue for ORT. Please re-open for further clarifications.
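For anyone landing here: below is a minimal sketch of how the input shapes can be checked programmatically with the `onnx` Python package instead of Netron. The model path is a placeholder. A symbolic entry (`dim_param`) in the first dimension suggests the model accepts a batch; a fixed value of 1 means it does not without editing the model.

```python
import onnx

# Placeholder path: point this at your local copy of the model,
# e.g. the ShuffleNet model from the ONNX model zoo.
model = onnx.load("shufflenet.onnx")

# Print each graph input and its shape. A model that supports batching
# typically has a symbolic first dimension (e.g. "N" or "batch_size");
# a fixed value like 1 means it only accepts single-image inputs.
for inp in model.graph.input:
    dims = []
    for d in inp.type.tensor_type.shape.dim:
        # dim_param is set for symbolic dimensions, dim_value for fixed ones
        dims.append(d.dim_param if d.dim_param else d.dim_value)
    print(inp.name, dims)
```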
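And assuming the model does expose a dynamic batch dimension, batched inference with onnxruntime is just a matter of stacking the preprocessed inputs along the first axis. The model path, batch size, and input shape below are illustrative only.

```python
import numpy as np
import onnxruntime as ort

# Placeholder path: any ONNX model whose first input dimension is
# symbolic (dynamic batch) rather than fixed at 1.
session = ort.InferenceSession("model_with_dynamic_batch.onnx")
input_name = session.get_inputs()[0].name

# Stack 8 preprocessed images into one [8, 3, 224, 224] batch
# (NCHW layout assumed; adjust to the model's expected layout).
batch = np.random.rand(8, 3, 224, 224).astype(np.float32)

# A single run call scores the whole batch; outputs keep the batch axis.
outputs = session.run(None, {input_name: batch})
print(outputs[0].shape)  # e.g. (8, 1000) class scores per image
```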