Unable to create an onnxruntime inference session from an ONNX-exported DDRNet-23-slim model on GPU. Can you provide some support related to this?
Getting the following error:
InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Invalid model. Node input '449' is not a graph input, initializer, or output of a previous node.
Did you find a solution for this? It seems that it has something to do with the inputs that were marked as optional in the source model's structure.