Replies: 1 comment
-
ONNX-TensorRT and Torch-TensorRT are not 1:1. ONNX-TensorRT likely supports more operations natively right now, but: 1. TorchScript IR and ONNX are different, so the TorchScript version of your model may be easier for Torch-TensorRT to support. 2. Torch-TensorRT allows you to partially compile your model, so operations that are not currently natively supported can still run in PyTorch, while the parts that can be compiled to TensorRT will run in TensorRT.
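To make the partial-compilation idea concrete, here is a small illustrative sketch of the fallback concept: split a linear sequence of ops into contiguous segments that either can or cannot be converted, so unsupported segments stay in PyTorch. The op names and the `SUPPORTED` set are hypothetical; this is not Torch-TensorRT's actual partitioner or converter registry.

```python
# Illustrative sketch of the fallback idea behind partial compilation.
# SUPPORTED is a hypothetical converter set, not a real registry.
from itertools import groupby

SUPPORTED = {"conv2d", "relu", "linear"}

def partition(ops):
    """Group consecutive ops by whether a (hypothetical) TensorRT converter exists."""
    segments = []
    for is_supported, group in groupby(ops, key=lambda op: op in SUPPORTED):
        backend = "tensorrt" if is_supported else "pytorch"
        segments.append((backend, list(group)))
    return segments

print(partition(["conv2d", "relu", "custom_op", "linear"]))
# → [('tensorrt', ['conv2d', 'relu']), ('pytorch', ['custom_op']), ('tensorrt', ['linear'])]
```

In the real library you don't write this yourself: the `torch_tensorrt.compile` API exposes knobs such as `torch_executed_ops` and `min_block_size` to control which ops are forced to run in PyTorch and how small a TensorRT segment is allowed to be.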
-
I converted a PyTorch model to ONNX format and tried to run it using the native TensorRT SDK, but it failed because some operators in the model are not supported by the TensorRT SDK. If I use Torch-TensorRT to run the model instead, will I still have the same problem? Does it add any operators compared to the native TensorRT SDK?