Loading TorchScript model fails for Triton in DeepStream #2317
Comments
Please share your model repository structure. It looks like, instead of using numeric version subdirectories for the model, you placed 'mymodel' directly in the 'model_directory'. Please refer to the instructions here and reopen if needed.
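For context, Triton expects each model to live in its own directory containing a config file and one or more numbered version subdirectories, with the TorchScript file inside the version directory. A minimal sketch of the expected layout (the model and file names here are illustrative):

```
model_repository/
└── mymodel/
    ├── config.pbtxt
    └── 1/
        └── model.pt
```

Placing `model.pt` directly under `mymodel/` without a version directory like `1/` is a common cause of load failures.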
@CoderHam I mount my model repo into the container. My model repo looks like this:
If it helps, my DeepStream config is
(Also, I am unable to reopen this issue as I am not a collaborator on this repo, so I do hope you see this)
Could you try running the model directly inside the standalone Triton server?
@rbrigden Since this appears to be a DeepStream-related issue, I've spoken to their team and they recommended that you post your issue on the DeepStream SDK Board.
Thank you @CoderHam and @msalehiNV. I haven't yet had a chance to test on the standalone Triton server, but will do that soon. I'll post an update here as well as make a post on the DeepStream SDK Board.
Description
I am trying to load a successfully exported TorchScript model in the Triton inference server that is packaged with DeepStream 5.0. Unfortunately I receive this error:
Issues filed in PyTorch suggest this is caused by mismatched PyTorch versions between the export environment and the runtime: example
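As a quick sanity check for this class of failure, the PyTorch version used to export the model can be compared against the runtime version; TorchScript archives generally need a matching (or newer) major.minor runtime. A minimal stdlib-only sketch of that comparison (the version strings below are hypothetical examples):

```python
def torchscript_compatible(export_version: str, runtime_version: str) -> bool:
    """Return True if the runtime's major.minor is at least the export's.

    A TorchScript model exported with a newer PyTorch typically fails to
    load in an older runtime, which matches the error pattern in this issue.
    """
    def major_minor(version: str) -> tuple:
        # Strip local version tags like "+cu110", keep only major.minor.
        return tuple(int(part) for part in version.split("+")[0].split(".")[:2])

    return major_minor(runtime_version) >= major_minor(export_version)

# Exported with 1.7.0 but the Triton container ships an older runtime:
print(torchscript_compatible("1.7.0", "1.6.0"))        # False
print(torchscript_compatible("1.7.0", "1.7.1+cu110"))  # True
```

If the check fails, either re-export the model with the container's PyTorch version or move to a container whose support-matrix entry lists the required version.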
The Pytorch/Torchvision environments in my training / export environment are:
I am using the NGC container
Based on the framework support matrix, it appears that 20.09 supports PyTorch 1.7.0.
Triton Information
What version of Triton are you using?
Are you using the Triton container or did you build it yourself?
I am using the NGC container
To Reproduce
Steps to reproduce the behavior.
Export a TorchScript model using PyTorch 1.7.0 and adapt the Triton sample in the container nvcr.io/nvidia/deepstream:5.0.1-20.09-triton at /opt/nvidia/deepstream/deepstream-5.0/sources/python/apps/deepstream-ssd-parser.
Describe the models (framework, inputs, outputs), ideally include the model configuration file (if using an ensemble include the model configuration file for that as well).
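Since the model configuration file was requested above, here is a hedged sketch of what a config.pbtxt for a TorchScript model served by Triton might look like. The model name, input/output names, datatypes, and dims are hypothetical placeholders and must match the actual traced model:

```
name: "mymodel"
platform: "pytorch_libtorch"
max_batch_size: 8
input [
  {
    name: "INPUT__0"
    data_type: TYPE_FP32
    dims: [ 3, 300, 300 ]
  }
]
output [
  {
    name: "OUTPUT__0"
    data_type: TYPE_FP32
    dims: [ 100 ]
  }
]
```

Note that the `pytorch_libtorch` platform conventionally uses `INPUT__<index>` / `OUTPUT__<index>` tensor names, since TorchScript models do not carry named I/O tensors.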