Yet another export yolov5 models to ONNX and inference with TensorRT #1597
Hello @linghu8812, thank you for your interest in 🚀 YOLOv5! Please visit our ⭐️ Tutorials to get started, where you can find quickstart guides for simple tasks like Custom Data Training all the way to advanced concepts like Hyperparameter Evolution. If this is a 🐛 Bug Report, please provide screenshots and minimum viable code to reproduce your issue, otherwise we cannot help you. If this is a custom training ❓ Question, please provide as much information as possible, including dataset images, training logs, screenshots, and a public link to online W&B logging if available. For business inquiries or professional support requests please visit https://www.ultralytics.com or email Glenn Jocher at [email protected].

Requirements: Python 3.8 or later with all requirements.txt dependencies installed, including $ pip install -r requirements.txt.

Environments: YOLOv5 may be run in any of the following up-to-date verified environments (with all dependencies including CUDA/CUDNN, Python and PyTorch preinstalled):

Status: If this badge is green, all YOLOv5 GitHub Actions Continuous Integration (CI) tests are currently passing. CI tests verify correct operation of YOLOv5 training (train.py), testing (test.py), inference (detect.py) and export (export.py) on macOS, Windows, and Ubuntu every 24 hours and on every commit.
@linghu8812 very nice! Did you have to configure onnxsim especially to achieve those simplifications, or did it do them on its own?
@glenn-jocher hello, first of all, onnx-simplifier needs to be installed with:
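For context on what the simplifier buys you: the core rewrite onnx-simplifier performs is constant folding, collapsing subgraphs whose inputs are all statically known (the Shape/Gather/Concat chains PyTorch tends to emit) into single initializers. Below is a toy, stdlib-only sketch of that idea; the mini-graph format here is hypothetical and is not real ONNX.

```python
# Toy illustration of constant folding, the main rewrite onnx-simplifier
# applies. Nodes whose inputs are all constants are evaluated ahead of time
# and removed from the graph, shrinking the node count.
# This mini-graph format is hypothetical, not ONNX.

def fold_constants(nodes, constants):
    """Replace ops whose inputs are all constants with a precomputed value."""
    ops = {"Add": lambda a, b: a + b, "Mul": lambda a, b: a * b}
    simplified = []
    for name, op, inputs in nodes:
        if all(i in constants for i in inputs):
            # Every input is known: evaluate now and drop the node.
            constants[name] = ops[op](*(constants[i] for i in inputs))
        else:
            simplified.append((name, op, inputs))
    return simplified, constants

nodes = [
    ("shape_c", "Mul", ["two", "three"]),  # 2 * 3 -> foldable at export time
    ("scaled", "Mul", ["x", "shape_c"]),   # depends on runtime input x
]
consts = {"two": 2, "three": 3}
remaining, consts = fold_constants(nodes, consts)
print(len(remaining), consts["shape_c"])  # 1 6
```

The real tool does this by evaluating the graph with onnxruntime and replacing statically computable outputs; the sketch only mimics the node-count reduction visible in the before/after screenshots.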
I tried this export script, but I don't get the simplified layers like you showed. My layer info of the outputs is this:
@al03 did the model simplify at all? I'm interested to see if it works; I'll try myself.
@al03 @linghu8812 I get an error on import onnxsim; I'm not able to evaluate it. Will raise an issue on the onnxsim repo. EDIT: issue raised daquexian/onnx-simplifier#109
@al03 the
@glenn-jocher you may need to install the onnxruntime library first.
@austingg yes it seems so according to daquexian/onnx-simplifier#109 (comment). I was interested in using onnxsim as part of the default code, but I looked at the install instructions (https://www.onnxruntime.ai/docs/get-started/install.html) and the prerequisites and OS-specific instructions appear too burdensome to include as part of the default repo. I think it would cause users more confusion/problems than it would solve. If this is a common use case though (and it seems it may be) it might make sense to place these instructions within a Tutorial that we could add to https://docs.ultralytics.com/yolov5. That way expert users could still benefit.
@glenn-jocher I have updated onnx-simplifier to v0.2.26 so that it depends on onnxruntime-noopenmp instead of onnxruntime, per microsoft/onnxruntime#6511. I believe the instructions in https://www.onnxruntime.ai/docs/get-started/install.html are not actually needed if we don't depend on OpenMP, and onnx-simplifier will work like a charm without any additional steps. Could you please give onnx-simplifier a try? :D
@daquexian oh really? I actually gave up on the process before after seeing the complicated dependency requirements. So what exactly are the pip installs required now to use onnx-simplifier? It would be nice to integrate it into export.py if we can get simple dependencies and the installs all pass the CI checks on the 3 main OSes.
@daquexian the current CI checks do an ONNX export, BTW. The export tests are defined here: yolov5/.github/workflows/ci-testing.yml, line 79 (commit be9edff).
Failures in an export won't fail the CI, as they are in try-except clauses, but it provides nice realtime insight into whether they are working or not, since the tests run every day.
Just update https://github.com/ultralytics/yolov5/blob/master/.github/workflows/ci-testing.yml#L51 with:
@daquexian I was able to install in Colab, but not locally on macOS for some reason. Python 3.9 appears incompatible, so I tried with a 3.8.0 environment, but got this. Do you know what the issue might be?
@glenn-jocher Oh, it's my fault... onnxruntime-noopenmp doesn't have a macOS version. That's so strange. I'll update this issue when onnxruntime has better macOS support.
linghu8812/tensorrt_inference#42: it now supports inference of YOLOv5 4.0 models (yolov5s6.pt and so on) with TensorRT! For yolov5s6, just run with:
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
@linghu8812, I converted the yolov5s model to an ONNX model following this. I wish to know the output nodes before the 5D Reshape. How can I get these output nodes? Kindly help me to find them.
@PiyalGeorge I'm not sure exactly what you mean by the output nodes before the 5D Reshape, though the 5D reshape is in the Detect layer, so what you are looking for is probably there. (Lines 24 to 58 in 5f7d39f)
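For reference, the 5D reshape under discussion operates on shapes like the following. This is a sketch assuming the default COCO configuration (3 anchors per scale, 80 classes, 640x640 input); under that reading, the "nodes before the 5D Reshape" would be the three per-scale convolution outputs.

```python
# Shape bookkeeping for YOLOv5's Detect layer (a sketch; numbers assume the
# default COCO setup: 3 anchors per scale, 80 classes, 640x640 input).
na, nc = 3, 80        # anchors per scale, number of classes
no = nc + 5           # 85 = 4 box coords + 1 objectness + 80 class scores

def detect_shapes(bs, ny, nx):
    conv_out = (bs, na * no, ny, nx)   # raw conv output feeding the Reshape
    reshaped = (bs, na, no, ny, nx)    # the 5D Reshape
    permuted = (bs, na, ny, nx, no)    # after permute(0, 1, 3, 4, 2)
    return conv_out, reshaped, permuted

for stride in (8, 16, 32):             # the three detection scales
    print(stride, detect_shapes(1, 640 // stride, 640 // stride))
```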
On a side note, onnx-simplifier is now integrated with YOLOv5 export via PR #2815; you can access it like this in the latest code:
@glenn-jocher can the Detect layer still not be exported to ONNX?
@bertinma yes, the Detect() layer exports to ONNX with export.py.
Hi, may I ask how you get the last output layer? (1,25200,85) |
@glenn-jocher
@tsangz189 --grid forms the single output:
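For readers wondering where (1, 25200, 85) comes from: with --grid the three per-scale prediction grids are flattened and concatenated, so the count is just arithmetic over the strides. A sketch assuming a 640x640 input, 3 anchors per scale, and 80 classes:

```python
# Where (1, 25200, 85) comes from for a 640x640 input (sketch, assuming
# the default 3 anchors per scale and 80 COCO classes).
img, na, nc = 640, 3, 80
cells = sum((img // s) ** 2 for s in (8, 16, 32))  # 6400 + 1600 + 400 = 8400
preds = cells * na                                 # 8400 * 3 = 25200
print((1, preds, nc + 5))  # (1, 25200, 85)
```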
Hello everyone, here is a repo that can convert YOLOv5 models to ONNX and run inference with TensorRT. The code is here: https://github.com/linghu8812/tensorrt_inference/tree/master/project/yolov5. It supports all YOLOv5 models, including yolov5s, yolov5m, yolov5l and yolov5x. An onnxsim module has been imported to simplify the model structure. Before simplification, the YOLOv5 ONNX structure looked like this:

After simplification, the ONNX model becomes:

Some extra nodes have been removed.

In addition, TensorRT inference code has also been supplied; the inference result is shown below:
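As a footnote for readers decoding the raw (1, 25200, 85) tensor themselves: it still needs confidence filtering and non-maximum suppression before final boxes appear. Below is a minimal stdlib-only sketch of that postprocessing; it is not the repo's actual implementation, and the box format and thresholds are illustrative.

```python
# Minimal confidence filter + greedy NMS over (x1, y1, x2, y2, score) boxes.
# A sketch of the postprocessing a YOLOv5/TensorRT pipeline typically applies
# to the raw (1, 25200, 85) output; not this repo's actual implementation.

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2, score) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def nms(boxes, conf_thres=0.25, iou_thres=0.45):
    """Drop low-confidence boxes, then greedily keep highest-score survivors."""
    boxes = sorted((b for b in boxes if b[4] >= conf_thres),
                   key=lambda b: b[4], reverse=True)
    keep = []
    for b in boxes:
        if all(iou(b, k) < iou_thres for k in keep):
            keep.append(b)
    return keep

dets = [(0, 0, 10, 10, 0.9), (1, 1, 10, 10, 0.8),   # heavy overlap
        (50, 50, 60, 60, 0.7), (0, 0, 5, 5, 0.1)]   # distinct / low-conf
print(len(nms(dets)))  # 2
```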