what is the meaning of outputs? #1109
Hello @AliceSchaw, thank you for your interest in our work! Please visit our Custom Training Tutorial to get started, and see our Google Colab Notebook, Docker Image, and GCP Quickstart Guide for example environments. If this is a bug report, please provide screenshots and minimum viable code to reproduce your issue, otherwise we cannot help you.
@AliceSchaw classes and boxes. boxes are nx4
@glenn-jocher Could you elaborate on how to interpret these values? I find them confusing, as they are different from what the PyTorch version would output.
@lostspirit0 these are designed for input into our iOS iDetection app. You can export the raw model outputs as well; search the repo issues, there are many solutions already posted.
@glenn-jocher Could you give me a link? Thank you.
@glenn-jocher Thanks for your reply, but I still do not understand the outputs. My input is 448x448 and the strides are [32, 16, 8], so I guess the number of bounding boxes is (14 x 14 + 28 x 28 + 56 x 56) x 3 = 12348. But I want all the values, including the bbox x, y, width, height and score, so the expected size of the outputs should be (14 x 14 + 28 x 28 + 56 x 56) x 3 x (4 + 1 + 4 (class count)). Why is the size of the ONNX outputs 4 x 12348?
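The box count in the question can be checked with a quick calculation (a sketch using the values stated above: 448x448 input, strides [32, 16, 8], and the usual 3 anchors per grid cell):

```python
# Reproduce the 12348 figure from the question above.
# Assumed values: 448x448 input, strides [32, 16, 8], 3 anchors per cell.
img_size = 448
strides = [32, 16, 8]
num_anchors = 3

# One grid cell per stride-sized patch at each scale:
cells = sum((img_size // s) ** 2 for s in strides)  # 196 + 784 + 3136 = 4116
total = cells * num_anchors                         # 4116 * 3 = 12348
print(total)
```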
An onnx model may have multiple outputs; you might want to visualize it with netron.
@AliceSchaw ah perfect. Your outputs are right there. They're clearly labelled as 'boxes' and 'classes'. What seems to be the problem? |
Here is the official example: https://github.com/onnx/onnx-docker/blob/master/onnx-ecosystem/inference_demos/yoloV3_object_detection_onnxruntime_inference.ipynb The official example's onnx model is obviously different from ultralytics's one.
However, the model exported here has 2 outputs. Is there any code example of using ultralytics's exported onnx model to detect objects with ONNX Runtime?
@AliceSchaw Ah I see. This repo's onnx export integrates objectness and classification together, which is a common format used in mobile device detection, such as in our iOS iDetection app. I don't think anyone actually uses onnx to run models; it's mostly just an interchange format to send your model elsewhere, i.e. coreml, tflite, etc. If you want you can try this:
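A minimal post-processing sketch for the two outputs described in this thread (not the repo's official code): it assumes `boxes` has shape (n, 4) and `classes` has shape (n, num_classes) with objectness already folded into the per-class scores, as stated above. The function name and threshold are illustrative only.

```python
import numpy as np

# Hypothetical helper: keep only boxes whose best class score passes a threshold.
# Assumes boxes is (n, 4) and classes is (n, num_classes), with objectness
# already multiplied into the class scores (the format described in this thread).
def top_detections(boxes, classes, conf_thresh=0.3):
    scores = classes.max(axis=1)     # best class score per box
    labels = classes.argmax(axis=1)  # best class index per box
    keep = scores > conf_thresh
    return boxes[keep], labels[keep], scores[keep]

# Synthetic stand-in data: 12348 candidate boxes, 4 classes (as in the question).
boxes = np.random.rand(12348, 4).astype(np.float32)
classes = np.random.rand(12348, 4).astype(np.float32)
b, l, s = top_detections(boxes, classes)
```

In practice the real `boxes` and `classes` arrays would come from `onnxruntime.InferenceSession(...).run(...)` on the exported model; the actual output names can be checked with netron.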
Thank you for your help and patience. |
Hello. Does the expression nx4 mean n * 4? Thank you.
nx4 means shape = (n, 4) |
@leeyunhome it seems you have the wrong user; please refer to the following information:
|
I trained on my custom dataset and exported the onnx model. The final output names are 325 and 328, and the type of both is float32[12348,4]. What is the meaning of the outputs? My number of classes is 4.
onnx model:
cfg files:
prune_0.93_keep_0.01_16_shortcut_yolov3-spp3-4cls.txt