👋 Hello, this issue has been automatically marked as stale because it has not had recent activity. Please note it will be closed if no further activity occurs.
Feel free to inform us of any other issues you discover or feature requests that come to mind in the future. Pull Requests (PRs) are also always welcomed!
Thank you for your contributions to YOLOv5 🚀 and Vision AI ⭐!
Search before asking

Description

Currently, `export.py` has a `workspace` argument to set the TensorRT workspace size in GB (`yolov5/export.py`, line 220 in 596de6d).
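A minimal sketch of the proposed change, assuming the current code computes the workspace in bytes as `workspace * 1 << 30` (GB); the helper name and CLI wiring below are illustrative, not the actual export.py code:

```python
MiB = 1 << 20  # bytes per mebibyte


def workspace_size_bytes(workspace_mb: int) -> int:
    """Convert a --workspace value given in MB to bytes.

    Hypothetical helper: interpreting the argument in MB instead of GB
    only changes the shift from 1 << 30 to 1 << 20, but lets users match
    the card's real capacity (e.g. 3910 MiB on an "4 GB" RTX 3050).
    """
    return workspace_mb * MiB


# A user with an RTX 3050 could then request (almost) the full card,
# which is impossible with an integer GB argument:
rtx3050_workspace = workspace_size_bytes(3900)
```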
However, many GPUs have slightly less memory than their "GB" rating suggests. For example, an RTX 3050 is advertised with 4 GB but actually has only 3910 MiB of memory, and the same is true of many other cards. To let users select a workspace size that reflects this reality, the `workspace` argument should accept memory sizes in MB.

Use case
No response
Additional
No response
Are you willing to submit a PR?
Yes I'd like to help by submitting a PR!