Hi, the rules show that the min duration is 600 for all workloads (I was looking at datacenter), whereas it should be 60 for most of them: https://github.com/mlcommons/inference_policies/blob/master/inference_rules.adoc#3-scenarios
E.g., looking at https://github.com/mlcommons/inference_results_v2.0/search?q=60000 — mlperf.conf shows 600, user.conf overrides it to 60, the rules still show 600, and the official results page shows 60 as well.
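For context, the override mechanism being discussed looks roughly like this. Note that LoadGen config files express `min_duration` in milliseconds, so 600000 ms is the 600 s from the rules and 60000 ms is the 60 s found by the search. This is an illustrative sketch, not copied verbatim from the linked files; the workload/scenario key names are assumptions based on the v2.0 result tree:

```
# mlperf.conf (shared defaults shipped with LoadGen)
# 600000 ms = 600 s minimum run duration
*.*.min_duration = 600000

# user.conf (per-submission override, e.g. for bert in the Server scenario)
# 60000 ms = 60 s — this is what the search?q=60000 hits show
bert.Server.min_duration = 60000
```

LoadGen reads mlperf.conf first and then user.conf, so the later, more specific key wins; that is why the effective minimum duration in the results can differ from the default stated in the rules.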
@saareliad It should be 600. Could you point out one example file where it is 60?
The first thing that came up in the aforementioned search link:
https://github.com/mlcommons/inference_results_v2.0/blob/8fcd68065d54033cc0aa8e83d931907aacfb8c02/closed/GIGABYTE/measurements/GIGABYTE-G492-ID0_A100-SXM-80GBx8_TRT/bert-99/Server/user.conf
And there are a lot more like this, plus the official results website.