
min duration #239

Open
saareliad opened this issue Apr 10, 2022 · 2 comments

Comments

@saareliad

Hi, the rules state that the min duration is 600 for all workloads (I was looking at the datacenter category), while it should be 60 for most of them.
https://github.com/mlcommons/inference_policies/blob/master/inference_rules.adoc#3-scenarios

E.g., looking at
https://github.com/mlcommons/inference_results_v2.0/search?q=60000
mlperf.conf shows 600, user.conf overrides it to 60, yet the rules still show 600.
The official results page shows 60 as well.
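For context, the override described above happens through LoadGen's layered `key = value` config files, where a submission's user.conf takes precedence over the shipped mlperf.conf defaults. The exact lines below are a hypothetical sketch (the key pattern is `model.scenario.key`; `min_duration` in these files is expressed in milliseconds), not a quote from any specific submission file:

```
# mlperf.conf — shipped defaults (hypothetical excerpt, 600 s expressed in ms)
*.*.min_duration = 600000

# user.conf — per-submission override (hypothetical excerpt, 60 s in ms)
bert.Server.min_duration = 60000
```

When both files are loaded, the user.conf value wins, which is why measured runs can show 60 while the rules document still says 600.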

@nv-ananjappa
Contributor

@saareliad It should be 600. Could you point out one example file where it is 60?

@saareliad
Author

The first result from the aforementioned search link:

https://github.com/mlcommons/inference_results_v2.0/blob/8fcd68065d54033cc0aa8e83d931907aacfb8c02/closed/GIGABYTE/measurements/GIGABYTE-G492-ID0_A100-SXM-80GBx8_TRT/bert-99/Server/user.conf

And there are many more like this, plus the official results website.
