diff --git a/submission_rules.adoc b/submission_rules.adoc
index b485d40..aeb0e25 100644
--- a/submission_rules.adoc
+++ b/submission_rules.adoc
@@ -327,10 +327,9 @@ System names and implementation names may be arbitrary.
 
 Here is the list of mandatory files for all submissions in any division/category. However, your submission should still include all software information and related information for results replication.
-
 * mlperf_log_summary.txt
 * mlperf_log_detail.txt
-* Mlperf_log_accuracy.json [ TODO: handle differently in v0.6?]
+* mlperf_log_accuracy.json
 * user.conf
 * calibration or weight transformation related code if the original MLPerf models are not used
 * actual models if the models are not deterministically generated
@@ -340,6 +339,16 @@ Here is the list of mandatory files for all submissions in any division/category
 
 * .json
 * compliance_checker_log.txt
 
+For some models mlperf_log_accuracy.json can get very large, so we truncate mlperf_log_accuracy.json in submissions using a tool.
+A submitter runs the tool before submitting to MLPerf and ***keeps*** the original mlperf_log_accuracy.json files inside their organization.
+MLPerf may request the original files during submission review, so you need to store them.
+Run the tool as follows, assuming $LOCAL_SUBMISSION_TREE is your local submission tree and $GITHUB_SUBMISSION_REPO is the location of the GitHub submission repo:
+
+```
+# from top of the inference source tree
+python3 tools/submission/truncate_accuracy_log.py --input $LOCAL_SUBMISSION_TREE --output $GITHUB_SUBMISSION_REPO
+```
+
 ### .json metadata
 
@@ -471,7 +480,7 @@ This section in progress [TODO].
 
 #### Inference
 
-Refer to the documentation found under https://github.com/mlperf/inference/tree/master/v0.7/compliance/nvidia
+Refer to the documentation found under https://github.com/mlperf/inference/tree/master/compliance/nvidia
 
 ## Review
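
The truncation workflow added in the patch above can be illustrated with a minimal sketch. This is *not* the real `tools/submission/truncate_accuracy_log.py`; the 4096-byte limit, the SHA-256 digest, and the `truncate_log` helper are all assumptions chosen to show why the original files must be retained: once truncated, the full log can only be *verified* against the stored original, not reconstructed.

```python
import hashlib
import os
import tempfile

def truncate_log(path: str, keep_bytes: int = 4096) -> str:
    """Illustrative stand-in for a log-truncation tool (hypothetical,
    not the actual MLPerf script): keep only the first `keep_bytes`
    bytes of the file and return a digest of the full original, so a
    retained copy can be checked against it during review."""
    with open(path, "rb") as f:
        data = f.read()
    digest = hashlib.sha256(data).hexdigest()
    with open(path, "wb") as f:
        f.write(data[:keep_bytes])  # file now holds only the prefix
    return digest

# Demo on a throwaway file standing in for a large mlperf_log_accuracy.json.
demo = tempfile.NamedTemporaryFile(delete=False, suffix=".json")
demo.write(b"x" * 100_000)
demo.close()
original_digest = truncate_log(demo.name)
truncated_size = os.path.getsize(demo.name)
os.unlink(demo.name)
```

After the call, `truncated_size` is 4096 while `original_digest` still identifies the full 100,000-byte original, which is the property that makes keeping the untruncated files inside the submitting organization mandatory.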