Inference Submission: accuracy.txt/.json
The rules call for "accuracy/mlperf_log_accuracy.json", while the submission checker script looks for "accuracy/accuracy.txt". Which is correct?
Yes, we check for accuracy/mlperf_log_accuracy.json as well, and we expect the output of the accuracy script in accuracy/accuracy.txt. If accuracy.txt is missing, the checker issues a message asking the submitter to run the accuracy scripts to create it. The submitter would need to run those scripts anyway to know the accuracy. We do not run the scripts inside the checker, to avoid requiring the datasets to be passed to the checker.
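A minimal sketch of that checking logic, assuming a results directory layout with an accuracy/ subdirectory (the function name and message wording are illustrative, not the checker's actual code):

```python
from pathlib import Path

def check_accuracy_dir(results_dir):
    """Verify both accuracy files exist in <results_dir>/accuracy.

    Returns a list of checker messages; empty means the files are present.
    """
    acc = Path(results_dir) / "accuracy"
    messages = []
    if not (acc / "mlperf_log_accuracy.json").exists():
        messages.append("missing accuracy/mlperf_log_accuracy.json")
    if not (acc / "accuracy.txt").exists():
        # The checker cannot generate accuracy.txt itself (it has no
        # dataset access), so it can only prompt the submitter.
        messages.append("run the accuracy scripts to create accuracy/accuracy.txt")
    return messages
```

This mirrors the division of labor described above: loadgen writes the raw JSON log, the accuracy scripts (run separately, with dataset access) produce accuracy.txt, and the checker only verifies that both exist.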
Currently, the checker requires mlperf_log_accuracy.json for performance runs as well. Is this intended?
loadgen creates one; it is currently empty. But since loadgen reserves the right to randomly sample results even in performance runs, I think it should be included.
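So for performance runs the checker only needs to confirm the file exists, while an empty file is acceptable. A hedged sketch of such a check (the function name is illustrative):

```python
from pathlib import Path

def check_perf_accuracy_log(perf_dir):
    """In performance runs loadgen still writes mlperf_log_accuracy.json,
    usually empty, since it may sample results even in perf mode.
    Only the file's presence is required; size zero is fine.
    """
    return (Path(perf_dir) / "mlperf_log_accuracy.json").exists()
```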