
Submission checker error with 3d-unet

Open rnaidu02 opened this issue 2 years ago • 3 comments

I got the following error when running the submission checker:

[2023-02-08 05:37:40,235 submission_checker.py:1645 ERROR] closed/Intel/results/1-node-2S-SPR-PyTorch-INT8/3d-unet-99.9/Offline/performance/run_1/mlperf_log_detail.txt performance_sample_count, found 0, needs to be >= 43
[2023-02-08 05:37:40,235 submission_checker.py:1696 INFO] Target latency: None, Latency: 14079230289064, Scenario: Offline
[2023-02-08 05:37:40,235 submission_checker.py:2336 ERROR] closed/Intel/results/1-node-2S-SPR-PyTorch-INT8/3d-unet-99.9/Offline/performance/run_1 has issues

Meanwhile, in mlperf.conf https://github.com/mlcommons/inference/blob/c4a19872d9e7ba2fe2d5b8a4c3d3c02e82233785/mlperf.conf#L14, performance_sample_count_override is 0. What is the correct performance_sample_count_override for 3d-unet in 3.0?

# set to 0 to let entire sample set to be performance sample
3d-unet.*.performance_sample_count_override = 0

rnaidu02 avatar Feb 10 '23 03:02 rnaidu02

This error seems very odd: when performance_sample_count_override is set to 0, performance_sample_count should default to the size of the dataset. https://github.com/mlcommons/inference/blob/c4a19872d9e7ba2fe2d5b8a4c3d3c02e82233785/loadgen/test_settings_internal.cc#L117-L119

For 3d-unet this should be 43. You could try setting the value to 43 and see if that resolves the issue, though it should also work with 0.

pgmpablo157321 avatar Feb 10 '23 16:02 pgmpablo157321

Any updates on this issue?

najeeb5 avatar Feb 23 '23 13:02 najeeb5

@rnaidu02 Can this be closed?

nv-ananjappa avatar Apr 18 '23 22:04 nv-ananjappa