Pablo Gonzalez
I made two more runs, and these are the results I got. First, I ran the object detection benchmark with Inference 3.0 annotations and 2.1 code, and I got: ```...
@arjunsuresh I think you can do that, but also keep in mind that the dimensions of the image `1366cde3b480a15c.jpg` were swapped as well. So that might also affect the results
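Since a swapped width/height silently corrupts any annotation that is normalized by image dimensions, a quick sanity check helps here. A minimal sketch (the `dims_match` helper is hypothetical, assuming Pillow is available):

```python
from PIL import Image

def dims_match(image_path: str, ann_width: int, ann_height: int) -> bool:
    """Return True if the annotation's dimensions agree with the image file."""
    with Image.open(image_path) as img:
        width, height = img.size  # Pillow reports (width, height)
    if (width, height) == (ann_width, ann_height):
        return True
    if (width, height) == (ann_height, ann_width):
        print(f"{image_path}: annotation width/height appear swapped")
    return False
```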
The changes were already merged in other PRs. They can be found here: https://github.com/mlcommons/inference/blob/c4a19872d9e7ba2fe2d5b8a4c3d3c02e82233785/tools/submission/submission_checker.py#L2035 and here (simplified version): https://github.com/mlcommons/inference/blob/c4a19872d9e7ba2fe2d5b8a4c3d3c02e82233785/tools/submission/generate_final_report.py#L52-L53
@maanug-nv I think for now the default value of the flag should be to not create such files. That way it won't mess up any submission by accident; we can call...
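A minimal sketch of what that opt-in default could look like (the flag name is hypothetical, not the actual option):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument(
    "--create-extra-files",   # hypothetical name, not the actual flag
    action="store_true",      # absent -> False: nothing is written by accident
    help="Opt in to generating the extra files (disabled by default).",
)
args = parser.parse_args()

if args.create_extra_files:
    ...  # only generate the files when explicitly requested
```

With `action="store_true"`, omitting the flag leaves it `False`, so a submission run never creates the files unless explicitly asked to.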
I already have this version, but the error persists:
```
Name: fbgemm-gpu-cpu
Version: 0.3.2
Summary:
Home-page: https://github.com/pytorch/fbgemm
Author: FBGEMM Team
Author-email: [email protected]
License: BSD-3
Location: /opt/conda/lib/python3.7/site-packages
Requires:
Required-by:
```
@yuankuns When I try to remove `fbgemm-gpu`, I get the following import error: ``` ModuleNotFoundError: No module named 'fbgemm_gpu' ``` I managed to run the CPU version with `fbgemm-gpu-cpu==0.3.2 fbgemm-gpu==0.4.1 pytorch==1.13.1`...
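A minimal sketch for double-checking that an environment matches that working combination (assuming Python ≥ 3.8 for `importlib.metadata`; note the PyTorch distribution is named `torch` on PyPI):

```python
from importlib.metadata import PackageNotFoundError, version

# Version combination reported to work in the comment above.
EXPECTED = {"fbgemm-gpu-cpu": "0.3.2", "fbgemm-gpu": "0.4.1", "torch": "1.13.1"}

for package, wanted in EXPECTED.items():
    try:
        found = version(package)
    except PackageNotFoundError:
        found = None
    status = "OK" if found == wanted else f"mismatch (found {found})"
    print(f"{package}=={wanted}: {status}")
```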
@yeandy Can you resolve the conflicts in order to merge this PR?
@nv-ananjappa @nvzhihanj @arjunsuresh @mrmhodak Proposal: Main modules:
- Package checker: check that all the required files are there, check for forbidden files
- Log checkers:
  - Performance checker: check the...
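A minimal skeleton of how those modules could fit together (all class and file names here are hypothetical, not the actual `submission_checker.py` API):

```python
from pathlib import Path

class PackageChecker:
    """Check that all required files are present and no forbidden files exist."""
    REQUIRED = ["mlperf_log_summary.txt", "mlperf_log_detail.txt"]  # illustrative
    FORBIDDEN_SUFFIXES = {".pyc", ".tmp"}                           # illustrative

    def run(self, submission_dir: Path) -> list[str]:
        errors = []
        for name in self.REQUIRED:
            if not (submission_dir / name).exists():
                errors.append(f"missing required file: {name}")
        for path in submission_dir.rglob("*"):
            if path.suffix in self.FORBIDDEN_SUFFIXES:
                errors.append(f"forbidden file: {path}")
        return errors

class PerformanceChecker:
    """Check the performance logs (parsing omitted in this sketch)."""
    def run(self, submission_dir: Path) -> list[str]:
        return []  # would parse mlperf_log_summary.txt and validate the run

def check_submission(submission_dir: Path) -> list[str]:
    """Run every checker and collect the errors it reports."""
    errors: list[str] = []
    for checker in (PackageChecker(), PerformanceChecker()):
        errors.extend(checker.run(submission_dir))
    return errors
```

Each checker stays independent, so new log checkers can be added without touching the package checks.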
@arjunsuresh I tested it in my fork of the repo; the workflow seems to run, but I don't see the format changes pushed: https://github.com/pgmpablo157321/inference/pull/5
Tested again and now it works fine in my fork