How to run text-to-image with multi GPUs

Open surbanqq opened this issue 1 year ago • 2 comments

Hello everyone, I am a newcomer to MLPerf. I would like to know whether the text-to-image benchmark in the inference suite supports multi-GPU testing. Currently, I don't see any parameter for multiple GPUs in the output of `python main.py --help`. However, the MLPerf Inference results include entries for 2x L40S and 8x H100 systems. How were those tests run? Thanks a lot.

surbanqq avatar Sep 14 '24 08:09 surbanqq

Hi @surbanqq! The reference code often supports only a single accelerator. For their submissions, however, vendors optimize the workloads, including scaling them to multiple accelerators. In the case of NVIDIA, please take a look at their v4.1 submission.
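As a rough illustration (not part of the MLPerf reference code), one common data-parallel workaround is to launch one reference-implementation process per GPU, pinning each process to a single device with `CUDA_VISIBLE_DEVICES`. The `echo` below is a stand-in for the actual command (e.g. `python main.py ...` with whatever flags `--help` shows); note this sketch is not a valid MLPerf submission setup, just a way to drive all GPUs:

```shell
#!/bin/sh
# Sketch: one process per GPU, each seeing only its own device.
# Replace the echo with the real benchmark command for your setup.
for gpu in 0 1; do
  CUDA_VISIBLE_DEVICES=$gpu echo "would run main.py on GPU $gpu" &
done
wait  # block until all per-GPU processes finish
```

Splitting the workload and merging per-GPU results is then up to you; official multi-accelerator results come from vendor-optimized submission code, not this pattern.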

psyhtest avatar Sep 17 '24 16:09 psyhtest

thanks a lot

surbanqq avatar Sep 18 '24 01:09 surbanqq