
How much GPU memory is needed?

focus1024-wind opened this issue 2 years ago • 3 comments

When I train on my host, it raises an out-of-memory error. My host is configured with 4 x 24GB NVIDIA 3090 GPUs. I see your file says you used 8 A100 GPUs to train the task. But there are two configurations of the A100, so I want to know the concrete configuration of your GPUs, or how much memory I need. Looking forward to your help, thanks.

focus1024-wind avatar Jan 15 '23 10:01 focus1024-wind
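
For reference, here is a quick way to check how much memory each visible GPU actually exposes before launching a run (a minimal sketch, assuming the training environment uses PyTorch; this script is not part of the repo):

```python
# Print the name and total memory of every visible CUDA device,
# so it can be compared against the training memory requirement.
import torch

for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    # total_memory is reported in bytes; convert to GiB for readability
    print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.1f} GiB")
```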

I also used a 3090, but it still runs out of memory.

jqtangust avatar Apr 07 '23 11:04 jqtangust

I used 8 x 3090, but it also runs out of memory.

Shengqi77 avatar May 09 '23 01:05 Shengqi77

It needs about 32 GB.

yyhtbs-yye avatar Apr 08 '24 11:04 yyhtbs-yye
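
If your cards have less memory than that, the usual options are lowering the batch or patch size in the training config, or enabling mixed precision. Below is a minimal, self-contained sketch of the automatic mixed precision (AMP) pattern; the tiny model and random data are placeholders, not VRT's actual training loop:

```python
# Generic AMP training loop: forward pass runs in reduced precision where
# safe, which typically cuts per-GPU activation memory noticeably.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Sequential(nn.Linear(256, 256), nn.ReLU(), nn.Linear(256, 1)).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

for _ in range(10):
    x = torch.randn(8, 256, device=device)   # placeholder input batch
    y = torch.randn(8, 1, device=device)     # placeholder target batch
    optimizer.zero_grad()
    with torch.cuda.amp.autocast(enabled=(device == "cuda")):
        loss = nn.functional.mse_loss(model(x), y)
    scaler.scale(loss).backward()   # scale loss to avoid fp16 gradient underflow
    scaler.step(optimizer)          # unscale gradients, then take the optimizer step
    scaler.update()                 # adjust the scale factor for the next iteration
```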