VRT
How much GPU memory is needed?
When I train on my host, it raises an out-of-memory error. My host is configured with 4 x 24GB NVIDIA 3090 GPUs. I see in your file that you use 8 A100 GPUs to train the task. But there are two configurations of the A100, so I want to know the concrete configuration of your GPUs, or how much memory I need. Looking forward to your help, thanks.
I also used a 3090, but it still runs out of memory.
I used 8 x 3090, but it also runs out of memory.
It needs about 32GB.
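A minimal sketch, assuming a PyTorch environment (as VRT uses) and that the 32GB figure refers to per-GPU memory: it prints the total memory of each visible GPU so you can check whether your devices meet that requirement before launching training.

```python
import torch

# Print name and total memory of every visible CUDA device.
# A 24GB 3090 will show ~24 GB here, below the ~32 GB mentioned above.
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    total_gb = props.total_memory / 1024**3
    print(f"GPU {i}: {props.name}, {total_gb:.1f} GB")
```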