OOTDiffusion
torch.cuda out of memory
Thanks to Dr. Xu for replying to my previous question. I can now run inference with your brilliant model locally, but I have another question:
My GPU is a 3060 Laptop. Inference takes about 25 minutes and sometimes fails with 'CUDA out of memory'. However, I have seen someone on Bilibili release an installation package with a .bat file that runs inference directly. I tried that .bat and it finished in about half a minute with a great result, which really puzzles me.
How can I solve this, or does the model simply need a more capable GPU for inference?
Hi. It takes around 3 seconds and 6 GB of memory for 1 sample and 20 steps on our RTX 4090 GPU. A 6 GB 3060 may not be quite enough.
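If VRAM is the bottleneck, the standard diffusers memory-saving switches may help at the cost of speed. Below is a minimal sketch using a generic StableDiffusionPipeline as a stand-in; OOTDiffusion's actual pipeline class, checkpoint paths, and inputs differ, so the names here are placeholders for illustration only.

```python
# Minimal sketch of common diffusers memory-saving options (placeholder pipeline/checkpoint).
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # placeholder checkpoint, not the OOTDiffusion weights
    torch_dtype=torch.float16,         # half precision roughly halves weight memory
)

# Offload submodules to CPU and move each to the GPU only while it runs
# (requires the `accelerate` package); slower per step, but much lower peak VRAM.
pipe.enable_model_cpu_offload()

# Compute attention in slices instead of one large matmul to reduce activation memory.
pipe.enable_attention_slicing()

# Decode latents through the VAE in slices as well.
pipe.enable_vae_slicing()

image = pipe("a person wearing a denim jacket", num_inference_steps=20).images[0]
image.save("sample.png")
```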
I use a 3060 Ti 16G and have the same problem.