
OutOfMemoryError - but still some free space ?

Open MoonMoon82 opened this issue 2 months ago • 6 comments

I'm using SDXL + openposeXL + IPAdapter and some LoRAs, which of course take a lot of VRAM. My workflows often try to allocate more VRAM than my RTX 3090 (24GB) actually has.

But I'm curious, because sometimes messages like these appear:

torch.cuda.OutOfMemoryError: Allocation on device 0 would exceed allowed memory. (out of memory)
Currently allocated     : 21.55 GiB
Requested               : 952.00 MiB
Device limit            : 24.00 GiB
Free (according to CUDA): 0 bytes
PyTorch limit (set by user-supplied memory fraction)
                        : 17179869184.00 GiB

Prompt executed in 188.48 seconds

What caught my attention is that "Currently allocated" plus "Requested" is still well below the 24GB "Device limit". (I'm already running in lowvram mode.)
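The arithmetic behind that observation can be checked directly (values copied from the traceback above; the gap is whatever the error message itself doesn't account for, e.g. memory reserved by PyTorch's caching allocator or held by other processes):

```python
# Values from the OOM message above.
allocated_gib = 21.55          # "Currently allocated"
requested_gib = 952.00 / 1024  # "Requested": 952 MiB in GiB
limit_gib = 24.00              # "Device limit"

# VRAM the error message leaves unaccounted for.
gap_gib = limit_gib - (allocated_gib + requested_gib)
print(f"unaccounted VRAM: {gap_gib:.2f} GiB")  # → unaccounted VRAM: 1.52 GiB
```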

Any ideas?

Thanks in advance and kind regards!

MoonMoon82 avatar Apr 17 '24 15:04 MoonMoon82

Your browser and other software you have open might be taking ~2GB vram.

comfyanonymous avatar Apr 17 '24 21:04 comfyanonymous

I can confirm that since the latest IPAdapter, ComfyUI seems to have memory issues; I can't process my old, bigger workflows anymore. It runs out of memory quickly, and I'm also on 24GB of VRAM (RTX 4090). I found that IPAdapter Tile is especially affected, but it could also be related to a ComfyUI update. In March it all worked perfectly, but after that it's really bad.

donQx avatar Apr 22 '24 08:04 donQx

@comfyanonymous Yeah... no... I don't think this can be the (main) reason. For example, I just had Firefox open with only a ComfyUI tab and a GitHub tab, and I got this message:

torch.cuda.OutOfMemoryError: Allocation on device 0 would exceed allowed memory. (out of memory)
Currently allocated     : 18.37 GiB
Requested               : 960.00 MiB
Device limit            : 24.00 GiB
Free (according to CUDA): 0 bytes
PyTorch limit (set by user-supplied memory fraction)
                        : 17179869184.00 GiB

Prompt executed in 10.26 seconds

(I also had no other applications running that could allocate the ~5.5GB of VRAM.)

So I had a look at the Task Manager's GPU resources while ComfyUI was still running, right after the error message (screenshot omitted): in total there was a difference of ~1GB from the "Currently allocated" value in the ComfyUI error message. After I closed ComfyUI, the Task Manager showed ~1GB still in use (screenshot omitted). So I assume you're partially right: that leftover 1GB could be allocated by the browser.

But even after subtracting that ~1GB of browser VRAM from the missing ~5.5GB, I'm still missing ~4.5GB of VRAM.

@donQx Yes, I also noticed some time ago that my workflows suddenly take more VRAM than before, and sometimes when I re-run the same workflow (with a different seed or other input images) it allocates more VRAM than in the first run. Something's not right here... idk.

MoonMoon82 avatar Apr 22 '24 13:04 MoonMoon82

And what really surprises me is that when I run into this OutOfMemoryError, I can just restart ComfyUI and then the exact same workflow runs fine (at least on the first run).
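That "restart fixes it" symptom is consistent with fragmentation in PyTorch's caching allocator: across runs the cache accumulates blocks that can't be reused for a large contiguous request, and a restart clears them. One thing worth trying (a sketch of a mitigation, not a confirmed fix for this issue) is the allocator tuning knob PyTorch exposes via an environment variable:

```shell
# Cap the size of cached blocks the allocator is allowed to split,
# which can reduce fragmentation between runs. 512 MB is just an
# example value; tune it for your workload.
export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:512
python main.py
```

This is a config fragment; `main.py` stands in for however you launch ComfyUI.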

MoonMoon82 avatar Apr 22 '24 14:04 MoonMoon82

I am having the OOM issue as well. Previously I had no problems running DynamiCrafter on my 8GB VRAM GPU. Today I tried running it and it kept saying OOM.

Of course, it might be due to the ComfyUI update I did today. In my case there is also no IPAdapter in use.

edwinzeng2005 avatar Apr 25 '24 03:04 edwinzeng2005

I found out that a fresh installation of ComfyUI now uses torch 2.3.0, which has no compatible xformers version, so the log complained that xformers wasn't installed. So I downgraded torch to 2.2.2 and torchvision to 0.17.2, and installed xformers 0.0.25.post1.

Now it works again without getting OOM.
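For anyone else trying the downgrade described above, the pinned install would look something like this (a sketch: the exact command depends on your Python environment and CUDA build, and you may need the matching `--index-url` for CUDA wheels):

```shell
# Pin the versions mentioned in the comment above.
pip install torch==2.2.2 torchvision==0.17.2 xformers==0.0.25.post1
```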

edwinzeng2005 avatar Apr 25 '24 05:04 edwinzeng2005