
Is 24G VRAM not enough? torch.OutOfMemoryError: CUDA out of memory

Open fahadshery opened this issue 4 months ago • 4 comments

Hi,

I have an NVIDIA Tesla P40 with 24 GB of VRAM, and I am getting the following error:

```
torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 144.00 MiB.
GPU 0 has a total capacity of 23.87 GiB of which 138.62 MiB is free.
Process 13728 has 23.73 GiB memory in use. Of the allocated memory,
23.55 GiB is allocated by PyTorch, and 18.42 MiB is reserved by PyTorch
but unallocated. If reserved but unallocated memory is large try setting
PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation.
See documentation for Memory Management
(https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
```

Any solutions?
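
For what it's worth, the error message itself suggests one thing to try first: enabling expandable segments in PyTorch's CUDA caching allocator to reduce fragmentation. A minimal sketch of what that looks like (not a PuLID-specific fix, just the allocator setting from the message; the variable must be set before CUDA is initialized):

```python
import os

# Per the hint in the error message: must be set before torch touches CUDA,
# so set it before importing torch (or export it in the shell instead:
#   export PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True
# ).
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "expandable_segments:True"

import torch

# Optional diagnostic: compare memory PyTorch has actually allocated with
# memory it has reserved, to gauge whether fragmentation is the culprit.
print(f"{torch.cuda.memory_allocated() / 1024**3:.2f} GiB allocated")
print(f"{torch.cuda.memory_reserved() / 1024**3:.2f} GiB reserved")
```

In this case the message reports only 18.42 MiB reserved-but-unallocated, so fragmentation may not be the main issue; the model may simply need more than 24 GB at the settings being used.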

fahadshery · Oct 07 '24 13:10