
CUDA out of memory

Open · Postconceptlab opened this issue 2 years ago · 1 comment

I have a 6 GB VRAM card.

How do I fix this issue?

RuntimeError: CUDA out of memory. Tried to allocate 1024.00 MiB (GPU 0; 5.81 GiB total capacity; 3.14 GiB already allocated; 780.44 MiB free; 3.17 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
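As the error text itself suggests, one knob worth trying is the allocator's max_split_size_mb setting, passed through the PYTORCH_CUDA_ALLOC_CONF environment variable before the backend does any CUDA work. A minimal sketch; the 128 MiB value is an illustrative assumption to tune, not something specific to koi:

```python
import os

# Cap how large the caching allocator's split blocks can get; this can reduce
# fragmentation when VRAM is nearly full. 128 MiB is an illustrative value.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

import torch  # set the variable before torch touches the GPU

print(torch.cuda.is_available())
```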

Postconceptlab · Sep 12 '22 09:09

I was trying to create a 256x256 image.

Postconceptlab · Sep 12 '22 10:09

Please ensure you are using the fp16 model pipeline in the backend server. If that still doesn't work, you may need to close any additional programs that are taking up GPU memory.
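For reference, loading the backend pipeline in half precision with the Hugging Face diffusers library looks roughly like the sketch below; the model ID and the exact diffusers usage are assumptions about how the server is set up, not code taken from koi itself:

```python
import torch
from diffusers import StableDiffusionPipeline

# Load the weights in half precision (fp16) so the model leaves more headroom
# on a ~6 GB card. The model ID is a placeholder for whatever the backend loads.
pipe = StableDiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")
```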

You are right at the edge of what this model needs hardware-wise, so you will need to take every optimization you can get.
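One concrete optimization along these lines is attention slicing, which computes attention in chunks to lower peak memory during generation at a small speed cost. A sketch, assuming the `pipe` object from the snippet above and a reasonably recent diffusers release; the prompt is just an example:

```python
# `pipe` is the half-precision StableDiffusionPipeline loaded earlier.
# Slicing the attention computation trades some speed for a lower VRAM peak.
pipe.enable_attention_slicing()

# Generate at 256x256, matching the request in this thread.
image = pipe("a koi pond at sunset", height=256, width=256).images[0]
image.save("koi.png")
```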

If you still run into issues, I suggest using Google Colab as your backend; this should give you some more breathing room!

nousr · Sep 14 '22 17:09