sd-webui-controlnet
OOM error does not release GPU VRAM
If an API call hits a CUDA OOM error, a lot of memory is left allocated (visible in nvidia-smi)
and does not seem to be released without restarting A1111.
Trying a lower model cache size may help.
Sorry, I should have given more details. I only ever have one model loaded. RTX 3090 on Linux.
I think this fixed it: https://github.com/Mikubill/sd-webui-controlnet/issues/347#issuecomment-1444647723 - Closing unless I can reproduce it.
Reopening, actually: it still does not release any memory after an OOM error. I could use a hand here; please let me know if I can provide additional details!
Yes, the video memory is not released at all. Even with the model cache size set to 0, it is not freed.
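For anyone wanting to experiment: a common PyTorch workaround is to catch the OOM, drop any Python references, and call `torch.cuda.empty_cache()` so cached blocks are returned to the driver. This is only a hedged sketch, not the extension's actual code: the wrapper name `generate_with_cleanup` is made up for illustration, and note that `empty_cache()` cannot free tensors that are still referenced somewhere (which may be exactly the leak described above).

```python
import gc


def generate_with_cleanup(generate_fn, *args, **kwargs):
    """Run a generation call; on CUDA OOM, try to free cached VRAM before re-raising.

    generate_fn is a hypothetical stand-in for whatever API call is OOM-ing.
    """
    try:
        return generate_fn(*args, **kwargs)
    except RuntimeError as e:
        if "out of memory" in str(e).lower():
            # Drop unreachable Python objects first so their tensors become cache,
            # then ask PyTorch to hand cached blocks back to the CUDA driver.
            gc.collect()
            try:
                import torch
                if torch.cuda.is_available():
                    torch.cuda.empty_cache()
            except ImportError:
                pass  # no torch in this environment; nothing GPU-side to free
        raise
```

This won't help if the webui itself keeps a live reference to the model or intermediate tensors; in that case only restarting the process (or fixing the reference leak) releases the VRAM.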