stable-diffusion-webui
[Feature Request]: Add control over which models are loaded into GPU VRAM
Is there an existing issue for this?
- [X] I have searched the existing issues and checked the recent builds/commits
What would your feature do ?
When using CLIP, it loads the whole model into GPU VRAM and holds it there... I have 6 GB of GPU VRAM and cannot do anything after it has generated a prompt for me. Loading CLIP into memory and unloading it every time I use it would be slow, but at least it would work.
After using CLIP, doing anything throws CUDA out of memory.. =(
Solution: add support for unloading models from memory.
Proposed workflow
Something like a drop-down list next to the models list that shows all models currently stored in GPU VRAM, with the option to remove some of them. This would give me full control over what occupies my GPU memory.
If you use lazy initialization for accessing models, unloading a model from VRAM is straightforward: unload it, reset its lazy initializer, and you're done.
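The lazy-initialization idea above could be sketched roughly like this (all names here are hypothetical, not the actual webui code): each model is built on first access, and a registry tracks what is resident so a UI dropdown could list and unload entries.

```python
class LazyModel:
    """Holds a loader callable; the model is only built on first access."""

    def __init__(self, loader):
        self._loader = loader  # callable that loads the model into memory
        self._model = None     # nothing resident until first use

    def get(self):
        if self._model is None:
            self._model = self._loader()  # load on demand
        return self._model

    def unload(self):
        # Drop the reference so the next get() reloads it. In a real
        # implementation you would also move weights off the GPU first
        # (e.g. model.to("cpu")) and call torch.cuda.empty_cache().
        self._model = None

    @property
    def loaded(self):
        return self._model is not None


class ModelRegistry:
    """Tracks all lazy models so a UI could list and unload resident ones."""

    def __init__(self):
        self._models = {}

    def register(self, name, loader):
        self._models[name] = LazyModel(loader)

    def get(self, name):
        return self._models[name].get()

    def loaded_models(self):
        # What the proposed dropdown would display.
        return [n for n, m in self._models.items() if m.loaded]

    def unload(self, name):
        self._models[name].unload()
```

Usage would then be: `registry.get("clip")` loads CLIP on demand, `registry.loaded_models()` feeds the dropdown, and `registry.unload("clip")` frees it again.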
Additional information
No response
By the way, the same problem exists with system RAM (not GPU). When I use multiple diffusers it loads all of them into system RAM (thankfully it does not hold them all in GPU VRAM this way). So with my 12 GB of RAM I can only store 2-3 of them, thanks to swap. It would make sense to add such a control for system RAM as well.
Settings/Interrogate Options/ Uncheck this one.