Dreambooth-Stable-Diffusion
Does this support the RTX 3090?
From the README: "You can now run this on a GPU with 24GB of VRAM (e.g. 3090)."
But when I train on my local 3090, I get an error about exceeding the available VRAM.
24GB of VRAM should be enough. Most likely other processes are already holding some of the GPU's VRAM. Try killing everything else running on the machine. Run
ps aux
to list all processes, then kill -9 <process id>
for any process you don't need to be running. This will help you free up VRAM.
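The workflow above can be sketched as a couple of shell commands. nvidia-smi (which ships with the NVIDIA driver) is the most direct way to see which PIDs hold VRAM; the grep pattern below is just an example for the process names mentioned later in this thread:

```shell
# Show per-process VRAM usage (requires NVIDIA drivers):
# nvidia-smi

# Or search the full process list for likely culprits
# ('webui' and 'relauncher' are the names seen on runpod):
ps aux | grep -E 'webui|relauncher' | grep -v grep || true

# Then free the VRAM, substituting the PID from the output above:
# kill -9 <process id>
```

After killing the processes, re-run nvidia-smi to confirm the VRAM is actually released before starting training.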
If you're using runpod.io, check for processes labeled webUI or relauncher.
On runpod you can avoid this entirely by not using their Stable Diffusion template and just using the plain PyTorch template instead.
For those on runpod: either do what's suggested above, or connect to a web terminal and kill these two processes:
sh -c python webui.py --port 3000 --ckpt /workspace/stable-diffusion-webui/v1-5-pruned-emaonl...
python webui.py --port 3000 --ckpt /workspace/stable-diffusion-webui/v1-5-pruned-emaonly.ckpt...
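Rather than copying PIDs by hand, both processes can be matched by their command line. This is a hedged sketch: pkill -f matches against the full command string, and the 'webui.py' pattern assumes the process names shown above:

```shell
# Kill both runpod webui processes by matching their command line
# ('|| true' so the command succeeds even if nothing is running):
pkill -9 -f 'webui.py' || true

# Confirm the VRAM has been released:
# nvidia-smi
```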