fast-stable-diffusion
DreamBooth and DeepSpeed
There was a Reddit thread claiming to run DreamBooth under 8GB of VRAM using DeepSpeed, and the author also shared their code. I don't see anyone in the replies reporting success, and I also failed to make it work. Maybe you or someone in the community can figure it out so we can lower the VRAM requirements further. Here is my test notebook for Paperspace if you want to check it out.
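For anyone digging into this: the way DeepSpeed cuts VRAM is by using ZeRO to offload optimizer state from the GPU to CPU RAM, which is exactly why the system RAM requirement balloons to 25GB+. A minimal sketch of the kind of `ds_config.json` involved (values are illustrative, not the exact config from the Reddit post):

```json
{
  "train_micro_batch_size_per_gpu": 1,
  "gradient_accumulation_steps": 1,
  "fp16": {
    "enabled": true
  },
  "zero_optimization": {
    "stage": 2,
    "offload_optimizer": {
      "device": "cpu"
    },
    "overlap_comm": true,
    "contiguous_gradients": true
  }
}
```

The `offload_optimizer` block is the important part: the fp32 optimizer states (the largest memory consumer with Adam) live in host RAM instead of VRAM, at the cost of PCIe traffic and a lot of CPU memory.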
I saw that; it requires at least 25GB of RAM, which is inconvenient for Google Colab since it offers 16GB GPUs but only 12GB of RAM.
Yes, it's not suitable for Colab, but Paperspace offers 8GB of VRAM and 30GB of RAM on its lowest free-tier machines, which makes it almost a perfect candidate for this. Their running times are also less restricted. This might be a good way to relieve some pressure from Colab.
DataCrunch offers 80GB GPUs, but it's JupyterLab. I've been trying to convert the Colab notebook to Jupyter but am wrestling with dependencies.
I have tried this out myself on an 8GB GPU with 48GB of RAM, and most others who have tried it have had no success either: it always OOMs. Even on a 16GB card with 64GB of RAM, people are running into OOM.