latent-diffusion
How to reduce GPU memory usage?
I use an RTX 3080 Laptop GPU with 16GB of memory to run the RDM text-prompt-only sampling. I have changed the batch size from 3 to 1, but CUDA still runs out of memory. Please tell me how to solve this, or whether an RTX 3080 Laptop can run this model at all. Thanks very much!
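For reference, a minimal sketch of common inference-time memory savers in PyTorch, assuming hypothetical `load_model` and `sample` helpers that stand in for the repo's actual loading and sampling code:

```python
import torch

def low_memory_sample(load_model, sample, prompt):
    # `load_model` and `sample` are placeholders for the repo's own
    # loading/sampling code, not its real API.
    model = load_model().cuda().eval()
    model = model.half()  # fp16 weights roughly halve weight and activation memory
    with torch.no_grad():  # skip autograd buffers during pure sampling
        with torch.autocast("cuda", dtype=torch.float16):
            images = sample(model, prompt, batch_size=1)  # smallest possible batch
    torch.cuda.empty_cache()  # release cached blocks after the run
    return images
```

Half precision plus `torch.no_grad()` is usually the biggest single saving for sampling-only workloads; if memory is still short, the remaining options are reducing the sampling resolution or the number of samples generated per call.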
haha, I use an RTX 3090 to train an autoencoder, and it also runs out of memory
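For training, a generic PyTorch Lightning sketch of the usual memory levers; `AutoencoderModule` and `data` below are placeholders, not the repo's actual classes or configs:

```python
import pytorch_lightning as pl

def build_low_memory_trainer():
    # Mixed precision and gradient accumulation are generic ways to fit
    # training into less GPU memory; the batch size itself is set in the
    # data/config, not here.
    return pl.Trainer(
        accelerator="gpu",
        devices=1,
        precision=16,               # mixed-precision training
        accumulate_grad_batches=4,  # emulate a larger batch with small per-step batches
    )

# trainer = build_low_memory_trainer()
# trainer.fit(AutoencoderModule(), data)
```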