CUDA out of memory.
I ran this code, which loads the model into VRAM, so when I run the notebook I get "CUDA out of memory." How can I undo this and remove the model from VRAM?
By changing `device = "cuda:0"` to `device = "cpu"` you can run the code on the CPU without using any VRAM, but that will be extremely slow. I recommend using the example code in the README.md that uses diffusers instead, as model CPU offload significantly reduces VRAM requirements. If that isn't enough, you can try some of the options here to further reduce VRAM usage. If you need a pre-made notebook, you can find one on Colab, though it may need modification for lower-VRAM instances.
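To answer the literal question (removing an already-loaded model from VRAM without restarting the kernel), a rough sketch of the usual PyTorch pattern is below. The `free_vram` helper name is mine, not from the repo; the idea is to drop every Python reference to the model and then release PyTorch's cached allocator blocks:

```python
import gc
import torch

def free_vram(model):
    # Move the weights back to system RAM first, so nothing on
    # the GPU is still referenced by this module.
    model.to("cpu")
    # Drop the local reference; any other references to the model
    # (e.g. notebook variables) must be deleted too, or the
    # memory cannot be reclaimed.
    del model
    gc.collect()
    # empty_cache() returns cached, now-unused blocks to the
    # driver so nvidia-smi reflects the freed memory.
    if torch.cuda.is_available():
        torch.cuda.empty_cache()
```

In a notebook you would call `free_vram(pipe)` and also `del` any other variables holding the pipeline or its sub-models, since a single surviving reference keeps the VRAM allocated.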
Approximate VRAM requirements: T5 needs about 11.6 GiB, IF-I needs about 9.2 GiB, and IF-II + IF-III together need about 5.8 GiB (each separately about 3 GiB).