
Precompute and offload latents and text encodings, then offload the VAE and Text Encoder

Open · Thomas-MMJ opened this issue 3 years ago · 1 comment

The latents and text embeddings can be precomputed once and stored in RAM instead of VRAM. If the text encoder and VAE aren't being trained, they can then be offloaded entirely, which would allow for significant time and VRAM savings.
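The idea can be sketched as follows. This is a minimal illustration of the pattern, not the repo's actual training script: the tiny `nn.Linear`/`nn.Embedding` modules are hypothetical stand-ins for the real VAE encoder and text encoder, and the caching strategy (a plain CPU tensor) is one possible choice.

```python
import torch
from torch import nn

# Hypothetical stand-ins for the frozen VAE encoder and text encoder.
vae = nn.Linear(8, 4)
text_encoder = nn.Embedding(100, 16)

device = "cuda" if torch.cuda.is_available() else "cpu"
vae.to(device)
text_encoder.to(device)

images = torch.randn(10, 8, device=device)            # stand-in image batch
token_ids = torch.randint(0, 100, (10, 5), device=device)

# One-time precompute: run the frozen encoders, keep results in CPU RAM.
with torch.no_grad():
    latent_cache = vae(images).cpu()
    text_cache = text_encoder(token_ids).cpu()

# The frozen encoders are no longer needed on the GPU during training;
# move them off-device (or delete them) to reclaim VRAM.
vae.to("cpu")
text_encoder.to("cpu")

# Each training step then moves only the small cached batch to the GPU.
batch_latents = latent_cache[:4].to(device)
batch_text = text_cache[:4].to(device)
```

Because the forward passes through the VAE and text encoder happen only once per example instead of once per step, this saves compute as well as memory.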

Thomas-MMJ avatar Dec 18 '22 01:12 Thomas-MMJ

This is great. I am currently refactoring the training script, and I will use this trick in it.

cloneofsimo avatar Dec 24 '22 02:12 cloneofsimo