[flux dreambooth lora training] make LoRA target modules configurable + small bug fix
New features for the Flux DreamBooth LoRA training script:
- make LoRA target modules configurable through `--lora_blocks` (see the sketch after this list)
- change the current default target modules so they are not limited to attention layers only (?)
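A rough sketch of how the new flag could plug into the script (the `--lora_blocks` name comes from this PR; the comma-separated parsing and the broader default module list below are illustrative assumptions, not the exact implementation):

```python
# Sketch: wiring a configurable target-module flag into the training script.
# The flag name comes from this PR; parsing logic and defaults are assumptions.
import argparse

from peft import LoraConfig

parser = argparse.ArgumentParser()
parser.add_argument(
    "--lora_blocks",
    type=str,
    default=None,
    help="Comma-separated list of module names to apply LoRA to. "
    "If not set, a broader default (not just attention layers) is used.",
)
args = parser.parse_args()

if args.lora_blocks is not None:
    # e.g. --lora_blocks "attn.to_k,attn.to_q,attn.to_v,attn.to_out.0"
    target_modules = [block.strip() for block in args.lora_blocks.split(",")]
else:
    # Hypothetical broader default covering attention and feed-forward projections.
    target_modules = [
        "attn.to_k",
        "attn.to_q",
        "attn.to_v",
        "attn.to_out.0",
        "ff.net.0.proj",
        "ff.net.2",
    ]

# LoRA config for the Flux transformer, built from the resolved module list.
transformer_lora_config = LoraConfig(
    r=16,
    lora_alpha=16,
    init_lora_weights="gaussian",
    target_modules=target_modules,
)
# transformer.add_adapter(transformer_lora_config)
```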
Also includes a small fix to mixed precision training for the DreamBooth script, as proposed in https://github.com/huggingface/diffusers/pull/9565 (second sketch below).
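That kind of fix generally amounts to keeping the trainable LoRA parameters in fp32 while the base weights run in fp16/bf16. A minimal sketch of the idea, assuming the `cast_training_params` helper from `diffusers.training_utils` and a stand-in module in place of the Flux transformer:

```python
# Sketch: keep trainable (LoRA) params in fp32 under mixed precision.
# The stand-in module and call site are assumptions for illustration only.
import torch
from diffusers.training_utils import cast_training_params


def prepare_for_mixed_precision(model: torch.nn.Module, weight_dtype: torch.dtype) -> None:
    """Cast base weights to the low-precision dtype, then upcast trainable
    params back to fp32 so gradients and optimizer updates stay stable."""
    model.to(dtype=weight_dtype)
    if weight_dtype in (torch.float16, torch.bfloat16):
        cast_training_params(model, dtype=torch.float32)


# Stand-in module; in the script this would be the LoRA-adapted Flux transformer.
model = torch.nn.Linear(8, 8)
prepare_for_mixed_precision(model, torch.bfloat16)
```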