Alpha is ignored
This is for bugs only
Did you already ask in the discord?
Yes
You verified that this is a bug and not a feature request or question by asking in the discord?
Yes
Describe the bug
When setting linear_alpha to 1, the log still reports a network with dim 128 and alpha 128. Either the log message is wrong, or the alpha value is being ignored, which would be a major bug.
"type": "sd_trainer", "training_folder": "output", "device": "cuda:0", "network": { "type": "lora", "linear": 128, "linear_alpha": 1, "network_kwargs": { "only_if_contains": [ "transformer.single_transformer_blocks.2.proj_out", "transformer.single_transformer_blocks.20.proj_out", "transformer.single_transformer_blocks.7.proj_out"
```
create LoRA network. base dim (rank): 128, alpha: 128
```
Same question here; this happened to me as well.
For flux, alpha is currently ignored because the LoRA is saved in the diffusers format, which does not store an alpha value.
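For context, here is a minimal sketch of the common LoRA scaling convention (scale = alpha / rank), not ai-toolkit's actual implementation, showing why an ignored alpha behaves like alpha == rank (scale 1.0) and therefore applies the adapter at full strength:

```python
# Minimal sketch of the usual LoRA scaling convention (assumption: scale = alpha / rank).
# This is NOT ai-toolkit's code; it only illustrates the effect of a dropped alpha.
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank: int = 128, alpha: float | None = 1.0):
        super().__init__()
        self.base = base
        self.down = nn.Linear(base.in_features, rank, bias=False)
        self.up = nn.Linear(rank, base.out_features, bias=False)
        nn.init.zeros_(self.up.weight)  # adapter starts as a no-op
        # Diffusers-format checkpoints carry no alpha, so an ignored alpha
        # is effectively alpha == rank, i.e. a scale of 1.0.
        effective_alpha = rank if alpha is None else alpha
        self.scale = effective_alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * self.up(self.down(x))


# With linear_alpha=1 and rank=128 the update would be scaled by 1/128;
# with alpha ignored, the same update is applied unscaled.
layer = LoRALinear(nn.Linear(64, 64), rank=128, alpha=None)
print(layer.scale)  # 1.0
```

In practice this means that when alpha is ignored, a low linear_alpha like 1 will not dampen the update as expected, so any intended damping has to come from elsewhere (for example the learning rate or rank).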