LoftQ init without scaling
https://github.com/huggingface/peft/blob/f13d860e9f58e908ac85e78ac37528ec8c84eb99/src/peft/tuners/lora/layer.py#L226
I see that scaling is applied with the PiSSA and DoRA inits, but there is no scaling when LoftQ initializes lora_A and lora_B. Is this a bug, or is the scaling unnecessary there?
I agree, it looks like it could make sense to consider the scaling factor when applying LoftQ. @fxmeng could you comment on that?
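To illustrate the concern: LoRA applies the update as `W_q + scaling * (B @ A)` with `scaling = lora_alpha / r`, so if LoftQ fits `B @ A` to the quantization residual via SVD without accounting for `scaling`, the merged weight overshoots the residual by that factor. The sketch below is not the actual peft implementation; `loftq_like_init` is a hypothetical helper showing how dividing each factor by `sqrt(scaling)` (similar to what PiSSA does) would make the initialization consistent, assuming the residual is captured by a rank-`r` SVD.

```python
import torch

def loftq_like_init(W, W_q, r, lora_alpha):
    """Hypothetical scaling-aware LoftQ-style init (illustrative, not peft's code).

    Fits a rank-r approximation of the quantization residual W - W_q,
    then divides both factors by sqrt(scaling) so that the LoRA forward
    pass W_q + scaling * (B @ A) reproduces the residual exactly.
    """
    scaling = lora_alpha / r
    R = W - W_q  # quantization residual
    U, S, Vh = torch.linalg.svd(R, full_matrices=False)
    sqrt_S = torch.sqrt(S[:r])
    # Without the division by sqrt(scaling), the merged weight would be
    # W_q + scaling * R_r instead of W_q + R_r.
    B = U[:, :r] * sqrt_S / (scaling ** 0.5)            # (out_features, r)
    A = (sqrt_S[:, None] * Vh[:r]) / (scaling ** 0.5)   # (r, in_features)
    return A, B, scaling

# Usage: when the residual is exactly rank r, the scaled update recovers W.
torch.manual_seed(0)
W_q = torch.randn(16, 12)
residual = torch.randn(16, 4) @ torch.randn(4, 12)  # rank-4 residual
W = W_q + residual
A, B, scaling = loftq_like_init(W, W_q, r=4, lora_alpha=8)
print(torch.allclose(W_q + scaling * (B @ A), W, atol=1e-4))
```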
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Gentle ping again @fxmeng
Another ping @fxmeng.