
Merge multiple LoRAs by weights


Reminder

  • [X] I have read the README and searched the existing issues.

Reproduction

Thanks for the great work.

When merging a LoRA into the base model with python src/export_model.py, I noticed that --adapter_name_or_path can accept multiple LoRA adapters and merge them in sequence.
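For reference, a sequential merge of several adapters looks roughly like this (a sketch only; paths, the template name, and the export directory are placeholders, and flag names follow the README of that period):

```bash
python src/export_model.py \
    --model_name_or_path path_to_base_model \
    --adapter_name_or_path path_to_adapter_1,path_to_adapter_2 \
    --template default \
    --finetuning_type lora \
    --export_dir path_to_export_dir
```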

Is there a way, given a list of LoRA adapters and a list of user-provided weights, e.g. ['adapter_1', 'adapter_2'] with weights [0.3, 0.7], for export_model.py to merge these LoRAs according to the user-provided weights and merging method? Many thanks. A similar function is provided by PEFT: https://huggingface.co/docs/peft/v0.7.1/en/package_reference/lora#peft.LoraModel.add_weighted_adapter.

(Original in Chinese:) Thank you very much for this project. When running python src/export_model.py, --adapter_name_or_path can merge multiple adapters in sequence. Would it be possible, following the add_weighted_adapter method in PEFT, to let the user specify a list of LoRAs and weights and merge them according to those weights? Which parts of the code would this modification involve? Many thanks; the official documentation link is provided above.

Expected behavior

No response

System Info

No response

Others

No response

authurlord commented on Dec 29, 2023

(Original in Chinese:) This is not currently supported by the project. You can merge the weights outside the project first, using the method from the official documentation, and then use the merged result in this project.
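A minimal sketch of that external route using PEFT's add_weighted_adapter (model paths, adapter names, the output directory, and the 0.3/0.7 weights are placeholders taken from the question; this is not part of LLaMA-Factory):

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Load the base model, attach the first adapter, then load the second
# adapter onto the same model under its own name.
base = AutoModelForCausalLM.from_pretrained("path_to_base_model")
model = PeftModel.from_pretrained(base, "path_to_adapter_1", adapter_name="adapter_1")
model.load_adapter("path_to_adapter_2", adapter_name="adapter_2")

# Combine the two adapters with the requested weights.
# "linear" assumes both adapters share the same rank; the default "svd"
# also handles adapters of different ranks.
model.add_weighted_adapter(
    adapters=["adapter_1", "adapter_2"],
    weights=[0.3, 0.7],
    adapter_name="merged",
    combination_type="linear",
)
model.set_adapter("merged")

# Save only the merged adapter. PEFT writes a non-default adapter into a
# subfolder named after it, so the result lands in
# path_to_merged_adapter/merged, which can then be passed to
# export_model.py via --adapter_name_or_path.
model.save_pretrained("path_to_merged_adapter", selected_adapters=["merged"])
```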

codemayq commented on Feb 20, 2024