Can I load two-layer LoRA weights?
I want to first train a LoRA weight on a relatively general dataset, and then fine-tune a new LoRA weight on top of it using a more specialized dataset. Is that possible? Thanks very much.
Hello @songbaiTalk, yes, you can do that. Below is a sample code snippet (requires installation from the main branch). Let us know if that solves the issue:
from peft import PeftModel, LoraConfig, get_peft_model

# load the base model with the first (general) LoRA weights and merge them in
model = PeftModel.from_pretrained(base_model, peft_model_name_or_path)
model = model.merge_and_unload()  # merges the LoRA weights into the base model and reverts the injected modules, leaving the base model with the LoRA weights added in
# wrap the merged model with a fresh LoRA config for the second stage
config = LoraConfig()
new_peft_model = get_peft_model(model, config)
# train this as usual
# save the new LoRA weights
new_peft_model.save_pretrained(path)
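For later inference, the two stages can be stacked back together in the same way. Here is a minimal sketch, assuming the first adapter lives at peft_model_name_or_path and the new one at path (the same placeholders as above):

from transformers import AutoModelForCausalLM
from peft import PeftModel

# rebuild the base model, apply and merge the first (general) LoRA,
# then load the second (specialized) LoRA on top of the merged weights
base_model = AutoModelForCausalLM.from_pretrained(base_model_name)  # placeholder model name
model = PeftModel.from_pretrained(base_model, peft_model_name_or_path)
model = model.merge_and_unload()
model = PeftModel.from_pretrained(model, path)  # the new LoRA weights saved above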
Thanks a lot! I will try that.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.