
Can I load two layers of LoRA weights?


I want to first train a LoRA weight on a relatively general dataset, and then, starting from that LoRA weight, fine-tune a new LoRA weight on a more specialized dataset. Is that possible? Thanks very much.

songbaiTalk avatar Apr 04 '23 13:04 songbaiTalk

Hello @songbaiTalk, yes, you can do that. Below is a sample code snippet (it requires installing PEFT from the main branch). Let us know if that solves the issue:

from peft import PeftModel, LoraConfig, get_peft_model

model = PeftModel.from_pretrained(base_model, peft_model_name_or_path)
model = model.merge_and_unload()  # merges the LoRA weights into the base model and removes the injected modules, returning the base model with the LoRA weights folded in

config = LoraConfig()
new_peft_model = get_peft_model(model, config)

# train new_peft_model as usual

# save the new lora weights
new_peft_model.save_pretrained(path)
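
For inference later, note that the second adapter was trained on top of the merged model, so the merge step has to be repeated before loading it. A minimal sketch (the model name and adapter paths here are hypothetical placeholders):

from transformers import AutoModelForCausalLM
from peft import PeftModel

# Hypothetical base model and adapter paths, for illustration only.
base_model = AutoModelForCausalLM.from_pretrained("base-model-name")

# Recreate the merged model: fold the first (general) adapter into the base weights.
model = PeftModel.from_pretrained(base_model, "path/to/first_lora")
model = model.merge_and_unload()

# Load the second (specialized) adapter on top of the merged model.
model = PeftModel.from_pretrained(model, "path/to/second_lora")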

pacman100 avatar Apr 04 '23 20:04 pacman100


Thanks a lot! I will try that.

songbaiTalk avatar Apr 05 '23 09:04 songbaiTalk

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

github-actions[bot] avatar May 04 '23 15:05 github-actions[bot]