
Load a model with unmerged LoRA adapters

Open Ledzy opened this issue 7 months ago • 3 comments

Feature request

For my use case, I need to train both a model's LoRA modules and its original weights, and the model is saved periodically during training. However, if I directly save the model with unmerged LoRA adapters, I cannot load it back: the from_pretrained method seems to only support 1) loading a pretrained model and then loading adapters on top, or 2) loading a checkpoint whose parameters exactly match the pretrained model (and which therefore cannot contain any LoRA modules). Neither case matches my need.

Therefore, I am wondering if peft/transformers could support loading a model with unmerged LoRA adapters.
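For context, here is a minimal sketch of the workaround I currently have in mind, saving the trained base weights and the unmerged adapter separately and reassembling them at load time. The model name, paths, and LoRA config below are just placeholders:

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, PeftModel, get_peft_model

base = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")
config = LoraConfig(r=8, lora_alpha=16, target_modules=["q_proj", "v_proj"])
model = get_peft_model(base, config)

# ... train both the LoRA modules and the base weights here ...
# (get_peft_model freezes the base parameters, so requires_grad would need
#  to be re-enabled on the base weights for them to be trained)

# 1) Save only the LoRA tensors
model.save_pretrained("ckpt/adapter")

# 2) Strip the LoRA layers WITHOUT merging them, leaving the trained base
#    weights untouched, and save those separately
base = model.unload()
base.save_pretrained("ckpt/base")

# 3) Reload: trained base weights first, then re-attach the unmerged adapter
base = AutoModelForCausalLM.from_pretrained("ckpt/base")
model = PeftModel.from_pretrained(base, "ckpt/adapter", is_trainable=True)
```

This sidesteps the problem rather than solving it, though: what I would really like is to load a single checkpoint that contains both the updated base weights and the unmerged LoRA tensors.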

Motivation

This option seems natural and would be useful for research.

Your contribution

I am actively working on this feature but haven't figured out a clean approach yet. I will open a PR once it is ready. Any assistance or suggestions are appreciated.

Ledzy · Jul 18 '24 02:07