Training with Multiple LoRAs
Hi @BenjaminBossan and other contributors or maintainers, I would like to train one backbone model with two LoRAs like this:
```python
import torch.nn as nn
from transformers import AutoModel
from peft import get_peft_model

class MyModel(nn.Module):
    def __init__(...):
        ...
        self.model = AutoModel.from_pretrained('...')
        # both calls receive the same base model instance
        self.encoder = get_peft_model(self.model, lora_config)
        self.decoder = get_peft_model(self.model, lora_config)

    def forward(...):
        hidden_states = self.encoder(...)
        output = self.decoder(hidden_states, ...)
        return output
```
However, I found that the encoder and decoder share the same LoRA during training. Are there any solutions?
Thanks!
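A minimal sketch that reproduces this observation outside the class (the checkpoint name and the default `LoraConfig` are placeholders, not from the thread):

```python
from transformers import AutoModel
from peft import LoraConfig, get_peft_model

base = AutoModel.from_pretrained("bert-base-uncased")  # placeholder checkpoint
lora_config = LoraConfig()  # default settings, just for illustration

# Mirror the pattern from the question: the same base model object is
# passed to get_peft_model twice (depending on the PEFT version, the
# second call may emit a warning).
encoder = get_peft_model(base, lora_config)
decoder = get_peft_model(base, lora_config)

enc_ids = {id(p) for p in encoder.parameters()}
dec_ids = {id(p) for p in decoder.parameters()}

# A large overlap shows that the two wrappers are backed by the very same
# parameter tensors, including the injected LoRA weights, because
# get_peft_model modifies the base model it is given in place rather
# than copying it.
print(len(enc_ids & dec_ids), "shared parameter objects")
```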
> However, I found that the encoder and decoder share the same LoRA during training.
Could you expand on what you mean by that? Maybe it would help to have a separate lora_config copy for each model, but as I don't know what the exact problem is, it's hard to tell.
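As a sketch of that suggestion (the `r` and `lora_alpha` values are placeholders), the `__init__` above would look roughly like this:

```python
from peft import LoraConfig, get_peft_model

# Inside MyModel.__init__: give each get_peft_model call its own
# LoraConfig instance, so that any changes made to the config while
# building the first PEFT model cannot carry over to the second.
encoder_config = LoraConfig(r=8, lora_alpha=16)
decoder_config = LoraConfig(r=8, lora_alpha=16)

self.encoder = get_peft_model(self.model, encoder_config)
self.decoder = get_peft_model(self.model, decoder_config)
```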
Hi, thanks for your prompt reply!
What I expect is that the encoder and decoder have different LoRAs built on top of the backbone model. However, it seems the encoder and decoder are exactly the same and their LoRA parameters are shared.
BTW, what do you mean by "have a separate lora_config copy"?
Do you mean that PEFT detects whether the current `(base_model, lora_config)` pair is already stored in a set? If it exists in the set, will the second `get_peft_model` only retrieve the existing LoRA rather than create a new one?
> However, it seems the encoder and decoder are exactly the same and their LoRA parameters are shared.
How does that manifest? Could you show what the encoder and decoder look like after initialization?
> Do you mean that PEFT detects whether the current `(base_model, lora_config)` pair is already stored in a set? If it exists in the set, will the second `get_peft_model` only retrieve the existing LoRA rather than create a new one?
No, but it could still be the case that the first model makes some changes to the config instance that are then picked up by the second model. You should always create one config instance per model.
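For completeness, a common way to get two truly independent LoRAs on one backbone is PEFT's multi-adapter support (`adapter_name` in `get_peft_model`, plus `add_adapter` and `set_adapter`). A rough sketch, with placeholder checkpoint, adapter names, and LoRA settings; exact behavior may differ between PEFT versions:

```python
from transformers import AutoModel, AutoTokenizer
from peft import LoraConfig, get_peft_model

base = AutoModel.from_pretrained("bert-base-uncased")  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# One config instance per adapter, as recommended above.
encoder_config = LoraConfig(r=8, lora_alpha=16)
decoder_config = LoraConfig(r=8, lora_alpha=16)

# Register two independently named adapters on the same backbone.
model = get_peft_model(base, encoder_config, adapter_name="encoder")
model.add_adapter("decoder", decoder_config)

inputs = tokenizer("hello world", return_tensors="pt")

# Activate one adapter at a time; only the active adapter's LoRA
# weights are applied in the forward pass.
model.set_adapter("encoder")
encoder_out = model(**inputs).last_hidden_state

model.set_adapter("decoder")
decoder_out = model(**inputs).last_hidden_state
```

With this setup, each adapter keeps its own LoRA weights under its name, and `set_adapter` controls which one is active at each step.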
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.