
model merge_and_unload do not support layer_replication

Open CrazyBoyM opened this issue 1 year ago • 5 comments

System Info

When you train a model with layer_replication in LoraConfig, you will find that the adapter cannot be merged into the base model correctly.

Who can help?

No response

Information

  • [x] The official example scripts
  • [x] My own modified scripts

Tasks

  • [x] An officially supported task in the examples folder
  • [x] My own task or dataset (give details below)

Reproduction

Just set layer_replication in LoraConfig, train a sample LoRA, and merge it into the base model.

Expected behavior

Generate a modeling_config.py script that works properly with layer_replication.

CrazyBoyM avatar May 03 '24 17:05 CrazyBoyM

This is not easily possible. The reason is that those replicated layers share the underlying base weights between multiple layers. Therefore, we cannot merge LoRA weights, as different LoRA weights would be merged into the base weights, resulting in incorrect outputs.
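To make the conflict concrete, here is a minimal plain-Python sketch (no PEFT, all names illustrative) of two replicated "layers" that share one base weight. Each layer has its own LoRA delta; folding both deltas into the single shared weight changes the output of both layers:

```python
class SharedLayer:
    """A 'layer' holding a reference to a shared weight (not a copy)."""
    def __init__(self, weight):
        self.weight = weight  # shared one-element list

def forward(layer, delta, x):
    # LoRA-style output: (W + delta) * x
    return (layer.weight[0] + delta) * x

base = [1.0]                 # one base weight, shared by both replicas
layer_a = SharedLayer(base)  # replica 1, LoRA delta +0.5
layer_b = SharedLayer(base)  # replica 2, LoRA delta -0.5

# Correct (unmerged) outputs: each layer applies its own delta.
out_a = forward(layer_a, 0.5, 2.0)   # (1.0 + 0.5) * 2.0 = 3.0
out_b = forward(layer_b, -0.5, 2.0)  # (1.0 - 0.5) * 2.0 = 1.0

# Naive merge: both deltas get folded into the one shared weight.
base[0] += 0.5 + (-0.5)
merged_a = forward(layer_a, 0.0, 2.0)  # now 2.0, but should be 3.0
merged_b = forward(layer_b, 0.0, 2.0)  # now 2.0, but should be 1.0

assert merged_a != out_a and merged_b != out_b
```

With independent (non-shared) weights, each delta could be merged into its own layer, which is why merging works for ordinary LoRA but not for replicated layers.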

BenjaminBossan avatar May 06 '24 09:05 BenjaminBossan

What I mean is: when I create a LoRA with layer_replication to expand the blocks from 22 to 32, I merge and save it, but when I load the output model, I find there are only 22 blocks in the final model.

CrazyBoyM avatar May 06 '24 10:05 CrazyBoyM

> What I mean is when I create a lora with layer_replication for expanding blocks from 22 to 32, I merge and save it, but when I load the output model, I find there are only 22 blocks in the final model.

As mentioned, merging with layer replication isn't really possible.

Also, when you load the model, make sure that you first load the base model, then the LoRA adapter using PeftModel.from_pretrained(...). This should restore the replicated layers.

BenjaminBossan avatar May 06 '24 10:05 BenjaminBossan

Sorry, what I mean is that I want to load the base model and the LoRA model, then merge_and_unload, and get a new 1.5B model with 32 blocks, not the original 1B model with 22 blocks.

CrazyBoyM avatar May 06 '24 10:05 CrazyBoyM

> What I mean is that I want to load the base model and the lora model then merge_and_unload, get a new 1.5B model with 32 blocks, not the original 1B model with 22 blocks.

This is not really an option right now with PEFT. What you could try is to create clones of the weights that are currently being shared, edit the adapter_config.json to remove the layer_replication entry, then load the LoRA adapter and check whether merge_and_unload works.

BenjaminBossan avatar May 06 '24 10:05 BenjaminBossan

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

github-actions[bot] avatar Jun 03 '24 15:06 github-actions[bot]