Failed to reverse PEFT model: "PeftModelForCausalLM.__init__() missing 1 required positional argument: 'peft_config'"
System Info
optimum==1.14.1
peft==0.6.1
pytorch-lightning==2.1.0
pytorch-pretrained-bert==0.6.2
torch==2.0.0+cu118
torchaudio==2.0.0+cu118
torchmetrics==1.2.0
torchvision==0.15.1+cu118
Who can help?
No response
Information
- [ ] The official example scripts
- [X] My own modified scripts
Tasks
- [ ] An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
- [X] My own task or dataset (give details below)
Reproduction (minimal, reproducible, runnable)
Model initialization
from transformers import AutoModelForCausalLM
from peft import get_peft_model

model = AutoModelForCausalLM.from_pretrained(...)   # original HF model
model = model.to_bettertransformer()                # convert to BetterTransformer via optimum
model = get_peft_model(model, lora_config)          # wrap with LoRA via PEFT (see example config below)
...
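The exact lora_config is not reproduced here; a minimal example configuration for the same setup might look like the sketch below (the rank, alpha, and dropout values are placeholders, not the ones from the failing run):

```python
from peft import LoraConfig, TaskType

# Example LoRA config (placeholder values, only to make the repro self-contained)
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
)
```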
Model save
from optimum.bettertransformer import BetterTransformer
unwrapped_model = BetterTransformer.reverse(unwrapped_model)  # fails with the TypeError below
Failure message
  unwrapped_model = BetterTransformer.reverse(unwrapped_model)
File "/home/default_user/.conda/envs/user/lib/python3.10/site-packages/optimum/bettertransformer/transformation.py", line 340, in reverse
  reversed_model = bt_model.__class__(config)
TypeError: PeftModelForCausalLM.__init__() missing 1 required positional argument: 'peft_config'
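The traceback points at the reconstruction step inside BetterTransformer.reverse(), which rebuilds the model as bt_model.__class__(config). For a plain transformers class this single-argument call works, but when the outermost model is PeftModelForCausalLM its constructor also expects the base model and a peft_config. The isolated sketch below reproduces the same TypeError (the "gpt2" config is only a placeholder):

```python
from transformers import AutoConfig
from peft import PeftModelForCausalLM

config = AutoConfig.from_pretrained("gpt2")  # placeholder; any config triggers the same error

# BetterTransformer.reverse() effectively calls bt_model.__class__(config);
# for a PEFT-wrapped model that class is PeftModelForCausalLM, whose
# constructor requires (model, peft_config), so the one-argument call fails:
PeftModelForCausalLM(config)
# TypeError: PeftModelForCausalLM.__init__() missing 1 required positional argument: 'peft_config'
```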
Expected behavior
BetterTransformer.reverse() should restore the original transformer modules even when the model is wrapped by PEFT (PeftModelForCausalLM), so that the fine-tuned model can be saved.
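As an untested workaround on my side (an assumption, not a confirmed fix), the LoRA weights could be merged back into the base model before reversing, so that reverse() only sees a plain transformers model whose class can be rebuilt from a config:

```python
from optimum.bettertransformer import BetterTransformer

# Untested workaround sketch: merge the LoRA adapters into the base model first
merged_model = unwrapped_model.merge_and_unload()        # PeftModel -> plain transformers model
restored_model = BetterTransformer.reverse(merged_model)
restored_model.save_pretrained("output_dir")             # "output_dir" is a placeholder path
```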