peft
Incorrect target modules does not error
System Info
peft 0.12.0 transformers 4.43.1
Who can help?
No response
Information
- [ ] The official example scripts
- [ ] My own modified scripts
Tasks
- [ ] An officially supported task in the examples folder
- [ ] My own task or dataset (give details below)
Reproduction
from transformers import AutoModelForCausalLM, AutoConfig
from peft import LoraConfig, get_peft_model

config = AutoConfig.from_pretrained('meta-llama/Meta-Llama-3-8B', use_auth_token=False)
config.num_hidden_layers = 2
model = AutoModelForCausalLM.from_config(config)

peft_config = LoraConfig(
    target_modules=[
        'q_proj',
        'FAKE_TARGET_MODULE',
    ]
)
get_peft_model(model, peft_config)
Expected behavior
I expect this to raise an error because of 'FAKE_TARGET_MODULE'. However, the code here only raises if none of the target modules are found, so as long as at least one target (here 'q_proj') matches, a typo in another target module passes silently. Thanks!
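As a sketch of the behavior I would expect, here is a hypothetical pre-flight check (not part of PEFT's API; the helper name and the suffix-matching rule are my assumptions based on how string target modules are matched) that raises when any individual entry in target_modules matches no submodule:

```python
# Hypothetical workaround: validate every target_modules entry against the
# model's submodule names before calling get_peft_model, so a typo like
# 'FAKE_TARGET_MODULE' fails loudly instead of being silently ignored.
def check_target_modules(model, target_modules):
    """Raise ValueError if any target matches no submodule.

    Assumes PEFT-style suffix matching: a string target matches a module
    whose qualified name equals it or ends with "." + target.
    """
    missing = []
    for target in target_modules:
        if not any(
            name == target or name.endswith("." + target)
            for name, _ in model.named_modules()
        ):
            missing.append(target)
    if missing:
        raise ValueError(f"target_modules not found in model: {missing}")

# Usage with the repro above (would raise on 'FAKE_TARGET_MODULE'):
# check_target_modules(model, peft_config.target_modules)
```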