Unrecognized keys in `rope_scaling` for 'rope_type'='dynamic': {'type'}
System Info
`transformers` version: 4.44.1
Who can help?
@ArthurZucker
Information
- [ ] The official example scripts
- [X] My own modified scripts
Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [X] My own task or dataset (give details below)
Reproduction
Unrecognized keys in `rope_scaling` for 'rope_type'='dynamic': {'type'}
Expected behavior
No error. Version 4.42.4 is OK, but 4.44.1 raises this error message. Config:

```json
"rope_scaling": {
    "factor": 3.0,
    "type": "dynamic"
}
```
There has been an update to how the `rope_scaling` key is handled: when Llama 3.1 came out, transformers refactored the RoPE logic in https://github.com/huggingface/transformers/pull/32135.
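Concretely, the refactor standardized the key name from `type` to `rope_type`, which is what the warning points at. A minimal sketch of the two schemas, inferred from the warning text and the PR above:

```python
# Legacy schema (pre-refactor), still accepted for backward compatibility:
rope_scaling = {"type": "dynamic", "factor": 3.0}

# New schema after the RoPE refactor; `rope_type` replaces `type`,
# which should avoid the "Unrecognized keys" warning:
rope_scaling = {"rope_type": "dynamic", "factor": 3.0}
```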
Indeed, though we did try to make things backward compatible. Do you have a reproducer?
FWIW, I see it when loading Vicuna models:
```python
>>> from transformers import AutoModelForCausalLM
>>> model = AutoModelForCausalLM.from_pretrained('lmsys/longchat-7b-v1.5-32k', do_sample=True)
Unrecognized keys in `rope_scaling` for 'rope_type'='linear': {'type'}
Loading checkpoint shards: 100%
>>> model = AutoModelForCausalLM.from_pretrained('lmsys/vicuna-7b-v1.5-16k', do_sample=True)
Unrecognized keys in `rope_scaling` for 'rope_type'='linear': {'type'}
Loading checkpoint shards: 100%
```
Yep I can reproduce. cc @gante
Yup, can reproduce it too. It is a harmless warning (no impact on model usage), but I'm opening a PR to fix it.
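Until the fix lands, two possible workarounds, as a sketch: silence the transformers logger with its standard verbosity helper, or rename the legacy key in the loaded config before instantiating the model (this assumes the loaded config still carries the old `type` key):

```python
from transformers import AutoConfig, AutoModelForCausalLM
from transformers.utils import logging as hf_logging

# Option 1: coarse, silences all transformers warnings.
hf_logging.set_verbosity_error()

# Option 2: patch the config in memory, renaming the legacy 'type' key
# to the new 'rope_type' key before loading the model. Note the warning
# may still fire once while the config itself is being loaded.
config = AutoConfig.from_pretrained("lmsys/vicuna-7b-v1.5-16k")
if config.rope_scaling and "type" in config.rope_scaling:
    config.rope_scaling["rope_type"] = config.rope_scaling.pop("type")

model = AutoModelForCausalLM.from_pretrained("lmsys/vicuna-7b-v1.5-16k", config=config)
```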