
Unrecognized keys in `rope_scaling` for 'rope_type'='dynamic': {'type'}

Open · DefTruth opened this issue 1 year ago

System Info

transformers version 4.44.1

Who can help?

@ArthurZucker

Information

  • [ ] The official example scripts
  • [X] My own modified scripts

Tasks

  • [ ] An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • [X] My own task or dataset (give details below)

Reproduction

Unrecognized keys in `rope_scaling` for 'rope_type'='dynamic': {'type'}

Expected behavior

No error. Version 4.42.4 is OK, but 4.44.1 raises this error message. Config:

"rope_scaling": {
    "factor": 3.0,
    "type": "dynamic"
  }

DefTruth avatar Aug 21 '24 12:08 DefTruth

There has been an update to how the `rope_scaling` key is handled. When Llama 3.1 came out, transformers refactored the RoPE logic: https://github.com/huggingface/transformers/pull/32135
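
As part of that refactor, the legacy `type` key in `rope_scaling` was superseded by `rope_type`, which is why the old-style config above now triggers the warning. A minimal sketch of migrating an old-style dict to the new key (the helper name `migrate_rope_scaling` is mine for illustration, not part of transformers, and this only mirrors the rename, not the library's full backward-compatibility handling):

```python
def migrate_rope_scaling(rope_scaling):
    """Rename the legacy 'type' key to 'rope_type' in a rope_scaling dict."""
    if rope_scaling is None or "rope_type" in rope_scaling:
        # Nothing to do: no rope scaling, or already in the new format.
        return rope_scaling
    migrated = dict(rope_scaling)
    if "type" in migrated:
        migrated["rope_type"] = migrated.pop("type")
    return migrated

print(migrate_rope_scaling({"factor": 3.0, "type": "dynamic"}))
# → {'factor': 3.0, 'rope_type': 'dynamic'}
```

Applying the same rename in `config.json` (or on the loaded config object before use) should make the warning go away on older checkpoints.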

thepowerfuldeez avatar Aug 22 '24 10:08 thepowerfuldeez

Indeed, though we did try to make things backward compatible. Do you have a reproducer?

ArthurZucker avatar Aug 22 '24 14:08 ArthurZucker

FWIW - I see it when loading Vicuna models.

>>> model = AutoModelForCausalLM.from_pretrained('lmsys/longchat-7b-v1.5-32k', do_sample=True)
Unrecognized keys in `rope_scaling` for 'rope_type'='linear': {'type'}
Loading checkpoint shards: 100%
>>> model = AutoModelForCausalLM.from_pretrained('lmsys/vicuna-7b-v1.5-16k', do_sample=True)
Unrecognized keys in `rope_scaling` for 'rope_type'='linear': {'type'}
Loading checkpoint shards: 100%
>>>

mneilly avatar Aug 30 '24 23:08 mneilly

Yep I can reproduce. cc @gante

ArthurZucker avatar Sep 02 '24 08:09 ArthurZucker

Yup, can reproduce it too. It is a harmless warning (no impact on model usage), but I'm opening a PR for it.

gante avatar Sep 05 '24 16:09 gante