
RecursionError: maximum recursion depth exceeded while calling a Python object

Open · ForAxel opened this issue 1 year ago · 0 comments

My System Info

- Python 3.10.15
- torch 2.4.1
- transformers 4.31.0
- xturing 0.1.8
- sentencepiece 0.1.99

When I loaded the model with `model = GenericLoraKbitModel('aleksickx/llama-7b-hf')` in `examples/features/int4_finetuning/LLaMA_lora_int4.ipynb`, I got the following error message:

RecursionError: maximum recursion depth exceeded while calling a Python object

According to the issue reported in https://github.com/huggingface/transformers/issues/22762, it seems that the tokenizer of `aleksickx/llama-7b-hf` is not compatible with newer versions of transformers.

Could you provide the versions of transformers (and other libraries) that can successfully run the code in your notebook?
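In case it helps others hitting the same error: the linked transformers issue suggests the recursion can come from a casing mismatch in the checkpoint's `tokenizer_config.json`, where the legacy class name `LLaMATokenizer` no longer matches the `LlamaTokenizer` spelling expected by newer transformers releases. Below is a minimal sketch (the file path and the helper name are hypothetical, not part of xturing) that rewrites the field in a locally downloaded checkpoint:

```python
import json
from pathlib import Path

def fix_tokenizer_class(config_path):
    """Rewrite the legacy 'LLaMATokenizer' class name to the
    'LlamaTokenizer' spelling expected by newer transformers
    releases, as suggested in transformers issue #22762."""
    path = Path(config_path)
    config = json.loads(path.read_text())
    if config.get("tokenizer_class") == "LLaMATokenizer":
        config["tokenizer_class"] = "LlamaTokenizer"
        path.write_text(json.dumps(config, indent=2))
    return config["tokenizer_class"]

# Hypothetical usage against a locally cached checkpoint:
# fix_tokenizer_class("./llama-7b-hf/tokenizer_config.json")
```

I have not verified that this is the cause in xturing 0.1.8, so pinned version numbers that are known to work would still be appreciated.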

ForAxel · Oct 16 '24