LLaMA-LoRA-Tuner

TypeError: __init__() got an unexpected keyword argument 'llm_int8_skip_modules'

Open · AlexeiKaDev opened this issue 1 year ago · 0 comments

Hi. I'm trying to train locally with my RTX 3060 on Windows 10. Can somebody help me with this error?

These are the steps I took to get it working with CUDA:

python -m venv lora
.\lora\Scripts\activate

pip install -r requirements.lock.txt

pip install pynvml==11.0.0

pip uninstall bitsandbytes

pip install R:/llama/bitsandbytes-0.41.2.post2-py3-none-win_amd64.whl  (installed from a local wheel on my disk)

pip install torch torchvision torchaudio -f https://download.pytorch.org/whl/cu118/torch_stable.html

pip install --upgrade transformers torch

pip install bitsandbytes --upgrade
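Since the last two steps upgrade transformers and bitsandbytes past whatever requirements.lock.txt pinned, a version mismatch between them seems like a likely suspect to me (just my guess). A minimal stdlib-only sketch to record which versions those commands actually left installed:

```python
# Record which versions the install steps above actually left behind,
# to spot a transformers/bitsandbytes mismatch (stdlib only).
import importlib.metadata as md

versions = {}
for pkg in ("transformers", "bitsandbytes", "torch", "peft"):
    try:
        versions[pkg] = md.version(pkg)
    except md.PackageNotFoundError:
        versions[pkg] = "not installed"

for pkg, ver in versions.items():
    print(pkg, ver)
```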

And this is the error:

(lora) R:\llama\lora>python app.py --data_dir="./data" --base_model='meta-llama/Llama-2-7b-chat-hf'
fatal: not a git repository (or any of the parent directories): .git
Cannot get git commit hash: Command '['git', 'rev-parse', 'HEAD']' returned non-zero exit status 128.
bin R:\llama\lora\lora\lib\site-packages\bitsandbytes\libbitsandbytes_cuda118.dll

GPU compute capability:  (8, 6)
GPU total number of SMs:  28
GPU total cores:  3584
GPU total memory: 12884901888 bytes (12288.00 MB) (12.00 GB)
CPU available memory: 52328894464 bytes (49904.72 MB) (48.74 GB)
Will keep 2 offloaded models in CPU RAM.

Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████| 2/2 [00:07<00:00,  3.58s/it]
Running on local URL:  http://127.0.0.1:7860

To create a public link, set `share=True` in `launch()`.
Loading base model meta-llama/Llama-2-7b-chat-hf...
Traceback (most recent call last):
  File "R:\llama\lora\llama_lora\ui\finetune\training.py", line 283, in training
    train_output = Global.finetune_train_fn(
  File "R:\llama\lora\llama_lora\lib\finetune.py", line 203, in train
    model = AutoModelForCausalLM.from_pretrained(
  File "R:\llama\lora\lora\lib\site-packages\transformers\models\auto\auto_factory.py", line 566, in from_pretrained
    return model_class.from_pretrained(
  File "R:\llama\lora\lora\lib\site-packages\transformers\modeling_utils.py", line 3236, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
TypeError: __init__() got an unexpected keyword argument 'llm_int8_skip_modules'
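If it helps, my understanding (an assumption on my part, not something I've verified against this repo's code) is that in recent transformers `llm_int8_skip_modules` is a field of `BitsAndBytesConfig` and should travel inside `quantization_config`, not as a bare keyword to `from_pretrained` — otherwise it gets forwarded into the model constructor, which would produce exactly this TypeError. A minimal sketch of the config-based form:

```python
# Sketch: in recent transformers, llm_int8_skip_modules is a field of
# BitsAndBytesConfig and is passed via quantization_config, not as a
# bare kwarg to from_pretrained. Guarded so the snippet also runs
# where transformers is not installed.
try:
    from transformers import BitsAndBytesConfig

    quant_config = BitsAndBytesConfig(
        load_in_8bit=True,
        llm_int8_skip_modules=["lm_head"],  # modules to keep unquantized
    )
    # A model would then be loaded with:
    #   AutoModelForCausalLM.from_pretrained(name, quantization_config=quant_config)
    status = "config built"
except Exception:
    status = "transformers not available"
print(status)
```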

AlexeiKaDev · Nov 26 '23 20:11