
Issue running CodeUp with WSL

raymond-infinitecode opened this issue

python3 finetune.py --base_model='TheBloke/Dolphin-Llama2-7B-GPTQ' --data_path='data/codeup_19k.json' --num_epochs=10 --cutoff_len=512 --group_by_length --output_dir='./test-llama-2/7b' --lora_target_modules='[q_proj,k_proj,v_proj,o_proj]' --lora_r=16 --micro_batch_size
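One thing that stands out in the command above: `--micro_batch_size` is passed at the end with no value, and python-fire parses a bare flag as the boolean `True` (which matches the `micro_batch_size: True` in the params dump below). A minimal sketch of a guard a script could run before training; `validate_micro_batch_size` is a hypothetical helper, not part of finetune.py:

```python
# Hypothetical helper (not in finetune.py): reject the boolean True that
# python-fire produces when --micro_batch_size is given without a value.
def validate_micro_batch_size(value):
    # bool is a subclass of int in Python, so it must be checked first.
    if isinstance(value, bool) or not isinstance(value, int) or value < 1:
        raise ValueError(
            "micro_batch_size must be a positive integer, got %r; "
            "did you forget the value, e.g. --micro_batch_size=4?" % (value,)
        )
    return value
```

With a guard like this, a bare `--micro_batch_size` would fail fast instead of propagating `True` into training; on the command line the fix is to pass an explicit value, e.g. `--micro_batch_size=4` (the 4 is only an example).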

===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please run

python -m bitsandbytes

and submit this information together with your error trace to: https://github.com/TimDettmers/bitsandbytes/issues

bin /home/raymond/.local/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cuda121.so
CUDA SETUP: CUDA runtime path found: /usr/local/cuda/lib64/libcudart.so
CUDA SETUP: Highest compute capability among GPUs detected: 8.6
CUDA SETUP: Detected CUDA version 121
CUDA SETUP: Loading binary /home/raymond/.local/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cuda121.so...
Training Alpaca-LoRA model with params:
base_model: TheBloke/Dolphin-Llama2-7B-GPTQ
data_path: data/codeup_19k.json
output_dir: ./test-llama-2/7b
batch_size: 128
micro_batch_size: True
num_epochs: 10
learning_rate: 0.0003
cutoff_len: 512
val_set_size: 2000
lora_r: 16
lora_alpha: 16
lora_dropout: 0.05
lora_target_modules: ['q_proj', 'k_proj', 'v_proj', 'o_proj']
train_on_inputs: True
add_eos_token: False
group_by_length: True
wandb_project:
wandb_run_name:
wandb_watch:
wandb_log_model:
resume_from_checkpoint: False
prompt template: alpaca

Traceback (most recent call last):
  File "/home/raymond/CodeUp/finetune.py", line 283, in <module>
    fire.Fire(train)
  File "/home/raymond/.local/lib/python3.10/site-packages/fire/core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/home/raymond/.local/lib/python3.10/site-packages/fire/core.py", line 475, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "/home/raymond/.local/lib/python3.10/site-packages/fire/core.py", line 691, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "/home/raymond/CodeUp/finetune.py", line 112, in train
    model = LlamaForCausalLM.from_pretrained(
  File "/home/raymond/.local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2475, in from_pretrained
    loading_attr_dict = quantization_config.get_loading_attributes()
AttributeError: 'BitsAndBytesConfig' object has no attribute 'get_loading_attributes'
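A hedged reading of the traceback: `from_pretrained` here expects the quantization config object to expose `get_loading_attributes()`, so the installed transformers version likely does not match the one the repo was written against (separately, the base model is a GPTQ checkpoint, which may need a different loading path than a bitsandbytes config). A stdlib-only sketch of a version guard one could add before loading; the "4.31.0" minimum below is an assumption for illustration, not a verified requirement of CodeUp:

```python
# Stdlib-only version guard; the minimum version used with it is an
# assumption, not a verified requirement.
def version_tuple(version):
    # "4.31.0" -> (4, 31, 0); "4.31.0.dev0" stops at the first
    # non-numeric part -> (4, 31, 0).
    parts = []
    for part in version.split("."):
        if not part.isdigit():
            break
        parts.append(int(part))
    return tuple(parts)

def meets_minimum(installed, minimum):
    # Tuple comparison handles differing lengths: (4, 31) < (4, 31, 1).
    return version_tuple(installed) >= version_tuple(minimum)

# Hypothetical usage before model loading:
# import transformers
# if not meets_minimum(transformers.__version__, "4.31.0"):
#     raise RuntimeError("transformers too old for this quantization path")
```

Pinning transformers/bitsandbytes/peft to the exact versions in requirements.txt (rather than newer ones) is the usual way this class of AttributeError gets resolved.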

CUDA version is 12.1, and the pip packages were installed following requirements.txt.

Do you have any idea what might be wrong?

raymond-infinitecode, Aug 26 '23 12:08