llm_qlora
"bitsandbytes was compiled without GPU support"
Here's what happens on my RTX 3090 system:

```
(qlora) PS C:\Users\ron\qlora> git clone https://github.com/georgesung/llm_qlora.git
(qlora) PS C:\Users\ron\qlora> cd llm_qlora
(qlora) PS C:\Users\ron\qlora\llm_qlora> pip install -r requirements.txt
(qlora) PS C:\Users\ron\qlora\llm_qlora> python.exe -V
Python 3.10.12
(qlora) PS C:\Users\ron\qlora\llm_qlora> python train.py configs/open_llama_7b_qlora_uncensored.yaml
C:\Users\ron\miniconda3\lib\site-packages\bitsandbytes\cextension.py:34: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers, 8-bit multiplication, and GPU quantization are unavailable.
  warn("The installed version of bitsandbytes was compiled without GPU support. "
'NoneType' object has no attribute 'cadam32bit_grad_fp32'
ImportError: cannot import name 'BitsAndBytesConfig' from 'transformers' (C:\Users\ron\miniconda3\lib\site-packages\transformers\__init__.py)
(qlora) PS C:\Users\ron\qlora\llm_qlora>
```
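This looks like two separate problems stacking up: the installed `bitsandbytes` wheel was built without CUDA binaries (at the time, bitsandbytes had no official GPU-enabled Windows builds), and the installed `transformers` appears too old to export `BitsAndBytesConfig`. A first step is to confirm which versions the active environment actually resolves. Below is a minimal sketch for that check; the helper name `report_versions` is my own, not from the repo:

```python
from importlib import metadata

def report_versions(packages):
    """Return {package: version string} for each package, None if not installed."""
    versions = {}
    for pkg in packages:
        try:
            versions[pkg] = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            versions[pkg] = None
    return versions

if __name__ == "__main__":
    # Packages relevant to the QLoRA import errors in the transcript above
    for name, ver in report_versions(
        ["bitsandbytes", "transformers", "torch", "peft", "accelerate"]
    ).items():
        print(f"{name}: {ver or 'NOT INSTALLED'}")
```

If `transformers` reports an older version than `requirements.txt` expects, upgrading it in the same environment should clear the `BitsAndBytesConfig` import error; the GPU-support warning, though, points at the `bitsandbytes` build itself lacking CUDA support on Windows, which a plain `pip install` may not fix.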