
undefined symbol: cget_col_row_stats / 8-bit not working / libsbitsandbytes_cpu.so not found

Open AlexysLovesLexxie opened this issue 1 year ago • 25 comments

Describe the bug

On starting the server, I receive the following error messages:

===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please submit your error trace to: https://github.com/TimDettmers/bitsandbytes/issues
================================================================================
CUDA SETUP: Required library version not found: libsbitsandbytes_cpu.so. Maybe you need to compile it from source?
CUDA SETUP: Defaulting to libbitsandbytes_cpu.so...
argument of type 'WindowsPath' is not iterable
CUDA SETUP: Required library version not found: libsbitsandbytes_cpu.so. Maybe you need to compile it from source?
CUDA SETUP: Defaulting to libbitsandbytes_cpu.so...
argument of type 'WindowsPath' is not iterable
C:\Oobabooga_new\installer_files\env\lib\site-packages\bitsandbytes\cextension.py:31: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers and GPU quantization are unavailable.
  warn("The installed version of bitsandbytes was compiled without GPU support. "

This is not the same as #388.

Is there an existing issue for this?

  • [X] I have searched the existing issues

Reproduction

Start web UI using the supplied batch file.

Screenshot

Screenshot 2023-03-18 024607

Logs

None.  See screenshot.

System Info

Windows 11
No GPU, CPU only
CPU : Ryzen 7 6800H
RAM : 32Gb

AlexysLovesLexxie avatar Mar 18 '23 11:03 AlexysLovesLexxie

I have the exact same issue with an Nvidia GPU and Win10; I tried a fresh install several times and nothing seems to work. Very frustrating, given that it worked just fine yesterday. I shouldn't have done a git pull today; it seems to have broken the UI.

KirillRepinArt avatar Mar 18 '23 11:03 KirillRepinArt

Same issue on Linux as well.

I even tried inside an 11.8.0-runtime-ubuntu22.04 Nvidia container

fuomag9 avatar Mar 18 '23 11:03 fuomag9

I'm on CPU. It does work, but I'm not sure I'm getting the best out of it. Still getting short, low-quality responses with very little RP, which is why I did a fresh install.


AlexysLovesLexxie avatar Mar 18 '23 11:03 AlexysLovesLexxie

That's just a warning, not a bug

oobabooga avatar Mar 18 '23 13:03 oobabooga

But it says no GPU detected, falling back to CPU. I'd assume that's not the correct behavior?

KirillRepinArt avatar Mar 18 '23 13:03 KirillRepinArt

OP doesn't have a GPU, so it's expected behavior

On Windows, I recommend installing using the new WSL recommended method

oobabooga avatar Mar 18 '23 13:03 oobabooga

OP doesn't have a GPU, so it's expected behavior

On Windows, I recommend installing using the new WSL recommended method

In my case I had the same issue, and I have a GPU passed with --gpus=all inside Docker :(

fuomag9 avatar Mar 18 '23 13:03 fuomag9

OP doesn't have a GPU, so it's expected behavior

On Windows, I recommend installing using the new WSL recommended method

Unfortunately that's not possible; the Microsoft Store doesn't work in my country. Is it possible to download the previous working version of the UI somewhere?

KirillRepinArt avatar Mar 18 '23 14:03 KirillRepinArt

You may be better off just running an Ubuntu VM; your GPU should pass through.

olihough86 avatar Mar 18 '23 14:03 olihough86

I have deleted my conda environment and created a new one following the README and now I also can't use 8bit heh

undefined symbol: cget_col_row_stats

https://github.com/TimDettmers/bitsandbytes/issues/112

oobabooga avatar Mar 18 '23 14:03 oobabooga

Just tried to run conda install torchvision=0.14.1 torchaudio=0.13.1 pytorch-cuda=11.7 -c pytorch -c nvidia and did a git pull in my local folder. Everything went successfully, but now I'm getting:

Traceback (most recent call last):
  File "F:\Anakonda3\envs\textgen_webui_04\lib\site-packages\requests\compat.py", line 11, in <module>
    import chardet
ModuleNotFoundError: No module named 'chardet'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "F:\Program Files (x86)\textgen_webui_04\text-generation-webui\server.py", line 10, in <module>
    import gradio as gr
  File "F:\Anakonda3\envs\textgen_webui_04\lib\site-packages\gradio\__init__.py", line 3, in <module>
    import gradio.components as components
  File "F:\Anakonda3\envs\textgen_webui_04\lib\site-packages\gradio\components.py", line 34, in <module>
    from gradio import media_data, processing_utils, utils
  File "F:\Anakonda3\envs\textgen_webui_04\lib\site-packages\gradio\processing_utils.py", line 19, in <module>
    import requests
  File "F:\Anakonda3\envs\textgen_webui_04\lib\site-packages\requests\__init__.py", line 45, in <module>
    from .exceptions import RequestsDependencyWarning
  File "F:\Anakonda3\envs\textgen_webui_04\lib\site-packages\requests\exceptions.py", line 9, in <module>
    from .compat import JSONDecodeError as CompatJSONDecodeError
  File "F:\Anakonda3\envs\textgen_webui_04\lib\site-packages\requests\compat.py", line 13, in <module>
    import charset_normalizer as chardet
  File "F:\Anakonda3\envs\textgen_webui_04\lib\site-packages\charset_normalizer\__init__.py", line 23, in <module>
    from charset_normalizer.api import from_fp, from_path, from_bytes, normalize
  File "F:\Anakonda3\envs\textgen_webui_04\lib\site-packages\charset_normalizer\api.py", line 10, in <module>
    from charset_normalizer.md import mess_ratio
  File "charset_normalizer\md.py", line 5, in <module>
ImportError: cannot import name 'COMMON_SAFE_ASCII_CHARACTERS' from 'charset_normalizer.constant' (F:\Anakonda3\envs\textgen_webui_04\lib\site-packages\charset_normalizer\constant.py)

KirillRepinArt avatar Mar 18 '23 14:03 KirillRepinArt

Installing those older versions had worked for me briefly, then it stopped working again.

oobabooga avatar Mar 18 '23 14:03 oobabooga

I have deleted my conda environment and created a new one following the README and now I also can't use 8bit heh

undefined symbol: cget_col_row_stats

TimDettmers/bitsandbytes#112

Getting this one as well.

fuomag9 avatar Mar 18 '23 14:03 fuomag9

This may be relevant

https://github.com/TimDettmers/bitsandbytes/issues/156#issuecomment-1462329713

oobabooga avatar Mar 18 '23 15:03 oobabooga

Nothing changed in bits&bytes.

Ph0rk0z avatar Mar 18 '23 15:03 Ph0rk0z

Ok I got it

  1. Start over
conda deactivate
conda remove -n textgen --all
conda create -n textgen python=3.10.9
conda activate textgen
pip3 install torch torchvision torchaudio
cd text-generation-webui
pip install -r requirements.txt
  2. Do the dirty fix in https://github.com/TimDettmers/bitsandbytes/issues/156#issuecomment-1462329713:
cd /home/yourname/miniconda3/envs/textgen/lib/python3.10/site-packages/bitsandbytes/
cp libbitsandbytes_cuda120.so libbitsandbytes_cpu.so
cd -
  3. Install cudatoolkit
conda install cudatoolkit
  4. It now works
python server.py --listen --model llama-7b  --lora alpaca-lora-7b  --load-in-8bit
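For context on why the copy in step 2 works: bitsandbytes picks a shared-library filename based on the CUDA version it detects, and falls back to the CPU filename when detection fails, so copying the CUDA build over the CPU filename satisfies the lookup. Here is a rough illustrative sketch of that naming scheme (a hypothetical helper, not bitsandbytes' actual loader code):

```python
from typing import Optional

# Illustrative sketch only: map a detected CUDA runtime version such as
# "12.0" to the bitsandbytes shared-library name the loader looks for.
def expected_bnb_library(cuda_version: Optional[str]) -> str:
    if cuda_version is None:
        # No CUDA detected: the loader falls back to the CPU-only build,
        # which is the warning people in this thread are seeing.
        return "libbitsandbytes_cpu.so"
    major, minor = cuda_version.split(".")[:2]
    return f"libbitsandbytes_cuda{major}{minor}.so"

print(expected_bnb_library("12.0"))  # libbitsandbytes_cuda120.so
print(expected_bnb_library(None))    # libbitsandbytes_cpu.so
```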

oobabooga avatar Mar 18 '23 15:03 oobabooga

This may be relevant

TimDettmers/bitsandbytes#156 (comment)

Running pip3 install torch torchvision torchaudio on the new commit, plus replacing the CPU file with the cuda117 file, seems to have fixed:

undefined symbol: cget_col_row_stats

for me.

Arargd avatar Mar 18 '23 15:03 Arargd

Nothing changed in bits&bytes.

I think the problem was the recent pytorch update.

oobabooga avatar Mar 18 '23 15:03 oobabooga

Ok I got it

  1. Start over
conda deactivate
conda remove -n textgen --all
conda create -n textgen python=3.10.9
conda activate textgen
pip3 install torch torchvision torchaudio
cd text-generation-webui
pip install -r requirements.txt
  2. Do the dirty fix in bitsandbytes/libbitsandbytes_cpu.so: undefined symbol: cget_col_row_stats TimDettmers/bitsandbytes#156 (comment):
cd /home/yourname/miniconda3/envs/textgen/lib/python3.10/site-packages/bitsandbytes/
cp libbitsandbytes_cuda120.so libbitsandbytes_cpu.so
cd -
  3. Install cudatoolkit
conda install cudatoolkit
  4. It now works
python server.py --listen --model llama-7b  --lora alpaca-lora-7b  --load-in-8bit

Doing git pull and then this worked for me as well!

I am using miniconda so my folder was /home/$USER/.conda/envs/textgen/lib/python3.10/site-packages/bitsandbytes/

fuomag9 avatar Mar 18 '23 16:03 fuomag9

conda install cudatoolkit

I'm using Anaconda3, so I couldn't do step 2 (I just can't find the folders), but I did everything else and was able to launch the UI. It seems to be working fine right now, thank you!

Although I've found those files in F:\Anakonda3\envs\textgen_webui_05\Lib\site-packages\bitsandbytes. Are those the same files?

KirillRepinArt avatar Mar 18 '23 16:03 KirillRepinArt

So I've changed those files in F:\Anakonda3\envs\textgen_webui_05\Lib\site-packages\bitsandbytes, but nothing seems to change; it still gives the warning: Warning: torch.cuda.is_available() returned False. It works, but doesn't seem to use the GPU at all.

Also llama-7b-hf --gptq-bits 4 doesn't work anymore, although it did in the previous version of the UI; it says CUDA extension not installed. It was possible before to load llama-13b-hf --auto-devices --gpu-memory 4, but now it just eats all 32 GB of RAM, so I aborted it.

KirillRepinArt avatar Mar 18 '23 17:03 KirillRepinArt

Ok I got it

1. Start over
conda deactivate
conda remove -n textgen --all
conda create -n textgen python=3.10.9
conda activate textgen
pip3 install torch torchvision torchaudio
cd text-generation-webui
pip install -r requirements.txt
2. Do the dirty fix in [bitsandbytes/libbitsandbytes_cpu.so: undefined  symbol: cget_col_row_stats TimDettmers/bitsandbytes#156 (comment)](https://github.com/TimDettmers/bitsandbytes/issues/156#issuecomment-1462329713):
cd /home/yourname/miniconda3/envs/textgen/lib/python3.10/site-packages/bitsandbytes/
cp libbitsandbytes_cuda120.so libbitsandbytes_cpu.so
cd -
3. Install cudatoolkit
conda install cudatoolkit
4. It now works
python server.py --listen --model llama-7b  --lora alpaca-lora-7b  --load-in-8bit

I had a problem with these instructions which I narrowed down to this line:

pip3 install torch torchvision torchaudio

PyTorch has now updated to 2.0.0, so running this command will install 2.0.0, but errors occur when running this code under 2.0.0. In addition,

conda install cudatoolkit

would install a version of CUDA which is not compatible with PyTorch 2.0.0, resulting in @KirillRepinArt's error:

Warning: torch.cuda.is_available() returned False.

To fix this, simply install the version of PyTorch immediately preceding 2.0.0. I did this using the command from the PyTorch website instead: pip install torch==1.13.1+cu116 torchvision==0.14.1+cu116 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cu116

I also didn't have to do conda install cudatoolkit after using this pip command.
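The key detail above is that a torch wheel like 1.13.1+cu116 encodes both the package version and the CUDA build tag. A small hypothetical helper (names are mine, not part of any library) can split that string so a setup script could warn when the installed wheel's CUDA tag doesn't match the system toolkit:

```python
# Hypothetical helper: split a torch version string such as "1.13.1+cu116"
# into its base version and CUDA build tag. A wheel without a "+cuXXX"
# local tag (e.g. "2.0.0" from plain `pip install torch`) has no tag.
def parse_torch_version(version: str):
    base, _, local = version.partition("+")
    cuda_tag = local if local.startswith("cu") else None
    return base, cuda_tag

print(parse_torch_version("1.13.1+cu116"))  # ('1.13.1', 'cu116')
print(parse_torch_version("2.0.0"))         # ('2.0.0', None)
```

In practice you would feed it `torch.__version__` and compare the tag against the CUDA toolkit you actually have installed.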

xNul avatar Mar 19 '23 00:03 xNul

Ok I got it

1. Start over
conda deactivate
conda remove -n textgen --all
conda create -n textgen python=3.10.9
conda activate textgen
pip3 install torch torchvision torchaudio
cd text-generation-webui
pip install -r requirements.txt
2. Do the dirty fix in [bitsandbytes/libbitsandbytes_cpu.so: undefined  symbol: cget_col_row_stats TimDettmers/bitsandbytes#156 (comment)](https://github.com/TimDettmers/bitsandbytes/issues/156#issuecomment-1462329713):
cd /home/yourname/miniconda3/envs/textgen/lib/python3.10/site-packages/bitsandbytes/
cp libbitsandbytes_cuda120.so libbitsandbytes_cpu.so
cd -
3. Install cudatoolkit
conda install cudatoolkit
4. It now works
python server.py --listen --model llama-7b  --lora alpaca-lora-7b  --load-in-8bit

I had a problem with these instructions which I narrowed down to this line:

pip3 install torch torchvision torchaudio

PyTorch has now updated to 2.0.0 and so running this command will install 2.0.0, but errors occur when running this code using 2.0.0 and using

conda install cudatoolkit

would install a version of cuda which is not compatible with PyTorch 2.0.0, resulting in @KirillRepinArt's error:

Warning: torch.cuda.is_available() returned False.

To fix this, simply install the version of PyTorch immediately preceding 2.0.0. I did this using the command from the PyTorch website instead: pip install torch==1.13.1+cu116 torchvision==0.14.1+cu116 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cu116

I also didn't have to do conda install cudatoolkit after using this pip command.

This worked for me, thank you! I had to use pip install torch==1.13.1+cu117 torchvision==0.14.1+cu117 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cu117 for CUDA 11.7, though, and I also didn't do conda install cudatoolkit. Now it seems to be working as before: it uses the GPU, and I can load llama-7b-hf --cai-chat --gptq-bits 4.

As in the previous version, --load-in-8bit doesn't work for me anymore; it gives CUDA Setup failed despite GPU being available. I also can't load --model llama-13b-hf --gptq-bits 4 --cai-chat --auto-devices --gpu-memory 4; it gives me torch.cuda.OutOfMemoryError: CUDA out of memory.

But I had this issues before the last update, and everything that worked previously is also working now, so thanks again!

KirillRepinArt avatar Mar 19 '23 06:03 KirillRepinArt

I tried the command and got this error:

(d:\myenvs\textgen1) D:\text-generation-webui\repositories\GPTQ-for-LLaMa>pip install torch==1.13.1+cu117 torchvision==0.14.1+cu117 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cu117
Looking in indexes: https://pypi.org/simple, https://download.pytorch.org/whl/cu117
Collecting torch==1.13.1+cu117
  Using cached https://download.pytorch.org/whl/cu117/torch-1.13.1%2Bcu117-cp310-cp310-win_amd64.whl (2255.4 MB)
Collecting torchvision==0.14.1+cu117
  Using cached https://download.pytorch.org/whl/cu117/torchvision-0.14.1%2Bcu117-cp310-cp310-win_amd64.whl (4.8 MB)
Collecting torchaudio==0.13.1
  Using cached https://download.pytorch.org/whl/cu117/torchaudio-0.13.1%2Bcu117-cp310-cp310-win_amd64.whl (2.3 MB)
Requirement already satisfied: typing-extensions in d:\myenvs\textgen1\lib\site-packages (from torch==1.13.1+cu117) (4.5.0)
Requirement already satisfied: numpy in d:\myenvs\textgen1\lib\site-packages (from torchvision==0.14.1+cu117) (1.24.2)
Requirement already satisfied: requests in d:\myenvs\textgen1\lib\site-packages (from torchvision==0.14.1+cu117) (2.28.2)
Requirement already satisfied: pillow!=8.3.*,>=5.3.0 in d:\myenvs\textgen1\lib\site-packages (from torchvision==0.14.1+cu117) (9.4.0)
Requirement already satisfied: certifi>=2017.4.17 in d:\myenvs\textgen1\lib\site-packages (from requests->torchvision==0.14.1+cu117) (2022.12.7)
Requirement already satisfied: charset-normalizer<4,>=2 in d:\myenvs\textgen1\lib\site-packages (from requests->torchvision==0.14.1+cu117) (3.1.0)
Requirement already satisfied: idna<4,>=2.5 in d:\myenvs\textgen1\lib\site-packages (from requests->torchvision==0.14.1+cu117) (3.4)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in d:\myenvs\textgen1\lib\site-packages (from requests->torchvision==0.14.1+cu117) (1.26.15)
Installing collected packages: torch, torchvision, torchaudio
  Attempting uninstall: torch
    Found existing installation: torch 2.0.0
    Uninstalling torch-2.0.0:
      Successfully uninstalled torch-2.0.0
  Attempting uninstall: torchvision
    Found existing installation: torchvision 0.15.0
    Uninstalling torchvision-0.15.0:
      Successfully uninstalled torchvision-0.15.0
  Attempting uninstall: torchaudio
    Found existing installation: torchaudio 2.0.0
    Uninstalling torchaudio-2.0.0:
      Successfully uninstalled torchaudio-2.0.0
Successfully installed torch-1.13.1+cu117 torchaudio-0.13.1+cu117 torchvision-0.14.1+cu117

(d:\myenvs\textgen1) D:\text-generation-webui\repositories\GPTQ-for-LLaMa>python setup_cuda.py install
running install
d:\myenvs\textgen1\lib\site-packages\setuptools\command\install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
  warnings.warn(
d:\myenvs\textgen1\lib\site-packages\setuptools\command\easy_install.py:144: EasyInstallDeprecationWarning: easy_install command is deprecated. Use build and pip and other standards-based tools.
  warnings.warn(
running bdist_egg
running egg_info
writing quant_cuda.egg-info\PKG-INFO
writing dependency_links to quant_cuda.egg-info\dependency_links.txt
writing top-level names to quant_cuda.egg-info\top_level.txt
reading manifest file 'quant_cuda.egg-info\SOURCES.txt'
writing manifest file 'quant_cuda.egg-info\SOURCES.txt'
installing library code to build\bdist.win-amd64\egg
running install_lib
running build_ext
d:\myenvs\textgen1\lib\site-packages\torch\utils\cpp_extension.py:358: UserWarning: Error checking compiler version for cl: [WinError 2] The system cannot find the file specified
  warnings.warn(f'Error checking compiler version for {compiler}: {error}')
Traceback (most recent call last):
  File "D:\text-generation-webui\repositories\GPTQ-for-LLaMa\setup_cuda.py", line 4, in <module>
    setup(
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\__init__.py", line 87, in setup
    return distutils.core.setup(**attrs)
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\_distutils\core.py", line 185, in setup
    return run_commands(dist)
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\_distutils\core.py", line 201, in run_commands
    dist.run_commands()
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\_distutils\dist.py", line 969, in run_commands
    self.run_command(cmd)
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\dist.py", line 1208, in run_command
    super().run_command(command)
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\_distutils\dist.py", line 988, in run_command
    cmd_obj.run()
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\command\install.py", line 74, in run
    self.do_egg_install()
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\command\install.py", line 123, in do_egg_install
    self.run_command('bdist_egg')
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\_distutils\cmd.py", line 318, in run_command
    self.distribution.run_command(command)
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\dist.py", line 1208, in run_command
    super().run_command(command)
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\_distutils\dist.py", line 988, in run_command
    cmd_obj.run()
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\command\bdist_egg.py", line 165, in run
    cmd = self.call_command('install_lib', warn_dir=0)
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\command\bdist_egg.py", line 151, in call_command
    self.run_command(cmdname)
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\_distutils\cmd.py", line 318, in run_command
    self.distribution.run_command(command)
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\dist.py", line 1208, in run_command
    super().run_command(command)
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\_distutils\dist.py", line 988, in run_command
    cmd_obj.run()
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\command\install_lib.py", line 11, in run
    self.build()
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\_distutils\command\install_lib.py", line 112, in build
    self.run_command('build_ext')
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\_distutils\cmd.py", line 318, in run_command
    self.distribution.run_command(command)
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\dist.py", line 1208, in run_command
    super().run_command(command)
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\_distutils\dist.py", line 988, in run_command
    cmd_obj.run()
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\command\build_ext.py", line 84, in run
    _build_ext.run(self)
  File "d:\myenvs\textgen1\lib\site-packages\setuptools\_distutils\command\build_ext.py", line 346, in run
    self.build_extensions()
  File "d:\myenvs\textgen1\lib\site-packages\torch\utils\cpp_extension.py", line 499, in build_extensions
    _check_cuda_version(compiler_name, compiler_version)
  File "d:\myenvs\textgen1\lib\site-packages\torch\utils\cpp_extension.py", line 386, in _check_cuda_version
    raise RuntimeError(CUDA_MISMATCH_MESSAGE.format(cuda_str_version, torch.version.cuda))
RuntimeError: The detected CUDA version (12.0) mismatches the version that was used to compile PyTorch (11.7). Please make sure to use the same CUDA versions.

gsgoldma avatar Mar 20 '23 19:03 gsgoldma

@gsgoldma I ran into this error as well. Your installed CUDA toolkit is version 12.0, which doesn't match the CUDA version your PyTorch build was compiled with (11.7). You need to downgrade your CUDA toolkit to one that matches your PyTorch build. You could try redoing everything with my instructions as well.
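Judging from the RuntimeError above, torch's extension builder compares the toolkit's CUDA version against the version torch was compiled with. A hypothetical sketch of that comparison (my own simplification, comparing only major.minor the way the error message reports them):

```python
# Hypothetical sketch of the version check behind the RuntimeError above:
# the system CUDA toolkit (what nvcc reports) and the CUDA version torch
# was compiled with must agree, here simplified to major.minor equality.
def cuda_versions_match(toolkit: str, torch_cuda: str) -> bool:
    def major_minor(v: str):
        return tuple(int(x) for x in v.split(".")[:2])
    return major_minor(toolkit) == major_minor(torch_cuda)

print(cuda_versions_match("12.0", "11.7"))  # False: setup_cuda.py fails
print(cuda_versions_match("11.7", "11.7"))  # True: the build can proceed
```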

xNul avatar Mar 21 '23 14:03 xNul

Works on linux with CUDA 12.1: NVIDIA-SMI 530.30.02 Driver Version: 530.30.02 CUDA Version: 12.1

quarterturn avatar Mar 27 '23 13:03 quarterturn

Ok I got it

1. Start over
conda deactivate
conda remove -n textgen --all
conda create -n textgen python=3.10.9
conda activate textgen
pip3 install torch torchvision torchaudio
cd text-generation-webui
pip install -r requirements.txt
2. Do the dirty fix in [bitsandbytes/libbitsandbytes_cpu.so: undefined  symbol: cget_col_row_stats TimDettmers/bitsandbytes#156 (comment)](https://github.com/TimDettmers/bitsandbytes/issues/156#issuecomment-1462329713):
cd /home/yourname/miniconda3/envs/textgen/lib/python3.10/site-packages/bitsandbytes/
cp libbitsandbytes_cuda120.so libbitsandbytes_cpu.so
cd -
3. Install cudatoolkit
conda install cudatoolkit
4. It now works
python server.py --listen --model llama-7b  --lora alpaca-lora-7b  --load-in-8bit

Note that on windows, if you have Python 3.10 set as sys path variable, the python 3.10 directory is entirely skipped. So the path is "cd Drive path/users/yourname/etcetcetc/miniconda3/envs/textgen/lib/site-packages/bitsandbytes/".

bucketcat avatar Apr 03 '23 00:04 bucketcat

I got the same issue when using the new one-click-installer, even though it is supposed to do the dirty fixes automatically. The Nvidia GPU is not recognized, and it uses only the CPU when I try --load-in-8bit:

CUDA SETUP: Required library version not found: libsbitsandbytes_cpu.so. Maybe you need to compile it from source?
CUDA SETUP: Defaulting to libbitsandbytes_cpu.so...
argument of type 'WindowsPath' is not iterable

MikkoHaavisto avatar Apr 03 '23 05:04 MikkoHaavisto

cc @jllllll

oobabooga avatar Apr 03 '23 16:04 oobabooga

I got the same issue when using the new one-click-installer, even though it is supposed to do the dirty fixes automatically. The Nvidia GPU is not recognized, and it uses only the CPU when I try --load-in-8bit:

CUDA SETUP: Required library version not found: libsbitsandbytes_cpu.so. Maybe you need to compile it from source?
CUDA SETUP: Defaulting to libbitsandbytes_cpu.so...
argument of type 'WindowsPath' is not iterable

There are no dirty fixes anymore. Try this: https://github.com/oobabooga/text-generation-webui/issues/659#issuecomment-1493555255

Also, you may have installed the cpu version of torch. I've seen that happen before, though I don't know the cause. You can try this to replace it:

python -m pip install torch --index-url https://download.pytorch.org/whl/cu117 --force-reinstall
--OR--
python -m pip install https://download.pytorch.org/whl/cu117/torch-2.0.0%2Bcu117-cp310-cp310-win_amd64.whl  --force-reinstall

This will tell you about your torch installation: python -m torch.utils.collect_env
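The two reinstall commands above differ only in the CUDA tag baked into the wheel index URL. A small hypothetical helper (names are mine) that assembles the forced-reinstall command for a given tag, so the same pattern works for cu116, cu117, etc.:

```python
# Hypothetical convenience: build the forced-reinstall pip command suggested
# above for a given CUDA tag; "cu117" is the tag used in this thread.
def torch_reinstall_command(cuda_tag: str = "cu117") -> str:
    index = f"https://download.pytorch.org/whl/{cuda_tag}"
    return f"python -m pip install torch --index-url {index} --force-reinstall"

print(torch_reinstall_command())
# python -m pip install torch --index-url https://download.pytorch.org/whl/cu117 --force-reinstall
```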

jllllll avatar Apr 03 '23 18:04 jllllll

Ok I got it

  1. Start over
conda deactivate
conda remove -n textgen --all
conda create -n textgen python=3.10.9
conda activate textgen
pip3 install torch torchvision torchaudio
cd text-generation-webui
pip install -r requirements.txt
  2. Do the dirty fix in bitsandbytes/libbitsandbytes_cpu.so: undefined symbol: cget_col_row_stats TimDettmers/bitsandbytes#156 (comment):
cd /home/yourname/miniconda3/envs/textgen/lib/python3.10/site-packages/bitsandbytes/
cp libbitsandbytes_cuda120.so libbitsandbytes_cpu.so
cd -
  3. Install cudatoolkit
conda install cudatoolkit
  4. It now works
python server.py --listen --model llama-7b  --lora alpaca-lora-7b  --load-in-8bit

You forgot an "s": it should be cp libbitsandbytes_cuda120.so libsbitsandbytes_cpu.so

belqit avatar Apr 04 '23 15:04 belqit