text-generation-webui
WSL problems
Describe the bug
error after error, one error replacing another error, errors to the left, errors to the right
Is there an existing issue for this?
- [X] I have searched the existing issues
Reproduction
.
Screenshot
.
Logs
(textgen) llama@SD:~/text-generation-webui/repositories/GPTQ-for-LLaMa$ python setup_cuda.py install
No CUDA runtime is found, using CUDA_HOME='/home/llama/miniconda3/envs/textgen'
running install
/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
warnings.warn(
/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/command/easy_install.py:144: EasyInstallDeprecationWarning: easy_install command is deprecated. Use build and pip and other standards-based tools.
warnings.warn(
running bdist_egg
running egg_info
writing quant_cuda.egg-info/PKG-INFO
writing dependency_links to quant_cuda.egg-info/dependency_links.txt
writing top-level names to quant_cuda.egg-info/top_level.txt
/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/utils/cpp_extension.py:476: UserWarning: Attempted to use ninja as the BuildExtension backend but we could not find ninja.. Falling back to using the slow distutils backend.
warnings.warn(msg.format('we could not find ninja.'))
reading manifest file 'quant_cuda.egg-info/SOURCES.txt'
writing manifest file 'quant_cuda.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_ext
building 'quant_cuda' extension
creating build
creating build/temp.linux-x86_64-cpython-310
gcc -pthread -B /home/llama/miniconda3/envs/textgen/compiler_compat -Wno-unused-result -Wsign-compare -DNDEBUG -fwrapv -O2 -Wall -fPIC -O2 -isystem /home/llama/miniconda3/envs/textgen/include -fPIC -O2 -isystem /home/llama/miniconda3/envs/textgen/include -fPIC -I/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/include -I/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/include/torch/csrc/api/include -I/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/include/TH -I/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/include/THC -I/home/llama/miniconda3/envs/textgen/include -I/home/llama/miniconda3/envs/textgen/include/python3.10 -c quant_cuda.cpp -o build/temp.linux-x86_64-cpython-310/quant_cuda.o -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE=\"_gcc\" -DPYBIND11_STDLIB=\"_libstdcpp\" -DPYBIND11_BUILD_ABI=\"_cxxabi1011\" -DTORCH_EXTENSION_NAME=quant_cuda -D_GLIBCXX_USE_CXX11_ABI=0 -std=c++17
Traceback (most recent call last):
File "/home/llama/text-generation-webui/repositories/GPTQ-for-LLaMa/setup_cuda.py", line 4, in <module>
setup(
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/__init__.py", line 87, in setup
return distutils.core.setup(**attrs)
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/_distutils/core.py", line 185, in setup
return run_commands(dist)
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/_distutils/core.py", line 201, in run_commands
dist.run_commands()
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 969, in run_commands
self.run_command(cmd)
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/dist.py", line 1208, in run_command
super().run_command(command)
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
cmd_obj.run()
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/command/install.py", line 74, in run
self.do_egg_install()
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/command/install.py", line 123, in do_egg_install
self.run_command('bdist_egg')
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/_distutils/cmd.py", line 318, in run_command
self.distribution.run_command(command)
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/dist.py", line 1208, in run_command
super().run_command(command)
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
cmd_obj.run()
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/command/bdist_egg.py", line 165, in run
cmd = self.call_command('install_lib', warn_dir=0)
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/command/bdist_egg.py", line 151, in call_command
self.run_command(cmdname)
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/_distutils/cmd.py", line 318, in run_command
self.distribution.run_command(command)
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/dist.py", line 1208, in run_command
super().run_command(command)
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
cmd_obj.run()
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/command/install_lib.py", line 11, in run
self.build()
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/_distutils/command/install_lib.py", line 112, in build
self.run_command('build_ext')
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/_distutils/cmd.py", line 318, in run_command
self.distribution.run_command(command)
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/dist.py", line 1208, in run_command
super().run_command(command)
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
cmd_obj.run()
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/command/build_ext.py", line 84, in run
_build_ext.run(self)
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/_distutils/command/build_ext.py", line 346, in run
self.build_extensions()
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 843, in build_extensions
build_ext.build_extensions(self)
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/_distutils/command/build_ext.py", line 468, in build_extensions
self._build_extensions_serial()
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/_distutils/command/build_ext.py", line 494, in _build_extensions_serial
self.build_extension(ext)
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/command/build_ext.py", line 246, in build_extension
_build_ext.build_extension(self, ext)
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/_distutils/command/build_ext.py", line 549, in build_extension
objects = self.compiler.compile(
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/_distutils/ccompiler.py", line 599, in compile
self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts)
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 581, in unix_wrap_single_compile
cflags = unix_cuda_flags(cflags)
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 548, in unix_cuda_flags
cflags + _get_cuda_arch_flags(cflags))
File "/home/llama/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 1773, in _get_cuda_arch_flags
arch_list[-1] += '+PTX'
IndexError: list index out of range
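The IndexError at the bottom comes from `_get_cuda_arch_flags`: with no CUDA runtime visible ("No CUDA runtime is found" at the top of the log), torch cannot probe the GPU for its compute capability, and with `TORCH_CUDA_ARCH_LIST` unset the arch list ends up empty. A simplified sketch of that logic (mine, not the actual torch source) shows both the failure and the workaround:

```python
def get_cuda_arch_flags(env_value):
    """Simplified sketch of torch.utils.cpp_extension._get_cuda_arch_flags.

    env_value stands in for the TORCH_CUDA_ARCH_LIST environment variable;
    when it is unset and no GPU can be probed, arch_list stays empty.
    """
    # torch accepts space- or semicolon-separated capability values.
    arch_list = [a for a in (env_value or '').replace(' ', ';').split(';') if a]
    # The line from the traceback: raises IndexError on an empty list.
    arch_list[-1] += '+PTX'
    flags = []
    for arch in arch_list:
        num = arch.split('+')[0].replace('.', '')
        flags.append(f'-gencode=arch=compute_{num},code=sm_{num}')
    return flags
```

Setting the variable explicitly (e.g. `TORCH_CUDA_ARCH_LIST="5.2" python setup_cuda.py install` for a Maxwell card) sidesteps the GPU probe entirely, so the list is never empty.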
System Info
WSL on Windows 10
You have to do the symbolic link dirty business. WSL's implementation of graphics drivers (including CUDA) is very... special.
is there a link to what you're talking about?
Sorry, I forgot to link. Here it is
It's very hacky, but it's a permanent solution and it has never failed for anyone I've explained it to.
Which one of the tens of options listed there? Thanks
I wonder if my Nvidia GTX 980 Ti 6 GB graphics card is just not supported by this repo
(textgen) freddy@SD:~/text-generation-webui/repositories/GPTQ-for-LLaMa$ python setup_cuda.py install
No CUDA runtime is found, using CUDA_HOME='/home/freddy/miniconda3/envs/textgen'
running install
/home/freddy/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
warnings.warn(
/home/freddy/miniconda3/envs/textgen/lib/python3.10/site-packages/setuptools/command/easy_install.py:144: EasyInstallDeprecationWarning: easy_install command is deprecated. Use build and pip and other standards-based tools.
warnings.warn(
running bdist_egg
running egg_info
writing quant_cuda.egg-info/PKG-INFO
writing dependency_links to quant_cuda.egg-info/dependency_links.txt
writing top-level names to quant_cuda.egg-info/top_level.txt
reading manifest file 'quant_cuda.egg-info/SOURCES.txt'
writing manifest file 'quant_cuda.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_ext
building 'quant_cuda' extension
Traceback (most recent call last):
File "/home/freddy/text-generation-webui/repositories/GPTQ-for-LLaMa/setup_cuda.py", line 4, in
my final error
A 960? AFAIK, recent era CUDA builds only support back to the 10 series.
e.g. even if you compile libcudaall for bitsandbytes it'll still fail on an old 9 series.
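For context on that claim: each GeForce generation has a fixed CUDA compute capability ("sm" version), and a binary built for a given minimum arch simply won't load on older cards. The table and helper below are illustrative (mine, not from bitsandbytes or this repo):

```python
# CUDA compute capability by GeForce generation (well-known values).
COMPUTE_CAPABILITY = {
    "GTX 900 series (Maxwell)": "5.2",  # e.g. the 980 Ti discussed above
    "GTX 10 series (Pascal)":   "6.1",
    "RTX 20 series (Turing)":   "7.5",
    "RTX 30 series (Ampere)":   "8.6",
}

def runs_on(card_cc: str, binary_min_cc: str) -> bool:
    """True if a card's compute capability meets a binary's minimum arch."""
    as_tuple = lambda cc: tuple(int(p) for p in cc.split("."))
    return as_tuple(card_cc) >= as_tuple(binary_min_cc)
```

So a kernel built only for 10-series and newer (6.0+) fails on a Maxwell 5.2 card, regardless of how the library around it was compiled.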
oof, i thought something like that might be true.
i was hoping i just had an issue like this: https://github.com/pytorch/extension-cpp/issues/71
This issue has been closed due to inactivity for 30 days. If you believe it is still relevant, please leave a comment below.