text-generation-webui
fatal error: 'thrust/complex.h' file not found
Describe the bug
When I run `python setup_cuda.py install` as described in this guide https://github.com/oobabooga/text-generation-webui/wiki/LLaMA-model#4-bit-mode I get the error below.
Is there an existing issue for this?
- [X] I have searched the existing issues
Reproduction
I followed this guide to the letter: https://github.com/oobabooga/text-generation-webui/wiki/LLaMA-model#4-bit-mode
The issue comes with the last command from this part:
mkdir repositories
cd repositories
git clone https://github.com/qwopqwop200/GPTQ-for-LLaMa
cd GPTQ-for-LLaMa
git reset --hard 468c47c01b4fe370616747b6d69a2d3f48bab5e4
python setup_cuda.py install   <------- THIS
Screenshot
No response
Logs
In file included from /home/christopher/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/include/c10/util/Half.h:15:
/home/christopher/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/include/c10/util/complex.h:8:10: fatal error: 'thrust/complex.h' file not found
#include <thrust/complex.h>
^~~~~~~~~~~~~~~~~~
29 warnings and 1 error generated when compiling for gfx1030.
error: command '/opt/rocm-5.4.3/bin/hipcc' failed with exit code 1
System Info
OS: Ubuntu 22.10 64 bit
CPU: Intel(R) Xeon(R) CPU E5-2620 v2 @ 2.10GHz
GPU: AMD Radeon RX 6600
I think the 4bit kernel requires CUDA, so AMD is not supported.
Someone said they have it working on AMD, so I'm confused. They're helping me out here: https://github.com/oobabooga/text-generation-webui/issues/166
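For what it's worth, a minimal diagnostic sketch, assuming the failure means hipcc cannot find rocThrust's Thrust-compatibility headers under the ROCm include tree (the include path and the `rocthrust-dev` package name are assumptions about a typical ROCm 5.x install on Ubuntu, not something confirmed in this thread):

```shell
# Hedged sketch: on ROCm, <thrust/complex.h> comes from rocThrust, not
# from CUDA. If the header is absent from the ROCm include tree, hipcc
# fails with exactly the "file not found" error in the log above.
ROCM_INC="${ROCM_PATH:-/opt/rocm}/include"
if [ -f "$ROCM_INC/thrust/complex.h" ]; then
    echo "thrust headers found in $ROCM_INC"
else
    echo "thrust headers missing from $ROCM_INC"
    echo "possible fix (assumption): sudo apt install rocthrust-dev, then rebuild"
fi
```

If the header is present and the build still fails, the next thing to check would be whether `setup_cuda.py` actually passes the ROCm include directory to hipcc.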
This issue has been closed due to inactivity for 30 days. If you believe it is still relevant, please leave a comment below.