llm-awq
ModuleNotFoundError: No module named 'awq_inference_engine'
I installed it according to the documentation step by step:
https://github.com/mit-han-lab/llm-awq#install
git clone https://github.com/mit-han-lab/llm-awq
cd llm-awq
pip3 install -e .
cd awq/kernels
python3 setup.py install
But I got the error in the title (ModuleNotFoundError: No module named 'awq_inference_engine').
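For anyone debugging the same thing, a quick way to see whether the kernel module built in awq/kernels actually landed in the active environment (a minimal Python sketch, assuming nothing about why the build failed):

# Check whether the compiled kernel module is visible to this interpreter.
import importlib.util

spec = importlib.util.find_spec("awq_inference_engine")
if spec is None:
    print("awq_inference_engine is NOT installed in this environment")
else:
    print("found awq_inference_engine at:", spec.origin)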
I copied the .so file to the current directory and tried again, and it works now. Awesome!
Related StackOverflow thread on the libc10.so import error:
https://stackoverflow.com/questions/65710713/importerror-libc10-so-cannot-open-shared-object-file-no-such-file-or-director
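For that libc10.so error, a commonly suggested fix is to import PyTorch before the compiled extension, since libc10.so ships inside the torch package (a minimal sketch under that assumption, not necessarily the exact fix from the linked answer):

# libc10.so is bundled with PyTorch; importing torch first loads it into the
# process so the compiled extension can resolve it.
import torch  # must come before the extension import
import awq_inference_engine

print("kernel module loaded from:", awq_inference_engine.__file__)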
I also get this error when I try to run the CLI after installing AWQ per the instructions:
(env-fastchat-awq)Russells-MBP:llm-awq $ python3 -m fastchat.serve.cli --model-path models/git-vicuna-7b-awq/ --awq-wbits 4 --awq-groupsize 128
Loading AWQ quantized model...
Error: Failed to import tinychat. No module named 'awq_inference_engine'
Please double check if you have successfully installed AWQ
See https://github.com/lm-sys/FastChat/blob/main/docs/awq.md
Does FastChat only work if one has an Nvidia GPU? I do not have one. My understanding is that CUDA is Nvidia-only, which is why I was assuming the following error occurred for me while installing:
(env-fastchat-awq)Russells-MBP:llm-awq $ cd awq/kernels/
(env-fastchat-awq)Russells-MBP:kernels $ python setup.py install
Traceback (most recent call last):
File "/Volumes/ExtremePro/fastchat-awq/FastChat/repositories/llm-awq/awq/kernels/setup.py", line 35, in <module>
CUDAExtension(
File "/Volumes/ExtremePro/fastchat-awq/env-fastchat-awq/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 1076, in CUDAExtension
library_dirs += library_paths(cuda=True)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/Volumes/ExtremePro/fastchat-awq/env-fastchat-awq/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 1203, in library_paths
if (not os.path.exists(_join_cuda_home(lib_dir)) and
^^^^^^^^^^^^^^^^^^^^^^^^
File "/Volumes/ExtremePro/fastchat-awq/env-fastchat-awq/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 2416, in _join_cuda_home
raise OSError('CUDA_HOME environment variable is not set. '
OSError: CUDA_HOME environment variable is not set. Please set it to your CUDA install root.
(env-fastchat-awq)Russells-MBP:kernels $
(env-fastchat-awq)Russells-MBP:fastchat-awq $
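The kernels under awq/kernels are built with torch's CUDAExtension (visible in the traceback), so building them does require the Nvidia CUDA toolkit, which is exactly what the CUDA_HOME error is reporting. A minimal check of what the build machinery would see on a given machine (a sketch, nothing specific to FastChat):

# Report what torch's CUDA extension machinery can find on this machine.
import torch
from torch.utils.cpp_extension import CUDA_HOME

print("torch version:  ", torch.__version__)
print("CUDA available: ", torch.cuda.is_available())
print("CUDA_HOME found:", CUDA_HOME)  # None on machines without the CUDA toolkit (e.g. Macs)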
which is why I was assuming the following error occurred for me while installing:
I also have this question.
I ran into this error too. How did you solve it? Thanks!
I copied the .so file to the current directory and tried again, and it works now.
Where did you get this file?
I gave up; AutoAWQ is more convenient: https://github.com/casper-hansen/AutoAWQ
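For anyone who goes that route, loading a pre-quantized checkpoint with AutoAWQ looks roughly like this (a sketch based on AutoAWQ's examples; the model path is a placeholder and the exact API may differ by version):

# Rough sketch of loading an AWQ-quantized model with AutoAWQ.
# (pip install autoawq provides the 'awq' package imported here.)
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

quant_path = "models/vicuna-7b-awq"  # placeholder path to a quantized checkpoint

model = AutoAWQForCausalLM.from_quantized(quant_path, fuse_layers=True)
tokenizer = AutoTokenizer.from_pretrained(quant_path)

tokens = tokenizer("Hello, how are you?", return_tensors="pt").input_ids.cuda()
output = model.generate(tokens, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))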
OK, thanks!
