
ModuleNotFoundError: No module named 'awq_inference_engine'

[Open] Hukongtao opened this issue 2 years ago · 8 comments

I followed the installation documentation step by step:
https://github.com/mit-han-lab/llm-awq#install

```shell
git clone https://github.com/mit-han-lab/llm-awq
cd llm-awq
pip3 install -e .
cd awq/kernels
python3 setup.py install
```


But I got an error: `ModuleNotFoundError: No module named 'awq_inference_engine'`.
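For anyone debugging this: a quick way to check whether the built extension is actually visible to the interpreter (a generic diagnostic sketch, not specific to this repo) is to query `importlib`:

```python
import importlib.util
import sys
from typing import Optional

def find_extension(name: str) -> Optional[str]:
    """Return the file path of an importable module, or None if the
    interpreter cannot see it (the cause of ModuleNotFoundError)."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# The compiled kernel module llm-awq builds:
print(find_extension("awq_inference_engine"))  # prints None if it is not importable
# Compare against the directories Python actually searches:
print(sys.path[:3])
```

If this prints `None`, the `.so` never landed in (or is not on) any directory in `sys.path` for the interpreter you are running.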

Hukongtao avatar Jul 27 '23 09:07 Hukongtao

I copied the .so file to the current directory and tried again:
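For later readers: the workaround above amounts to locating the extension that `setup.py install` compiled and copying it somewhere Python searches. A hedged sketch (the `build/lib.*` layout is typical setuptools output; adjust to your build):

```shell
# run from llm-awq/awq/kernels after `python3 setup.py install`;
# setuptools typically leaves the compiled extension under build/lib.*/
find . -name 'awq_inference_engine*.so'
# copying the .so into the directory you run python from is only a
# workaround -- the proper fix is a successful install into the active
# environment's site-packages
```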

Hukongtao avatar Jul 27 '23 10:07 Hukongtao

Awesome, solved! The follow-up `ImportError: libc10.so: cannot open shared object file: No such file or directory` is covered in this answer: https://stackoverflow.com/questions/65710713/importerror-libc10-so-cannot-open-shared-object-file-no-such-file-or-director
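For reference, that linked answer boils down to making the dynamic linker see the `libc10.so` bundled with PyTorch. A sketch for locating that directory (assumes `torch` is installed; returns `None` otherwise):

```python
import importlib.util
import os
from typing import Optional

def torch_lib_dir() -> Optional[str]:
    """Directory containing libc10.so and the other libs bundled with torch."""
    spec = importlib.util.find_spec("torch")
    if spec is None or spec.origin is None:
        return None
    return os.path.join(os.path.dirname(spec.origin), "lib")

print(torch_lib_dir())
```

On Linux, adding that directory to `LD_LIBRARY_PATH` before launching Python is the usual fix the Stack Overflow answer describes.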

Hukongtao avatar Jul 27 '23 10:07 Hukongtao

I also get this error when I try to run the CLI after installing AWQ per the instructions:

```
(env-fastchat-awq)Russells-MBP:llm-awq $ python3 -m fastchat.serve.cli --model-path models/git-vicuna-7b-awq/ --awq-wbits 4 --awq-groupsize 128
Loading AWQ quantized model...
Error: Failed to import tinychat. No module named 'awq_inference_engine'
Please double check if you have successfully installed AWQ
See https://github.com/lm-sys/FastChat/blob/main/docs/awq.md
```

Does FastChat only work if one has an Nvidia GPU? I do not have one. My understanding is that CUDA is Nvidia-only, which is why I assumed the following error occurred for me while installing:

```
(env-fastchat-awq)Russells-MBP:llm-awq $ cd awq/kernels/
(env-fastchat-awq)Russells-MBP:kernels $ python setup.py install
Traceback (most recent call last):
  File "/Volumes/ExtremePro/fastchat-awq/FastChat/repositories/llm-awq/awq/kernels/setup.py", line 35, in <module>
    CUDAExtension(
  File "/Volumes/ExtremePro/fastchat-awq/env-fastchat-awq/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 1076, in CUDAExtension
    library_dirs += library_paths(cuda=True)
                    ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Volumes/ExtremePro/fastchat-awq/env-fastchat-awq/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 1203, in library_paths
    if (not os.path.exists(_join_cuda_home(lib_dir)) and
                           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Volumes/ExtremePro/fastchat-awq/env-fastchat-awq/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 2416, in _join_cuda_home
    raise OSError('CUDA_HOME environment variable is not set. '
OSError: CUDA_HOME environment variable is not set. Please set it to your CUDA install root.
(env-fastchat-awq)Russells-MBP:kernels $
```
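(For anyone on a Linux machine with an NVIDIA GPU who hits this same `CUDA_HOME` error: the kernel build just needs to know where the CUDA toolkit lives. A sketch, assuming the common default install path; this does not apply on macOS, which has no CUDA support:)

```shell
# only applicable with an NVIDIA GPU and the CUDA toolkit installed;
# /usr/local/cuda is the common default location -- adjust to your install
export CUDA_HOME=/usr/local/cuda
export PATH="$CUDA_HOME/bin:$PATH"
# sanity check: nvcc should now resolve
command -v nvcc && nvcc --version || echo "nvcc not found"
```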

rdaigle007 avatar Oct 26 '23 23:10 rdaigle007

> which is why I was assuming the following error occurred for me while installing:

I also have this question.

UIHCRITT avatar Nov 21 '23 08:11 UIHCRITT

> which is why I was assuming the following error occurred for me while installing:
>
> I also have this question.

I met this error too. How did you solve it? Thanks!

Yan0613 avatar Mar 27 '24 08:03 Yan0613

> I copied the .so file to the current directory and tried again:

Where did you get this file?

Stark-zheng avatar Apr 09 '24 06:04 Stark-zheng

> which is why I was assuming the following error occurred for me while installing:
>
> I also have this question.
>
> I met this error too. How did you solve it? Thanks!

I gave up; AutoAWQ is more convenient: https://github.com/casper-hansen/AutoAWQ
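For anyone landing here, a minimal sketch of the AutoAWQ route, following its README (model and output paths are placeholders; `pip install autoawq` is assumed):

```python
def quantize_with_autoawq(model_path: str, quant_path: str) -> None:
    """Quantize a model to 4-bit AWQ with AutoAWQ (sketch; assumes autoawq
    and transformers are installed and a CUDA GPU is available)."""
    from awq import AutoAWQForCausalLM
    from transformers import AutoTokenizer

    model = AutoAWQForCausalLM.from_pretrained(model_path)
    tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
    # 4-bit weights, group size 128 -- the same settings as the
    # --awq-wbits 4 --awq-groupsize 128 flags used earlier in this thread
    model.quantize(tokenizer, quant_config={
        "zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMM",
    })
    model.save_quantized(quant_path)

# example call (downloads the model, so run only when ready):
# quantize_with_autoawq("lmsys/vicuna-7b-v1.5", "vicuna-7b-awq")
```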

UIHCRITT avatar Apr 10 '24 02:04 UIHCRITT

> which is why I was assuming the following error occurred for me while installing:
>
> I also have this question.
>
> I met this error too. How did you solve it? Thanks!
>
> I gave up; AutoAWQ is more convenient: https://github.com/casper-hansen/AutoAWQ

OK, thanks!

Yan0613 avatar Apr 10 '24 03:04 Yan0613