AutoAWQ
Can't import awq
I built the new version from source, but I can't import awq. My environment:
- transformers 4.43.3
- torch 2.3.1
- torchaudio 2.4.0
- torchvision 0.19.0
- autoawq 0.2.6
- autoawq_kernels 0.0.7
The error looks like this:
>>> import awq
Traceback (most recent call last):
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1586, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/transformers/models/auto/processing_auto.py", line 28, in <module>
    from ...image_processing_utils import ImageProcessingMixin
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/transformers/image_processing_utils.py", line 21, in <module>
    from .image_transforms import center_crop, normalize, rescale
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/transformers/image_transforms.py", line 22, in <module>
    from .image_utils import (
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/transformers/image_utils.py", line 58, in <module>
    from torchvision.transforms import InterpolationMode
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/torchvision/__init__.py", line 10, in <module>
    from torchvision import _meta_registrations, datasets, io, models, ops, transforms, utils  # usort:skip
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/torchvision/_meta_registrations.py", line 163, in <module>
    @torch.library.register_fake("torchvision::nms")
AttributeError: module 'torch.library' has no attribute 'register_fake'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/nas/djh/kernels/AutoAWQ/awq/__init__.py", line 2, in <module>
    from awq.models.auto import AutoAWQForCausalLM
  File "/nas/djh/kernels/AutoAWQ/awq/models/__init__.py", line 1, in <module>
    from .mpt import MptAWQForCausalLM
  File "/nas/djh/kernels/AutoAWQ/awq/models/mpt.py", line 1, in <module>
    from .base import BaseAWQForCausalLM
  File "/nas/djh/kernels/AutoAWQ/awq/models/base.py", line 35, in <module>
    from transformers import (
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1577, in __getattr__
    value = getattr(module, name)
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1576, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1588, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.auto.processing_auto because of the following error (look up to see its traceback):
module 'torch.library' has no attribute 'register_fake'
>>>
I think you may have an issue with your torch installation. Try reinstalling torch.
Got the same issue. I installed torch using conda install pytorch torchvision torchaudio pytorch-cuda=11.8 -c pytorch -c nvidia, then cloned the autoawq repo and installed it from source.
Hello, I didn't know which torch version suited me, so I pip installed autoawq and then uninstalled it; afterwards I built awq from source, and it works now.
I reinstalled torch and still the issue persists. I tried installing from the source as well but didn't get it working.
@casper-hansen any idea how to overcome this?
autoawq_kernels==0.0.7
torch==2.3.1
torchaudio==2.4.0+cu118
torchvision==0.19.0+cu118
I am using an EC2 instance (g5.24xlarge), i.e. 4×A10 GPUs
NVIDIA-SMI 535.183.01 Driver Version: 535.183.01 CUDA Version: 12.2
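The versions listed above are the actual mismatch: torchvision 0.19.x is built against torch 2.4, and torch.library.register_fake only exists from torch 2.4 onward, so importing torchvision under torch 2.3.1 raises exactly this AttributeError. A rough self-contained sanity check; the pairing table below is an assumption reconstructed from the release notes, not an official API:

```python
# Illustrative torch/torchvision minor-version pairing (assumed from release notes).
TORCHVISION_TO_TORCH = {
    "0.18": "2.3",
    "0.19": "2.4",
    "0.20": "2.5",
}

def compatible(torch_version: str, torchvision_version: str) -> bool:
    """Return True when the installed torch minor version matches what torchvision expects."""
    # Strip local build tags like "+cu118", then keep "major.minor".
    tv_minor = ".".join(torchvision_version.split("+")[0].split(".")[:2])
    torch_minor = ".".join(torch_version.split("+")[0].split(".")[:2])
    return TORCHVISION_TO_TORCH.get(tv_minor) == torch_minor

# The environment from this issue: mismatched pair.
print(compatible("2.3.1", "0.19.0+cu118"))  # False -> torchvision import fails
print(compatible("2.4.0", "0.19.0+cu118"))  # True
```

Either upgrading torch to the 2.4 line or downgrading torchvision to 0.18.x restores a matching pair.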
@NamburiSrinath try this:
- Create a new Python env
- Install torch with pip
- Install AutoAWQ from source, but with the --no-build-isolation argument
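The steps above can be sketched as the following shell session; the pinned versions are one known-matching pairing (the torch 2.4 line), and you may need a different CUDA build tag for your driver:

```shell
# Fresh environment, torch from pip (not conda), then AutoAWQ from source.
python -m venv awq-env
source awq-env/bin/activate

# A matching torch/torchvision/torchaudio trio (torch 2.4 line).
pip install torch==2.4.0 torchvision==0.19.0 torchaudio==2.4.0

# Build AutoAWQ against the torch already in the env instead of an isolated one.
git clone https://github.com/casper-hansen/AutoAWQ.git
cd AutoAWQ
pip install -e . --no-build-isolation
```

--no-build-isolation matters here because a default isolated build can compile the kernels against a different torch than the one installed in the env.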
Met the same error; solved by upgrading PyTorch to 2.4.
Thanks, problem solved.