AutoAWQ
Support for QWEN 2.5 VL
More specifically, the latest release of autoawq pins transformers to an older version from December 2024 that does not support Qwen2.5-VL-72B-Instruct (at least transformers==4.49.0 is required for this model).
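The clash can be checked mechanically. A minimal sketch using only stdlib tuple comparison (the pin bounds are taken from autoawq 0.2.8's metadata, as echoed in the pip error further down):

```python
# Compare dotted version strings as integer tuples (stdlib only).
def vtuple(version):
    return tuple(int(part) for part in version.split("."))

# autoawq 0.2.8 declares transformers>=4.45.0,<=4.47.1 in its metadata.
pin_max = vtuple("4.47.1")
# Qwen2.5-VL needs at least transformers 4.49.0.
required = vtuple("4.49.0")

# The minimum required version exceeds the maximum pinned version,
# so no single transformers release can satisfy both constraints.
print(required > pin_max)  # True
```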
For example:
# using the newest version of `transformers` that `autoawq` allows fails to import the Qwen2.5 VL classes:
pip install autoawq --user
[..]
pip install transformers --user
[..]
python -c "from transformers import Qwen2_5_VLForConditionalGeneration"
Traceback (most recent call last):
File "<string>", line 1, in <module>
ImportError: cannot import name 'Qwen2_5_VLForConditionalGeneration' from 'transformers' (/opt/conda/lib/python3.11/site-packages/transformers/__init__.py)
# versus:
# using the latest version of `transformers` (not yet supported by `autoawq`) fixes the import failure:
pip install autoawq --user
[..]
pip install transformers==4.49.0 --user
Collecting transformers==4.49.0
[..]
Installing collected packages: transformers
[..]
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
autoawq 0.2.8 requires transformers<=4.47.1,>=4.45.0, but you have transformers 4.49.0 which is incompatible.
Successfully installed transformers-4.49.0
# the import now works, but would AutoAWQ itself still work with AWQ-quantized model checkpoints?
python -c "from transformers import Qwen2_5_VLForConditionalGeneration; print(Qwen2_5_VLForConditionalGeneration.__name__)"
Qwen2_5_VLForConditionalGeneration
I found that with autoawq 0.2.7.post3 (instead of the latest release at the time of writing, v0.2.8), the version conflict with transformers is not reported, so you can keep the latest transformers version.
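For reference, a sketch of that workaround install order; note this only verifies the import path, and whether AWQ-quantized inference actually works end-to-end with this combination is an assumption I have not tested:

```shell
# older autoawq release that does not enforce the transformers pin
pip install autoawq==0.2.7.post3 --user
# upgrade transformers to the version Qwen2.5-VL requires
pip install "transformers>=4.49.0" --user
# confirm the model class is importable
python -c "from transformers import Qwen2_5_VLForConditionalGeneration"
```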