AutoGPTQ
[BUG] Unable to quantize Falcon-7b
Describe the bug
https://huggingface.co/tiiuae/falcon-7b
Quantizing the Falcon-7b model fails with an assertion error in `auto_gptq/nn_modules/qlinear/qlinear_exllama.py`, line 69: `assert infeatures % self.group_size == 0`.
Environment: auto_gptq 0.7.1, transformers 4.40.0
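A possible explanation, offered as an assumption rather than a confirmed diagnosis: the exllama kernel requires each layer's input-feature count to divide evenly by the quantization group size. Falcon-7b's hidden size is 4544, which is not divisible by 128 (a common default `group_size` in GPTQ configs), so the assert fires. A minimal sketch of the failing check:

```python
# Sketch of the divisibility check behind the assert (assumes the failure is
# caused by Falcon-7b's hidden size of 4544 vs. a group_size of 128).

FALCON_7B_HIDDEN_SIZE = 4544  # from the tiiuae/falcon-7b config
DEFAULT_GROUP_SIZE = 128      # common GPTQ group size


def divides_evenly(infeatures: int, group_size: int) -> bool:
    """Mirrors `infeatures % self.group_size == 0` from qlinear_exllama.py."""
    return infeatures % group_size == 0


print(divides_evenly(FALCON_7B_HIDDEN_SIZE, DEFAULT_GROUP_SIZE))  # False
print(divides_evenly(FALCON_7B_HIDDEN_SIZE, 64))                  # True
```

If this is indeed the cause, choosing a `group_size` that divides 4544 (e.g. 64) in the quantization config may be a workaround worth trying.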