AutoAWQ
TypeError: internvl_chat isn't supported yet.
When I use AutoAWQ to quantize InternVL2-8B, the following error occurs:

TypeError: internvl_chat isn't supported yet.
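For reference, this is roughly the script I ran, following the standard AutoAWQ quantization example (the model path and quant_config values are just what I tried, so treat them as assumptions):

```python
# Minimal reproduction sketch (model path and quant settings are assumptions).
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

model_path = "OpenGVLab/InternVL2-8B"
quant_path = "InternVL2-8B-awq"
quant_config = {"zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMM"}

# This call fails because the internvl_chat architecture is not in AutoAWQ's
# supported-model map, raising: TypeError: internvl_chat isn't supported yet.
model = AutoAWQForCausalLM.from_pretrained(model_path, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

model.quantize(tokenizer, quant_config=quant_config)
model.save_quantized(quant_path)
tokenizer.save_pretrained(quant_path)
```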
When will quantization of this model be supported?