
Support for AWQ quantization in TGI

Open nigue3025 opened this issue 9 months ago • 1 comment

Hi,

I tried the 13B version in TGI and it works fine with bitsandbytes quantization. When I try AWQ quantization in TGI instead, it fails with the error "Cannot load 'awq' weight, make sure the model is already quantized". I am wondering whether AWQ is still too new for this model when deploying with TGI, or whether there is any suggestion or workaround. Thanks.
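From what I understand, `--quantize awq` in TGI only loads weights that have already been quantized with AWQ, unlike bitsandbytes, which quantizes the FP16 weights at load time, so a pre-quantized checkpoint seems to be required. Below is a minimal sketch of how one might produce such a checkpoint with the AutoAWQ package; the model ID and output path are placeholders, and the quantization settings are just common defaults, not values confirmed for Taiwan-LLM.

```python
# Sketch only: assumes `pip install autoawq` and access to the base FP16 checkpoint.
# The model ID and output directory below are placeholders.
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

model_path = "yentinglin/Taiwan-LLM-13B-v2.0-chat"  # placeholder model ID
quant_path = "taiwan-llm-13b-awq"                    # local output directory

# Typical AWQ settings (4-bit, group size 128); adjust as needed.
quant_config = {"zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMM"}

# Load the FP16 model and tokenizer.
model = AutoAWQForCausalLM.from_pretrained(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

# Run AWQ calibration/quantization, then save the quantized weights.
model.quantize(tokenizer, quant_config=quant_config)
model.save_quantized(quant_path)
tokenizer.save_pretrained(quant_path)
```

The resulting directory (or a Hub repo containing it) could then be passed to TGI together with `--quantize awq`, rather than pointing `--quantize awq` at the original FP16 checkpoint.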

nigue3025 · May 14 '24 01:05