Comments of Filip
> Hi, if anyone wants to try GPTQ quantization in vLLM, please use this repo [QLLM](https://github.com/wejoncy/QLLM) to quantize the model (LLaMA), and it will be compatible with AWQ in vLLM. And of course you can...