GPTQ-for-LLaMa
What is the right perplexity number?
For the base FP16 model, `--eval` gives 5.68 PPL on wikitext2, while `--benchmark 2048` gives 6.43 PPL on wikitext2.
What's the difference?
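One plausible source of the gap is how each mode windows the evaluation text: perplexity is exp(mean per-token negative log-likelihood), so tokens scored with less preceding context tend to get higher NLL and inflate the number. Below is a minimal, self-contained sketch of that effect with toy NLL values; the per-mode behavior ("full-context" vs "context reset per chunk") and the 0.3-nat reset penalty are illustrative assumptions, not the repo's actual implementation.

```python
import math

def perplexity(nlls):
    # Perplexity = exp(mean negative log-likelihood per token).
    return math.exp(sum(nlls) / len(nlls))

# Toy per-token NLLs for a 12-token "dataset".
# (In practice these come from the model's cross-entropy loss.)
nlls = [2.1, 1.8, 1.5, 1.4, 1.3, 1.2, 2.0, 1.7, 1.4, 1.3, 1.2, 1.1]

# Strategy A (assumed "--eval"-like): every token is scored with its
# full preceding context inside one long window.
ppl_full = perplexity(nlls)

# Strategy B (assumed "--benchmark"-like): the context resets at each
# chunk boundary, so the first tokens of every chunk are predicted with
# little context. Model that as a 0.3-nat penalty on the first two
# tokens of each 6-token chunk (illustrative numbers only).
chunk = 6
penalized = [nll + (0.3 if i % chunk < 2 else 0.0)
             for i, nll in enumerate(nlls)]
ppl_chunked = perplexity(penalized)

print(f"full-context PPL: {ppl_full:.3f}")
print(f"chunk-reset PPL:  {ppl_chunked:.3f}")
```

Under these assumptions the chunk-reset PPL always comes out higher, which would match the direction of the 6.43 vs 5.68 gap; the real answer depends on exactly how `--eval` and `--benchmark` slice wikitext2 in this repo's code.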