
Finetuning Quantized LLaMA

Open Qifeng-Wu99 opened this issue 2 years ago • 0 comments

Hello,

I really appreciate the work done here.

I wonder if you could also release a Python script for finetuning quantized LLaMA on a custom dataset.

Quantization inevitably degrades performance, and finetuning could recover some of that quality on a user's target dataset.
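For context, here is a minimal sketch of the kind of workflow I have in mind: keep the base weights frozen and train small LoRA adapters with the Hugging Face `peft` library. This is not a script from this repo; the model path, dataset file, and target modules are placeholders, and I load an fp16 base model as a stand-in since a GPTQ checkpoint would still need a GPTQ-aware loader.

```python
# Sketch only: LoRA finetuning on top of a frozen base model with `peft`.
# All names below (model path, dataset file, hyperparameters) are placeholders.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "decapoda-research/llama-7b-hf"  # placeholder; swap in a GPTQ-aware loader for a quantized checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # LLaMA tokenizers ship without a pad token
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

# Attach LoRA adapters; only these small matrices receive gradients.
lora_cfg = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_cfg)

# Tokenize a custom plain-text corpus (placeholder file name).
dataset = load_dataset("text", data_files={"train": "my_corpus.txt"})["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

dataset = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="lora-out",
        per_device_train_batch_size=1,
        num_train_epochs=1,
        learning_rate=2e-4,
        fp16=True,
    ),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("lora-out")  # saves only the LoRA adapter weights
```

Something along these lines, but wired up to this repo's quantized-model loading, is what I am hoping you could provide.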

Thank you.

Qifeng-Wu99 · Jun 10 '23 03:06