GPTQ-for-LLaMa
LoRA support and differences vs bitsandbytes
- What changes would I need to make for GPTQ to support LoRA for Llama 2? (A minimal adapter sketch follows this list.)
- What's the main difference between GPTQ and bitsandbytes? Is it that GPTQ re-adjusts the weights to keep the same loss-function shape? (See the note after the sketch.)
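On the first question, here is a minimal sketch of how a LoRA adapter could sit on top of a frozen quantized linear layer, assuming the quantized module (e.g. this repo's `QuantLinear`) behaves like a standard `nn.Linear` in its forward pass. The class name `LoRALinear` and the hyperparameters `r`/`alpha` below are illustrative, not part of this repo.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Hypothetical wrapper: a frozen (e.g. GPTQ-quantized) base layer plus
    trainable low-rank adapters, computing y = base(x) + (alpha/r) * B(A(x))."""

    def __init__(self, base: nn.Module, in_features: int, out_features: int,
                 r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        # Freeze the quantized weights; only the adapters train.
        for p in self.base.parameters():
            p.requires_grad = False
        self.lora_A = nn.Linear(in_features, r, bias=False)
        self.lora_B = nn.Linear(r, out_features, bias=False)
        nn.init.kaiming_uniform_(self.lora_A.weight, a=5 ** 0.5)
        nn.init.zeros_(self.lora_B.weight)  # adapters start as a no-op
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scaling * self.lora_B(self.lora_A(x))

# Illustrative usage, with a plain nn.Linear standing in for a quantized layer:
if __name__ == "__main__":
    base = nn.Linear(4096, 4096, bias=False)
    layer = LoRALinear(base, 4096, 4096, r=8, alpha=16)
    out = layer(torch.randn(2, 4096))
    print(out.shape)  # torch.Size([2, 4096])
```

In principle the same wrapping would be applied to each attention/MLP projection of the Llama 2 model; the main GPTQ-specific work is likely making sure the quantized matmul kernel provides a backward pass for its input so gradients can flow to earlier layers.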
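On the second question, a hedged summary consistent with the GPTQ paper: the intuition is close. bitsandbytes quantizes weights at load time with simple round-to-nearest schemes (plus outlier handling in LLM.int8), using no calibration data. GPTQ instead solves a per-layer reconstruction problem on calibration activations $X$, quantizing weights one at a time and updating the still-unquantized weights to compensate for the rounding error:

$$
\hat{W} = \arg\min_{\hat{W}} \lVert W X - \hat{W} X \rVert_2^2
$$

So GPTQ does "re-adjust the weights", but what it preserves is each layer's output on calibration data (via a second-order approximation using the Hessian $H = X X^\top$), rather than the global loss surface directly.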