Results: 2 comments by Will Lamond
> @wlamond If we're going to `LoRA`, then why not just go all the way and do `QLoRA` :) > > It's a very simple change to your PR, just...
> @ecr23xx Another possible improvement: the original parameters don't need to be stored in the optimizer during LoRA finetuning. The `configure_optimizers` method only passes parameters with `requires_grad == True` to...
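The filtering the comment describes can be sketched in plain Python. This is an illustrative stand-in, not the PR's actual code: `Param` is a hypothetical class mimicking PyTorch's `requires_grad` flag, and the parameter names are made up.

```python
from dataclasses import dataclass

# Minimal stand-in for framework parameters; `requires_grad` mirrors
# PyTorch's flag. Names here are illustrative, not the PR's actual code.
@dataclass
class Param:
    name: str
    requires_grad: bool = True

# A LoRA-style setup: frozen base weights plus small trainable adapters.
params = [
    Param("base.weight", requires_grad=False),
    Param("base.bias", requires_grad=False),
    Param("lora.A"),
    Param("lora.B"),
]

# The filtering a configure_optimizers method can do: hand the optimizer
# only the trainable subset, so no state is allocated for the frozen
# original weights.
trainable = [p for p in params if p.requires_grad]
print([p.name for p in trainable])  # → ['lora.A', 'lora.B']
```

With this filter, optimizer state (e.g. Adam's moment buffers) is kept only for the small adapter matrices, which is where most of the memory savings of LoRA finetuning come from on the optimizer side.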