LLaMA-Factory
MoRA: High-Rank Updating for Parameter-Efficient Fine-Tuning
https://arxiv.org/abs/2405.12130
MoRA is a PEFT technique that replaces LoRA's pair of low-rank matrices with a single trainable square matrix. The main idea behind MoRA is to spend the same trainable-parameter budget in a way that achieves the highest possible rank of the weight update within the model's original dimensions.
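A minimal sketch of the idea, assuming the simplest reshape-based compress/decompress variant and an input dimension divisible by the square matrix size (the paper describes several compression operators; this is an illustrative simplification, not the repository's implementation):

```python
import numpy as np

def mora_update(x, M):
    """MoRA-style update sketch: fold the input vector into groups of
    size r_hat, multiply each group by one shared square matrix M of
    shape (r_hat, r_hat), and flatten back to the original dimension.
    Assumes x's last dimension is divisible by r_hat."""
    r_hat = M.shape[0]
    d = x.shape[-1]
    assert d % r_hat == 0, "input dim must be divisible by r_hat"
    groups = x.reshape(*x.shape[:-1], d // r_hat, r_hat)
    return (groups @ M).reshape(*x.shape[:-1], d)

# Why this can reach a higher rank than LoRA at the same budget:
d, r = 4096, 8
lora_params = 2 * d * r            # LoRA: A (d x r) + B (r x d) -> rank <= r
r_hat = int((2 * d * r) ** 0.5)    # same budget as one square matrix: 256
# The MoRA update above can reach rank up to r_hat (256) vs LoRA's 8.
```

Like LoRA's zero-initialized `B`, `M` would start at zero so the update is initially a no-op; at merge time the blockwise product can be folded back into the frozen weight.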
In the paper's experiments, MoRA significantly outperformed LoRA on memory-intensive tasks and came much closer to the performance of a fully fine-tuned model while using fewer trainable parameters and training steps.