LoRA
lora-dim == lora-r ?
Hi, I am studying LoRA, and thanks for your work. I have a simple question that is really confusing me. Do the two hyper-parameters, lora-dim in the GPT-2 model (for the generative task) and lora-r in the RoBERTa model (for the NLU task), serve the same role, just for different tasks? Is it correct that I only need to change the value of lora-dim to make the GPT model work with a different rank?
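For context, my understanding is that in both cases the hyper-parameter sets the rank r of the low-rank update, i.e. the inner dimension of the two adapter matrices. A minimal sketch of what I mean (class and parameter names are my own assumptions, not the repo's actual code):

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Sketch of a LoRA-adapted linear layer: y = x W^T + (alpha/r) * x A^T B^T."""

    def __init__(self, in_features: int, out_features: int, r: int = 4, alpha: float = 1.0):
        super().__init__()
        # Frozen pretrained weight (randomly initialized here for illustration).
        self.weight = nn.Parameter(torch.randn(out_features, in_features), requires_grad=False)
        # The rank r (what I believe lora-dim / lora-r both control) is the
        # inner dimension shared by the two trainable low-rank matrices.
        self.lora_A = nn.Parameter(torch.zeros(r, in_features))
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))
        nn.init.normal_(self.lora_A, std=0.01)  # A gets a small random init, B stays zero
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen path plus the scaled low-rank update.
        return x @ self.weight.T + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling
```

So my question is whether changing lora-dim (GPT-2 side) and lora-r (RoBERTa side) both just change this r, and nothing else.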