江鸟
Hello **@BenjaminBossan**, I’ve been dedicating my recent research to **LoRA and its variants**, and I'm eager to make my first open-source contribution right here in the PEFT project. Since your...
Hello @BenjaminBossan, thanks for your guidance on contributing to PEFT. I’d like to start with the hyperparameter-optimization-related experimental supplement task. My understanding is that the goal is not necessarily...
Thanks, @BenjaminBossan! Understood. I’ll start with AdaLoRA and aim to improve accuracy while keeping memory and runtime reasonable. Once I have results and logs ready, I’ll open a PR...
@BenjaminBossan Thanks a lot for the suggestion! I’ll start with LoHa to get things running smoothly, and once I have some stable results, I’ll explore AdaLoRA and other PEFT methods.
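For context, a minimal sketch of what such a LoHa run might look like with PEFT; the base model, ranks, and target modules below are illustrative assumptions, not settings discussed in this thread:

```python
# Minimal sketch of applying LoHa with PEFT (illustrative settings only).
from transformers import AutoModelForCausalLM
from peft import LoHaConfig, get_peft_model

# Hypothetical base model; any causal LM supported by transformers would work.
model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

# Illustrative LoHa hyperparameters; an actual experiment would sweep these.
config = LoHaConfig(
    r=8,
    alpha=16,
    target_modules=["q_proj", "v_proj"],
    module_dropout=0.0,
    task_type="CAUSAL_LM",
)

peft_model = get_peft_model(model, config)
peft_model.print_trainable_parameters()  # sanity check: only LoHa params train
```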