Benjamin Bossan

584 comments by Benjamin Bossan

> I'd suggest that we can start with a simple implementation of `add_weighted_adapter`

That would be fantastic. Let's start with something simple and not try to have a "feature complete"...
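For context, the core of a "simple implementation" of weighted adapter merging is just a weighted sum of the per-adapter delta weights. The sketch below is illustrative only, using plain nested lists instead of tensors; the real PEFT implementation additionally handles LoRA scaling factors, differing ranks, and other combination types such as SVD-based merging.

```python
def merge_linear(deltas, weights):
    """Linearly combine per-adapter delta-weight matrices.

    deltas: list of same-shaped matrices (nested lists), one per adapter.
    weights: one float per adapter.
    Returns the element-wise weighted sum (a simplified stand-in for
    what `add_weighted_adapter` with a linear combination does).
    """
    rows, cols = len(deltas[0]), len(deltas[0][0])
    merged = [[0.0] * cols for _ in range(rows)]
    for delta, w in zip(deltas, weights):
        for i in range(rows):
            for j in range(cols):
                merged[i][j] += w * delta[i][j]
    return merged
```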

> I have tested this, it is working

Thanks for giving this a spin. If you have any numbers to share, like scores before and after merging, or even code,...

Indeed, DoRA is not supported for embeddings yet. Community contributions are welcome here, otherwise we may get to work on it some time in the future but there are no...

Thanks a lot for offering to take this @ariG23498. Yes, you got it exactly right on how to add the layer. In case you need more context on why DoRA...

Duplicate of #1504 :) Sorry about closing (wrong button).

No conclusion yet, we want to wait and see if the performance gains are indeed robust. Regarding your code, it's basically just a giant string with the code, right? Was...

Hey, after some discussion, I think we can proceed with this project. Let's add the `create_loraplus_optimizer` function but not the custom trainer class. We can put the function inside of...

@moghadas76 do you still plan on working on this?

Great, thanks. On top of what I mentioned, let's also move this to a new file. I'm thinking `src/peft/optimizers/loraplus.py`. The idea here is that we want to add more optimizer-related...
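For readers unfamiliar with what a `create_loraplus_optimizer` would do: the LoRA+ idea is to train the `lora_B` matrices with a larger learning rate than the `lora_A` matrices. A minimal sketch of the parameter-grouping step is below; the function name, ratio default, and group layout are illustrative assumptions, not the actual PEFT API. The resulting groups would be passed to a standard optimizer such as `torch.optim.AdamW`.

```python
def build_loraplus_param_groups(named_params, lr, loraplus_lr_ratio=16.0):
    """Split named parameters into optimizer param groups for LoRA+.

    lora_B parameters get `lr * loraplus_lr_ratio`; lora_A and all other
    trainable parameters keep the base `lr`. (Hypothetical helper; the
    real `create_loraplus_optimizer` handles more cases, e.g. embeddings
    and weight decay.)
    """
    groups = {"lora_A": [], "lora_B": [], "other": []}
    for name, param in named_params:
        if "lora_B" in name:
            groups["lora_B"].append(param)
        elif "lora_A" in name:
            groups["lora_A"].append(param)
        else:
            groups["other"].append(param)
    return [
        {"params": groups["lora_A"], "lr": lr},
        {"params": groups["lora_B"], "lr": lr * loraplus_lr_ratio},
        {"params": groups["other"], "lr": lr},
    ]
```

Keeping this grouping logic in its own function (rather than a custom trainer class) matches the plan above: the function can live in an optimizers module and be combined with any optimizer the user chooses.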

@moghadas76 Do you still plan on working on this?