peft
DoRA support for Embedding
Feature request
Supporting DoRA with Embedding to allow seamless integration with the LoRA framework.
Motivation
I need to finetune a model where additional tokens have to be trained, but I am getting the "Embedding does not support DoRA yet, please set it to False" error.
Your contribution
No
Indeed, DoRA is not supported for embeddings yet. Community contributions are welcome here; otherwise we may work on it at some point in the future, but there are no concrete plans yet.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
We're eager to receive a PR from the community 🤗!

I am also having this issue.
Hey @BenjaminBossan I would like to try my hand at this.
As I understand it, I would need to build the DoRA Embedding layer in dora.py and then link that implementation in lora/layer.py.
Thanks a lot for offering to take this @ariG23498. Yes, you got it exactly right on how to add the layer. In case you need more context on why DoRA is implemented as it is (which looks more complicated than needed), you can check #1806, but this background info should not be needed to work on this.
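To make the plan above concrete, here is a rough sketch of what a DoRA-style embedding could look like, following the DoRA decomposition W' = m · (W0 + Δ) / ||W0 + Δ|| with a per-row norm over the embedding weight. This is a standalone illustration, not peft's actual implementation: the class name `DoRAEmbeddingSketch`, the parameter shapes, and the choice of normalizing over `dim=1` are assumptions for the sketch; the real layer would live in dora.py and hook into lora/layer.py as discussed.

```python
import torch
import torch.nn.functional as F


class DoRAEmbeddingSketch(torch.nn.Module):
    """Hypothetical sketch of a DoRA-augmented embedding (not peft's API).

    Decomposes the merged weight into magnitude and direction:
        W' = m * (W0 + scaling * B @ A) / ||W0 + scaling * B @ A||
    with the norm taken per embedding row (an assumption of this sketch).
    """

    def __init__(self, num_embeddings: int, embedding_dim: int, r: int = 8, scaling: float = 1.0):
        super().__init__()
        self.base = torch.nn.Embedding(num_embeddings, embedding_dim)
        # LoRA factors for embeddings: A is zero-initialized so the
        # update starts at zero, matching standard LoRA init.
        self.lora_A = torch.nn.Parameter(torch.zeros(r, num_embeddings))
        self.lora_B = torch.nn.Parameter(torch.randn(embedding_dim, r) * 0.01)
        self.scaling = scaling
        # DoRA magnitude vector, initialized from the base weight norms
        # so the layer is an identity reparametrization at init.
        with torch.no_grad():
            norm = self.base.weight.norm(p=2, dim=1)
        self.magnitude = torch.nn.Parameter(norm.clone())

    def merged_weight(self) -> torch.Tensor:
        # (num_embeddings, embedding_dim) low-rank update
        delta = (self.lora_B @ self.lora_A).T
        directed = self.base.weight + self.scaling * delta
        norm = directed.norm(p=2, dim=1, keepdim=True)
        return self.magnitude.unsqueeze(1) * directed / norm

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        return F.embedding(input_ids, self.merged_weight())
```

Because `lora_A` starts at zero and `magnitude` is initialized to the base weight norms, the sketch reproduces the frozen embedding exactly at initialization, which is the invariant the existing DoRA Linear layer also maintains.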