
DoRA support for Embedding

Open ShayekhBinIslam opened this issue 1 year ago • 3 comments

Feature request

Support DoRA for Embedding layers to allow seamless integration with the LoRA framework.

Motivation

I need to fine-tune a model where additional tokens have to be trained, but I am getting the "Embedding does not support DoRA yet, please set it to False" error.
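
For reference, a minimal config that reproduces the error looks roughly like this (the model name and target modules are just placeholders for my setup):

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

config = LoraConfig(
    r=16,
    lora_alpha=32,
    # targeting the embedding layer so the new tokens can be trained
    target_modules=["embed_tokens", "q_proj", "v_proj"],
    use_dora=True,  # DoRA enabled
)

# Fails with: "Embedding does not support DoRA yet, please set it to False"
model = get_peft_model(model, config)
```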

Your contribution

No

ShayekhBinIslam avatar Apr 24 '24 19:04 ShayekhBinIslam

Indeed, DoRA is not supported for embeddings yet. Community contributions are welcome here; otherwise, we may get to work on it some time in the future, but there are no concrete plans yet.

BenjaminBossan avatar Apr 25 '24 08:04 BenjaminBossan

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

github-actions[bot] avatar May 25 '24 15:05 github-actions[bot]

We're eager to receive a PR from the community 🤗! I am also having this issue.

bezir avatar Jul 08 '24 20:07 bezir

Hey @BenjaminBossan I would like to try my hand at this.

As I understand it, I would need to build the DoRA Embedding layer in dora.py and then link that implementation in lora/layer.py.
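
To make sure I understand the math before touching the PEFT internals, here is a self-contained sketch of what a DoRA-style embedding could look like (not PEFT's actual code; the class name, attribute names, and the per-row norm choice are my assumptions):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DoraEmbeddingSketch(nn.Module):
    """Illustrative DoRA-style embedding: W' = m * (W + A @ B) / ||W + A @ B||,
    with the norm taken per embedding vector (row-wise)."""

    def __init__(self, base: nn.Embedding, r: int = 16, lora_alpha: int = 16):
        super().__init__()
        self.base = base  # frozen base embedding, weight shape (num_embeddings, embedding_dim)
        self.base.weight.requires_grad_(False)
        num_emb, emb_dim = base.weight.shape
        self.scaling = lora_alpha / r
        # low-rank update factors for the embedding weight
        self.lora_A = nn.Parameter(torch.randn(num_emb, r) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(r, emb_dim))
        # DoRA magnitude vector, initialized from the base weight's per-row norms
        self.magnitude = nn.Parameter(base.weight.norm(p=2, dim=1))

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        delta = (self.lora_A @ self.lora_B) * self.scaling     # (num_emb, emb_dim)
        directed = self.base.weight + delta                     # direction component
        weight_norm = directed.norm(p=2, dim=1, keepdim=True)   # per-token norm
        dora_weight = self.magnitude.unsqueeze(1) * directed / weight_norm
        return F.embedding(input_ids, dora_weight)
```

The real implementation would of course reuse the existing DoRA plumbing in dora.py and the embedding LoRA parameters already defined in lora/layer.py rather than a standalone module like this.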

ariG23498 avatar Aug 13 '24 14:08 ariG23498

Thanks a lot for offering to take this @ariG23498. Yes, you got it exactly right on how to add the layer. In case you need more context on why DoRA is implemented as it is (which looks more complicated than needed), you can check #1806, but this background info should not be needed to work on this.

BenjaminBossan avatar Aug 13 '24 15:08 BenjaminBossan