
Allow the entire model to be targeted for LoRA and DoRA fine-tuning: LoRA and DoRA embeddings with a small DoRALinear bug fix

Open zaithottakath opened this issue 1 year ago • 0 comments

  • Added LoRAEmbedding and DoRAEmbedding, with tests, so that embedding layers can be targeted for fine-tuning.
  • Added the ability to target all Linear and Embedding modules, regardless of whether they are in model.layers. This lets both the embeddings and the lm_head be targeted, enabling a nearly full LoRA or DoRA fine-tune of the model.
  • Fixed a bug in DoRALinear where self.m was set to the wrong value because it was not recalculated when the Linear layer is replaced in DoRALinear.from_linear.
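The DoRALinear bug above can be illustrated with a minimal NumPy sketch (this is not the mlx-examples code; names and shapes are illustrative). DoRA stores a magnitude vector m, one entry per output feature, computed from the weight matrix. If m is computed before the Linear layer's weight is swapped out in from_linear (e.g. after dequantization), it goes stale:

```python
import numpy as np

def row_magnitudes(w):
    # DoRA's m is the per-output-feature L2 norm of the weight matrix,
    # i.e. the norm of each row of a (d_out, d_in) weight.
    return np.linalg.norm(w, axis=1)

rng = np.random.default_rng(0)
w_original = rng.normal(size=(4, 8))
# Stand-in for the weight being replaced later inside from_linear:
w_replaced = w_original + 0.1

m_stale = row_magnitudes(w_original)  # computed too early: wrong self.m
m_fixed = row_magnitudes(w_replaced)  # recomputed after the weight changes

assert m_fixed.shape == (4,)
assert not np.allclose(m_stale, m_fixed)
```

The fix amounts to recomputing m from the final weight, exactly as the last line above does.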

I checked Hugging Face's PEFT library for how they handle DoRA for embeddings, and there is still an open ticket for it. I wasn't able to find any reference implementations, so this could be the first example of one.
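For context, the LoRA side of an adapted embedding lookup can be sketched as follows. This is a hedged illustration, not the mlx-examples implementation; the names (lora_a, lora_b, scale) and the zero-initialization convention are assumptions. The idea is to gather the low-rank factors by token id, just like the base embedding table:

```python
import numpy as np

vocab, dims, rank = 10, 6, 2
rng = np.random.default_rng(1)

weight = rng.normal(size=(vocab, dims))  # frozen base embedding table
lora_a = np.zeros((vocab, rank))         # zero-init so training starts at the base model
lora_b = rng.normal(size=(rank, dims))
scale = 2.0

def lora_embed(token_ids):
    # Base lookup plus a low-rank update, both gathered per token id.
    return weight[token_ids] + scale * (lora_a[token_ids] @ lora_b)

tokens = np.array([1, 3, 3])
out = lora_embed(tokens)
assert out.shape == (3, dims)
# With lora_a zero-initialized, the adapted lookup equals the base lookup:
assert np.allclose(out, weight[tokens])
```

A DoRA variant would additionally carry a learned magnitude and renormalize the combined weight, analogous to DoRALinear.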

zaithottakath avatar Jul 25 '24 22:07 zaithottakath