
FEAT Add sine-LoRA #2434

Open yipingji opened this issue 8 months ago • 12 comments

Hi,

I just implemented sine LoRA in variants.py and changed "resolve_lora_variant" a bit in layers.py.

yipingji avatar Mar 27 '25 06:03 yipingji

@githubnemo Hi, I am still a bit confused about how to add module.sinelora_frequency = kwargs['sinelora_frequency'] and sinelora_scaling. It raises a KeyError when I put module.sinelora_frequency = kwargs['sinelora_frequency'] in class SineLoraLinearVariant(LoraVariant).

yipingji avatar Mar 31 '25 14:03 yipingji

Hi @yipingji

Sorry for being unclear. I tried highlighting the issue in https://github.com/huggingface/peft/pull/2457#discussion_r2020957043. The kwargs come from the update_layer calls, and the update_layer calls get their parameters from the layer __init__ calls. As long as those don't have sinelora_frequency and sinelora_scaling, the parameters will be missing from the kwargs in SineLora*Variant.init as well.

So you have to update the following functions with the new parameters, the same way you did when adding the use_sinelora parameter:

  • LoraLayer.update_layer
  • Linear.__init__
  • Linear.update_layer
  • Embedding.__init__
  • Embedding.update_layer

For example, the change for LoraLayer and Linear could look like this:

diff --git a/src/peft/tuners/lora/layer.py b/src/peft/tuners/lora/layer.py
index cb09608..efd8d6a 100644
--- a/src/peft/tuners/lora/layer.py
+++ b/src/peft/tuners/lora/layer.py
@@ -183,6 +183,8 @@ class LoraLayer(BaseTunerLayer):
         use_rslora,
         use_dora: bool = False,
         use_sinelora: bool = False,
+        sinelora_frequency: float = 200.0,
+        sinelora_scaling: Optional[float] = None,
         lora_bias: bool = False,
     ):
         # collect the kwargs
@@ -574,6 +576,8 @@ class Linear(nn.Module, LoraLayer):
         use_dora: bool = False,
         lora_bias: bool = False,
         use_sinelora: bool = False,
+        sinelora_frequency: float = 200.0,
+        sinelora_scaling: Optional[float] = None,
         **kwargs,
     ) -> None:
         super().__init__()
@@ -591,6 +595,8 @@ class Linear(nn.Module, LoraLayer):
             use_dora=use_dora,
             lora_bias=lora_bias,
             use_sinelora=use_sinelora,
+            sinelora_frequency=sinelora_frequency,
+            sinelora_scaling=sinelora_scaling,
         )
         self.is_target_conv_1d_layer = is_target_conv_1d_layer
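
Once those parameters are threaded through, they will actually arrive in the variant's kwargs. Here is a minimal sketch of the consumer side; the exact init signature and import location are my approximation of the LoraVariant interface, not necessarily what your PR already has:

# Sketch only: assumes LoraVariant is the base class from layer.py and that
# init receives the kwargs collected by update_layer.
from peft.tuners.lora.layer import LoraVariant

class SineLoraLinearVariant(LoraVariant):
    @staticmethod
    def init(module, adapter_name: str, **kwargs) -> None:
        # These keys only exist once LoraLayer.update_layer and Linear.__init__
        # forward them; before that change, plain indexing raises exactly the
        # KeyError you observed.
        module.sinelora_frequency = kwargs["sinelora_frequency"]
        module.sinelora_scaling = kwargs["sinelora_scaling"]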

I hope that helps!

To test the Embedding layer implementation you could add

("Embedding + transformers Conv1D 2 LoRA + SineLoRA", "EmbConv1D", LoraConfig, {"target_modules": ["emb"], "use_sinelora": True}),

to the custom models test case you already added.
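
For that test to exercise the Embedding path, the resolve_lora_variant hook you changed in layer.py also needs a sine-LoRA branch on Embedding. A rough sketch, assuming it follows the same pattern the DoRA variant uses; the class name SineLoraEmbeddingVariant is an assumption for illustration:

# Inside the Embedding class in layer.py (sketch; hook signature mirrors the
# existing DoRA resolution, which is an assumption on my part):
from typing import Optional

def resolve_lora_variant(self, *, use_sinelora: bool = False, **kwargs) -> Optional[LoraVariant]:
    # Return None to fall back to plain LoRA when the flag is off.
    if not use_sinelora:
        return None
    from .variants import SineLoraEmbeddingVariant
    return SineLoraEmbeddingVariant()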

githubnemo avatar Apr 01 '25 12:04 githubnemo

Hi @githubnemo, apologies for the previous update; it is my first time submitting a PR to such a large codebase. I was just wondering whether my current version is correct.

yipingji avatar Apr 01 '25 13:04 yipingji

@yipingji gentle ping :)

githubnemo avatar Apr 15 '25 16:04 githubnemo

@yipingji gentle ping :)

Sorry for the late update. I have been busy with a conference recently and will update the PR soon ;)

yipingji avatar Apr 17 '25 04:04 yipingji

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

github-actions[bot] avatar May 11 '25 15:05 github-actions[bot]

Hi @githubnemo ,

How can I reopen the PR?

yipingji avatar May 21 '25 03:05 yipingji

@yipingji I've reopened the PR and added the WIP tag so stale bot will not bother us.

githubnemo avatar May 21 '25 22:05 githubnemo

@githubnemo I just updated the PR, please have a look ;)

yipingji avatar May 22 '25 04:05 yipingji

gentle ping @githubnemo ;)

yipingji avatar May 28 '25 11:05 yipingji

Hey @yipingji, do you still plan on implementing this further?

githubnemo avatar Jul 30 '25 14:07 githubnemo

Yes, I plan to implement this further, but I have been quite busy with my project recently; I will update the PR next month :)

yipingji avatar Jul 30 '25 14:07 yipingji