Bogeum Kim

35 comments by Bogeum Kim

Hello 👋 I am interested in this project so I want to participate!

@lewtun okay I see :) Is there a deadline or do I just have to tell you when I'm done?

I'm interested in this PR, so I was wondering if anyone is currently working on this.

I'm having the same problem; any idea how to fix it? Transformers and FlagEmbedding versions: `transformers==4.51.3`, `flagembedding==1.3.4`.

@BenjaminBossan Hi, is this still in progress? It looks like the work might have stopped midway, so if this is still a valid issue, I'd like to continue working on...

Okay. That sounds like an interesting task. I'll start by writing and testing some example code, using the exact same model as the user who first reported this issue.

I ran the code in Colab and encountered two different scenarios. **The first scenario** occurred when I ran the original user's code exactly as they provided it: I got an...

Thanks for your quick reply. So all I need to do is add KaSA to `peft/tuners/lora/variants.py` using the code from the [existing implementation of KaSA](https://github.com/juyongjiang/KaSA/blob/f85e88c22d0fa4cb8ab2923d7c2bf1bbec152da3/peft/src/peft/tuners/lora/layer.py#L130) and refer to #2443 you mentioned...
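As background for the discussion above: KaSA, like other LoRA variants, builds on the low-rank update idea, where a frozen weight `W` is augmented by a trainable product `B @ A` scaled by `alpha / r`. This is not the PEFT `variants.py` API; it is a minimal plain-NumPy sketch of the underlying low-rank math, with all names (`forward`, `alpha`, `scaling`) chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, r = 8, 6, 2          # output dim, input dim, adapter rank

W = rng.normal(size=(d, k))        # frozen base weight
A = rng.normal(size=(r, k)) * 0.01 # trainable down-projection
B = np.zeros((d, r))               # trainable up-projection, zero-initialized
alpha = 16
scaling = alpha / r

def forward(x):
    # Effective weight: base plus scaled low-rank update
    return x @ (W + scaling * (B @ A)).T

x = rng.normal(size=(3, k))
# With B zero-initialized, the adapted layer reproduces the base layer exactly
assert np.allclose(forward(x), x @ W.T)
```

Variants such as KaSA modify how this update is parameterized (KaSA works with an SVD-based, knowledge-aware decomposition), but the adapter-on-top-of-frozen-weights structure is the same.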