
Request for Guidance on Integrating LoRA with Custom Attention Implementation

Open · rotem154154 opened this issue on Feb 21, 2024 · 0 comments

Hello, I'm excited to experiment with your model, specifically by applying Low-Rank Adaptation (LoRA) or a similar method to fine-tune it with a minimal number of trainable parameters. My usual approach relies on tools such as peft's LoraConfig, which makes it straightforward to attach low-rank adapters to the nn.Linear layers of a model by targeting them by name.
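
For reference, here is a minimal sketch of my usual workflow on a standard ViT (the timm model name and the "qkv" target module are illustrative assumptions on my part, not anything specific to this repository):

```python
# Minimal sketch of my usual LoRA setup on a standard ViT, where the attention
# projections are nn.Linear modules. The model name and "qkv" target are
# assumptions based on timm's ViT implementation, not on this repository.
import timm
from peft import LoraConfig, get_peft_model

model = timm.create_model("vit_base_patch16_224", pretrained=True)

lora_config = LoraConfig(
    r=8,                     # low-rank dimension
    lora_alpha=16,           # scaling factor for the adapter output
    target_modules=["qkv"],  # matches the nn.Linear qkv projection in each block
    lora_dropout=0.05,
    bias="none",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the LoRA adapters remain trainable
```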

However, I've run into a snag with your custom attention implementation: it doesn't use standard nn.Linear layers the way conventional ViT architectures do, so it isn't clear to me how to integrate LoRA, since my experience with LoRA is limited to augmenting Linear layers.

Could you provide some insights or suggestions on how to adapt your model to support LoRA layers? Any guidance or examples would be greatly appreciated, as I'm eager to leverage LoRA's benefits with your innovative attention design.
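
To make the question concrete, below is a rough, untested sketch of the direction I was considering: since peft's LoRA can also wrap nn.Conv2d, perhaps the 1x1 convolution projections could be targeted by name. The stub module and its "qkv"/"proj" names are purely hypothetical stand-ins; I don't know the actual module names in your implementation:

```python
# Rough, untested sketch: peft's LoRA supports nn.Conv2d targets, so 1x1 conv
# projections could in principle receive adapters the same way Linear layers do.
# ConvAttentionStub and its "qkv"/"proj" names are hypothetical stand-ins, not
# the real EfficientViT modules.
import torch
import torch.nn as nn
from peft import LoraConfig, get_peft_model

class ConvAttentionStub(nn.Module):
    """Toy block whose projections are 1x1 convolutions instead of nn.Linear."""

    def __init__(self, dim: int = 64):
        super().__init__()
        self.qkv = nn.Conv2d(dim, 3 * dim, kernel_size=1, bias=False)
        self.proj = nn.Conv2d(dim, dim, kernel_size=1, bias=False)

    def forward(self, x):
        q, k, v = self.qkv(x).chunk(3, dim=1)
        return self.proj(q * k.sigmoid() + v)  # placeholder mixing, not real attention

model = nn.Sequential(ConvAttentionStub(), ConvAttentionStub())

lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["qkv", "proj"],  # matched by module-name suffix
    lora_dropout=0.05,
    bias="none",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# Sanity check that the adapted model still runs on a dummy input.
out = model(torch.randn(1, 64, 14, 14))
```

Would something along these lines be compatible with your attention design, or is there a better set of layers to target?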

Thank you for your time and consideration.

rotem154154 · Feb 21 '24 23:02