
minLoRA: a minimal PyTorch library that allows you to apply LoRA to any PyTorch model.
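
For orientation, here is a minimal sketch of applying the library to a model, based on the functions referenced in the issues below (the toy MLP is a hypothetical stand-in, not code from the repo):

```python
from torch import nn
from minlora import add_lora

# Any PyTorch model works; a toy MLP stands in for a real network here.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
add_lora(model)  # adds LoRA parametrizations to supported layers such as nn.Linear
```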

7 minLoRA issues

Minimum example

```python
import torch
import timm
from torch import nn
from minlora import add_lora, get_lora_params, get_lora_state_dict

model_timm = timm.create_model("vit_large_patch14_clip_336.openai", pretrained=True, num_classes=0, global_pool='avg')
add_lora(model_timm)
model_timm = nn.DataParallel(model_timm, device_ids=[0, 1]).cuda()
with torch.no_grad():
    ...
```
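The snippet imports `get_lora_state_dict` but the preview is truncated before it is used; a plausible continuation (an assumption, not the issue author's actual code, and the filename is hypothetical) would checkpoint only the LoRA weights:

```python
# Save just the LoRA parameters; .module unwraps the DataParallel wrapper.
lora_state = get_lora_state_dict(model_timm.module)
torch.save(lora_state, "vit_lora.pt")
```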

I noticed that you apply the mul operation to LoraA and LoraB, and then sum the result with the input. ![image](https://github.com/cccntu/minLoRA/assets/6534458/0e3e4e40-c861-4289-bffc-480ee85bf40a) I think the result of multiplying LoraA and LoraB...
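For readers following this thread, here is a minimal sketch of how a LoRA parametrization typically computes its update, using `torch.nn.utils.parametrize` as mentioned in the issue below. This is an illustrative reconstruction, not minLoRA's exact implementation; the class name, init scheme, and default rank/alpha are assumptions:

```python
import torch
from torch import nn
from torch.nn.utils import parametrize

class LoRAParametrization(nn.Module):
    """Illustrative sketch: the effective weight becomes W + (alpha/rank) * B @ A."""
    def __init__(self, fan_out, fan_in, rank=4, alpha=1.0):
        super().__init__()
        self.lora_A = nn.Parameter(torch.randn(rank, fan_in) * 0.01)  # small random init
        self.lora_B = nn.Parameter(torch.zeros(fan_out, rank))        # zero init: update starts at zero
        self.scaling = alpha / rank

    def forward(self, W):
        # The product of lora_B and lora_A is summed with the frozen weight W,
        # so the layer computes (W + s*BA) x = W x + s * B (A x): the low-rank
        # product is added to the weight, not to the layer's input.
        return W + self.scaling * (self.lora_B @ self.lora_A)

layer = nn.Linear(16, 32)  # weight shape: (32, 16) = (fan_out, fan_in)
parametrize.register_parametrization(layer, "weight", LoRAParametrization(32, 16))
```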

Hello, may I ask whether you have made any progress or have any ideas on accelerating the blip2 model? Would you be open to discussing it? Many thanks!

Hi, thank you for your great work. I want to use your library for my experiment. I gather that get_lora_params() provides the parameters to load into the optimizer, but I wonder whether the model itself can compute...
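As background for this question, a minimal sketch of pairing `get_lora_params` with an optimizer, under the assumption that it returns an iterable of the trainable LoRA parameters (the toy model is a hypothetical stand-in):

```python
import torch
from torch import nn
from minlora import add_lora, get_lora_params

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))  # hypothetical stand-in
add_lora(model)

# Pass only the LoRA parameters to the optimizer; the base weights stay frozen.
optimizer = torch.optim.AdamW(get_lora_params(model), lr=1e-4)
```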

Can this be used with FSDP? I haven't seen any examples of using `torch.nn.utils.parametrize` with FSDP.

Hey there, This looks like a cool project! How do we use this for nanoGPT? :)