
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"

106 LoRA issues, sorted by recently updated

Bumps [tornado](https://github.com/tornadoweb/tornado) from 6.0.4 to 6.3.3. Changelog sourced from tornado's changelog; the release notes cover v6.3.3 back through v5.1.0...

Label: dependencies

Hi! I am trying to use LoRA for my convolution layers: `self.conv = Conv2d(1, 16, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)`. I used the LoRA counterpart of `nn.Conv2d` as `lora.Conv2d(n_chans_in,`...
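For reference, a minimal sketch of that swap, assuming a recent loralib where `lora.Conv2d` wraps `nn.Conv2d`; note that it expects an integer `kernel_size` (it builds the square kernel internally) and takes the LoRA rank `r` and scaling `lora_alpha` as extra arguments. The hyperparameter values here are illustrative:

```python
import torch
import torch.nn as nn
import loralib as lora

# Original layer from the issue:
#   nn.Conv2d(1, 16, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
# loralib's Conv2d builds the (k, k) kernel internally, so kernel_size is an int.
conv = lora.Conv2d(
    1, 16, kernel_size=3,
    stride=1, padding=1, bias=False,
    r=4, lora_alpha=8,          # illustrative LoRA rank and scaling
)

model = nn.Sequential(conv, nn.ReLU())
lora.mark_only_lora_as_trainable(model)  # freeze everything except lora_A / lora_B

x = torch.randn(2, 1, 28, 28)
print(model(x).shape)  # torch.Size([2, 16, 28, 28]) -- behaves like the original conv
```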

Hi, I've been trying to apply LoRA to the VITS model (hence the pull request for the Conv1d). It turns out that using LoRA only for the text-encoder transformer isn't enough,...

I am training LLaMA 13B on 8× RTX 3090s with LoRA. The model can run forward and backward passes, but when I get the model's state dict, the GPU runs out of memory.
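One thing worth noting for checkpointing: loralib's `lora_state_dict` keeps only the LoRA parameters, which shrinks the saved file dramatically, though it still calls `model.state_dict()` internally, so it does not by itself avoid the gather that can OOM under sharded training. A sketch of the pattern from the loralib README; the model and file name below are placeholders:

```python
import torch
import torch.nn as nn
import loralib as lora

model = nn.Sequential(lora.Linear(768, 768, r=8))  # stand-in for the real model

# Save only the LoRA parameters (a few MB instead of the full model).
# Caveat: this still materializes model.state_dict() first, so under
# FSDP/DeepSpeed sharding the full parameters are still gathered; saving on
# rank 0 only, or offloading the gathered tensors to CPU, is the usual fix.
torch.save(lora.lora_state_dict(model), 'lora_ckpt.pt')

# To restore, load the base weights first, then the LoRA weights; both loads
# use strict=False since each checkpoint covers only part of the model.
model.load_state_dict(torch.load('lora_ckpt.pt'), strict=False)
```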

Can LoRA be run on an M1 Pro MacBook? 14 GB GPU.

Thanks for your nice work. I am trying to replicate the results on WebNLG, but the final checkpoint is at only 11270 steps instead of 20000. This results in a significant...

Hi, I use the `from_pretrained` function to load the pretrained model, but I found that the linear parameters are re-initialized when I simply replace `nn.Linear` with `lora.Linear`.
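That re-initialization happens because `lora.Linear` runs its own `reset_parameters()` when constructed, so swapping the module discards the loaded weights. The usual fix is to copy the pretrained tensors over after the swap (or to load the state dict again with `strict=False`). A hedged sketch; `replace_linear_with_lora` is a hypothetical helper, not part of loralib:

```python
import torch.nn as nn
import loralib as lora

def replace_linear_with_lora(module: nn.Module, r: int = 8) -> None:
    """Recursively swap every nn.Linear for lora.Linear, copying the
    pretrained weights over so they survive lora.Linear's re-init."""
    for name, child in module.named_children():
        if isinstance(child, nn.Linear):
            new = lora.Linear(child.in_features, child.out_features,
                              r=r, bias=child.bias is not None)
            new.weight.data.copy_(child.weight.data)   # keep pretrained weight
            if child.bias is not None:
                new.bias.data.copy_(child.bias.data)   # keep pretrained bias
            setattr(module, name, new)
        else:
            replace_linear_with_lora(child, r)
```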

I understand why we need MergedLinear, but is there a simple example of how the forward pass works for a MergedLinear? Specifically this line: https://github.com/microsoft/LoRA/blob/main/loralib/layers.py#L248. I'm struggling to understand...
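For what it's worth, a de-vectorized sketch of what that line computes may help: `MergedLinear` packs (say) the q, k, v projections into one weight and applies LoRA only to the slices where `enable_lora` is `True`. The shapes below are illustrative; the real code fuses the per-slice matmuls into a single grouped `F.conv1d` and then zero-pads the result into the enabled output positions:

```python
import torch
import torch.nn.functional as F

# Toy shapes: one weight holds the q, k, v projections; LoRA only on q and v.
d, r = 16, 4
enable_lora = [True, False, True]          # q, k, v
x = torch.randn(2, 10, d)                  # (batch, seq, d)

# In MergedLinear, lora_A is (r * num_enabled, d) and lora_B holds one
# (d, r) block per enabled slice, stacked for the grouped conv1d.
lora_A = torch.randn(2 * r, d)             # two enabled slices (q and v)
lora_B = torch.randn(2 * d, r)

after_A = F.linear(x, lora_A)              # (2, 10, 2*r): x @ A^T
# The grouped conv1d on the referenced line is equivalent to applying each
# (d, r) block of lora_B to its own r-sized chunk of after_A:
delta_q = after_A[..., :r] @ lora_B[:d].T  # update for the q slice
delta_v = after_A[..., r:] @ lora_B[d:].T  # update for the v slice

# zero_pad() then scatters these into the 3*d output at the q and v
# positions, leaving the k slice (enable_lora=False) untouched.
delta = torch.zeros(2, 10, 3 * d)
delta[..., :d] = delta_q
delta[..., 2 * d:] = delta_v
```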

Are there any TF implementations available that you are aware of? Also, do you see any specific limitations in converting this repo to TF?
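I'm not aware of an official TF port being linked from this repo, but the core idea translates directly. A minimal hypothetical sketch of a LoRA-style dense layer in `tf.keras` (`LoRADense` is illustrative, not an existing API):

```python
import tensorflow as tf

class LoRADense(tf.keras.layers.Layer):
    """Illustrative LoRA-augmented dense layer, not an existing API."""

    def __init__(self, units, r=8, lora_alpha=16, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        self.r = r
        self.scaling = lora_alpha / r

    def build(self, input_shape):
        d = int(input_shape[-1])
        # Frozen base weight; in practice you would copy this from the
        # pretrained model rather than leaving it randomly initialized.
        self.w = self.add_weight(name='w', shape=(d, self.units), trainable=False)
        # As in the paper: A gets a He/Kaiming init, B starts at zero,
        # so the LoRA update is zero at the start of training.
        self.lora_a = self.add_weight(name='lora_a', shape=(d, self.r),
                                      initializer=tf.keras.initializers.HeUniform(),
                                      trainable=True)
        self.lora_b = self.add_weight(name='lora_b', shape=(self.r, self.units),
                                      initializer='zeros', trainable=True)

    def call(self, x):
        return x @ self.w + (x @ self.lora_a @ self.lora_b) * self.scaling

y = LoRADense(64, r=4)(tf.random.normal([2, 32]))  # shape (2, 64)
```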

Dear Edward, thanks for your contribution to the community, but I couldn't reproduce your experiments using the scripts in LoRA/examples/NLG. I feel down and don't know...