
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"

106 LoRA issues

Bumps [urllib3](https://github.com/urllib3/urllib3) from 1.26.5 to 1.26.18. Release notes Sourced from urllib3's releases. 1.26.18 Made body stripped from HTTP requests changing the request method to GET after HTTP 303 "See Other"...

dependencies

Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 3.0.2 to 3.1.37. Release notes Sourced from gitpython's releases. 3.1.37 - a proper fix CVE-2023-41040 What's Changed Improve Python version and OS compatibility, fixing deprecations by @​EliahKagan...

dependencies

Dear all, we are implementing a multi-LoRA framework to support fine-tuning LLMs that share the same base model on one GPU. We would be glad to work with the community to...
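A minimal sketch of the multi-adapter idea described above, in plain Python with no framework assumed: each request selects one adapter's `(A, B)` pair, and the output is the frozen base projection plus that adapter's low-rank update. All names here (`matvec`, `lora_forward`, `adapters`) are illustrative, not part of loralib.

```python
def matvec(M, x):
    # naive matrix-vector product; M is a list of rows
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

def lora_forward(x, W, adapters, adapter_id, scaling=1.0):
    # frozen base projection shared by every adapter: y = W x
    y = matvec(W, x)
    # low-rank update from the selected adapter: y += scaling * B (A x)
    A, B = adapters[adapter_id]
    delta = matvec(B, matvec(A, x))
    return [yi + scaling * di for yi, di in zip(y, delta)]

# two adapters sharing the same base weight W (here a 2x2 identity)
W = [[1.0, 0.0], [0.0, 1.0]]
adapters = {
    "task_a": ([[1.0, 0.0]], [[0.0], [1.0]]),   # rank-1: adds x[0] to y[1]
    "task_b": ([[0.0, 0.0]], [[0.0], [0.0]]),   # zero adapter: output == base
}
```

Switching `adapter_id` per request is what lets one GPU-resident base model serve several fine-tuned tasks; only the small `(A, B)` pairs differ between them.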

Bumps [pillow](https://github.com/python-pillow/Pillow) from 8.3.2 to 10.0.1. Release notes Sourced from pillow's releases. 10.0.1 https://pillow.readthedocs.io/en/stable/releasenotes/10.0.1.html Changes Updated libwebp to 1.3.2 #7395 [@​radarhere] Updated zlib to 1.3 #7344 [@​radarhere] 10.0.0 https://pillow.readthedocs.io/en/stable/releasenotes/10.0.0.html Changes...

dependencies

I'm confused about why `T(w)` is an inner function: https://github.com/microsoft/LoRA/blob/dc5d1744fa9430edda10bc233a9efc65e9239f50/loralib/layers.py#L128. `torch.jit.script` reports an error: `torch.jit.frontend.UnsupportedNodeError: function definitions aren't supported: def forward(self, x: torch.Tensor): def T(w): ...` and not self.merged:...
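For readers hitting the same error: TorchScript rejects nested function definitions, and one common workaround (an assumption on my part, not a fix loralib ships) is to hoist the `T(w)` transpose helper out of `forward` into a method. A dependency-free sketch of that restructuring:

```python
class LoRALinearSketch:
    """Hypothetical sketch: the transpose helper is a class method rather
    than a def nested inside forward(), which TorchScript cannot compile."""

    def __init__(self, fan_in_fan_out=False):
        self.fan_in_fan_out = fan_in_fan_out

    def _T(self, w):
        # stand-in for w.transpose(0, 1): transpose a list-of-lists
        return [list(col) for col in zip(*w)] if self.fan_in_fan_out else w

    def forward(self, w):
        return self._T(w)
```

The behavior is unchanged; only where the helper is defined moves, which is what a scripting compiler cares about here.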

Hi, I tried to replace our model's linear layer with `lora.Linear`. However, it seems that none of the components in this module can be used for fine-tuning:
```
---------------------------------------------------------------------------
RuntimeError...
```

Added code to the `mark_only_lora_as_trainable` method to set `requires_grad` to `True` for parameters with `lora_` in the name. This change was made for the following reason: sometimes `requires_grad` may be...
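A dependency-free sketch of the behavior this PR discusses (simplified from loralib's actual `mark_only_lora_as_trainable`, which also handles bias options): set the flag explicitly in both directions, so LoRA parameters that were previously frozen get re-enabled. The `Param` class is a stand-in for a torch parameter, not a real API.

```python
class Param:
    # stand-in for a named torch parameter with a requires_grad flag
    def __init__(self, name, requires_grad=True):
        self.name = name
        self.requires_grad = requires_grad

def mark_only_lora_as_trainable(params):
    # assign the flag both ways: base weights are frozen, and lora_
    # parameters are turned back on even if they arrived frozen
    for p in params:
        p.requires_grad = "lora_" in p.name

params = [Param("weight", True), Param("lora_A", False), Param("lora_B", False)]
mark_only_lora_as_trainable(params)
```

Without the explicit `True` branch, a `lora_A` that entered the function with `requires_grad=False` would stay frozen, which is the situation the PR describes.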

Is there a bug on that line, or is the comment incorrect? https://github.com/microsoft/LoRA/blob/3f5c193f431c8a09448f0184f6f883ad393f22d0/loralib/layers.py#L59C27-L59C27 The names of the fields are swapped:
```
# initialize A the same way as the default for nn.Linear...
```
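For context on the line in question: LoRA initializes one factor randomly and the other to zeros, so the update B·A is exactly zero at the start of training; which factor the comment names is the point of this issue. A small plain-Python illustration (`init_lora` and `matmul` are my names, not loralib's):

```python
import random

def init_lora(r, fan_in, fan_out):
    # A gets a random init (Kaiming-uniform in loralib; Gaussian here for
    # simplicity); B starts at zero, so the initial update B @ A is zero
    # and the adapted layer behaves exactly like the frozen base layer
    A = [[random.gauss(0.0, 1.0) for _ in range(fan_in)] for _ in range(r)]
    B = [[0.0] * r for _ in range(fan_out)]
    return A, B

def matmul(X, Y):
    # naive matrix product, used below to check that B @ A == 0
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)]
            for row in X]
```

Whichever field the code comment labels, the invariant to check is that the zero-initialized factor makes the initial product vanish.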

This is excellent code; I have used it for almost all my work and it seems to reduce my cost. However, I still have some questions about using it. If possible,...