Probably a bug in the LoRA Embedding class in loralib/layers.py
Is there a bug on that line, or is the comment just incorrect? https://github.com/microsoft/LoRA/blob/3f5c193f431c8a09448f0184f6f883ad393f22d0/loralib/layers.py#L59C27-L59C27
The field names seem to be swapped relative to the comment:
```python
# initialize A the same way as the default for nn.Linear and B to zero
nn.init.zeros_(self.lora_A)
nn.init.normal_(self.lora_B)
```
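For context: LoRA initializes one of the two low-rank matrices to zero so that the update ΔW = BA is zero at the start of training and the model begins exactly from the pretrained weights. Which of the two is zeroed doesn't affect that property, which is why this reads like a comment/code mismatch rather than a functional bug. Here is a minimal sketch demonstrating this (hypothetical shapes and names; it mirrors loralib's naming but is not its actual implementation):

```python
import torch
import torch.nn as nn

# Hypothetical dimensions: output dim d, input dim k, LoRA rank r.
d, k, r = 16, 32, 4

# Zero-init A and Gaussian-init B (the Embedding-class ordering quoted above).
lora_A = nn.Parameter(torch.zeros(r, k))
lora_B = nn.Parameter(torch.empty(d, r))
nn.init.normal_(lora_B)

# The weight update is the product B @ A; since A is all zeros,
# the delta is zero at initialization, just as it would be if B were zeroed.
delta_W = lora_B @ lora_A
assert torch.all(delta_W == 0)
```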
Nice catch! I noticed this too and came to check whether there were any issues about it. I'm fairly sure it's just a typo, but I'd wait for the authors to respond and clarify.
Ah, just stumbled upon this issue where the authors responded: https://github.com/microsoft/LoRA/issues/114