
LoRA weights A and B are initialized the wrong way.

Open · arun477 opened this issue 5 months ago · 0 comments


    def reset_parameters(self):
        nn.Embedding.reset_parameters(self)
        if hasattr(self, 'lora_A'):
            # initialize A the same way as the default for nn.Linear and B to zero
            # lora_A should be normal and lora_B should be zeros
            nn.init.zeros_(self.lora_A)
            nn.init.normal_(self.lora_B)

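For comparison, here is a minimal, self-contained sketch of the initialization that the comment above describes (the `LoRALinear` class is hypothetical and written only for illustration; it is not the repository's code): `lora_A` gets the same random init as an `nn.Linear` weight and `lora_B` is zeroed, so the low-rank update `lora_B @ lora_A` is exactly zero right after initialization and the adapted layer starts out identical to the frozen base layer.

    # Hypothetical illustration (not the repository's code) of the conventional
    # LoRA initialization for a Linear adapter: A random, B zero.
    import math
    import torch
    import torch.nn as nn

    class LoRALinear(nn.Linear):
        def __init__(self, in_features, out_features, r=8, **kwargs):
            super().__init__(in_features, out_features, **kwargs)
            # Low-rank factors; the weight update is lora_B @ lora_A,
            # with shape (out_features, in_features).
            self.lora_A = nn.Parameter(torch.empty(r, in_features))
            self.lora_B = nn.Parameter(torch.empty(out_features, r))
            self.reset_lora_parameters()

        def reset_lora_parameters(self):
            # lora_A: same default init as nn.Linear's weight; lora_B: zeros.
            nn.init.kaiming_uniform_(self.lora_A, a=math.sqrt(5))
            nn.init.zeros_(self.lora_B)

    layer = LoRALinear(16, 32, r=4)
    # The low-rank update is exactly zero at initialization, so the adapted
    # layer initially computes the same output as the base nn.Linear.
    assert torch.all(layer.lora_B @ layer.lora_A == 0)

Whichever factor carries the random init, the key invariant is that the product `lora_B @ lora_A` is zero right after `reset_parameters`, so attaching the adapter does not perturb the pretrained model before training; the report above is that the Embedding adapter's comment and code disagree about how that is achieved.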
arun477 · Jan 27 '24 15:01