
fix error of parameter initialization for LoRA embedding

Open hero007feng opened this issue 1 year ago • 1 comment

I found that the parameter initialization in reset_parameters() of the Embedding class in layers.py differs from the LoRA paper and other implementations. I initialized lora_A with nn.init.normal_() and lora_B with nn.init.zeros_(). Thanks.
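
A minimal sketch of the initialization the comment suggests, not the loralib implementation itself: lora_A gets a random Gaussian init and lora_B starts at zero, so the low-rank update is zero before training. The class name, rank r, and factor shapes below are illustrative assumptions.

```python
import torch
import torch.nn as nn


class LoRAEmbeddingSketch(nn.Module):
    """Hypothetical LoRA embedding used only to illustrate the suggested init."""

    def __init__(self, num_embeddings: int, embedding_dim: int, r: int = 8):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(num_embeddings, embedding_dim))
        # Low-rank factors; the (r, num_embeddings) / (embedding_dim, r) shapes
        # are an assumption for this sketch.
        self.lora_A = nn.Parameter(torch.empty(r, num_embeddings))
        self.lora_B = nn.Parameter(torch.empty(embedding_dim, r))
        self.reset_parameters()

    def reset_parameters(self):
        nn.init.normal_(self.weight)
        # Initialization proposed in the issue: A ~ N(0, 1), B = 0,
        # so the initial LoRA contribution lora_B @ lora_A is zero.
        nn.init.normal_(self.lora_A)
        nn.init.zeros_(self.lora_B)
```

Either factor being zero makes the initial update vanish; the suggestion here is simply to match the A-random, B-zero convention from the LoRA paper and the Linear layers.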

hero007feng · Mar 15 '23 08:03