LoRA
Fix parameter initialization error for LoRA Embedding
I found that the parameter initialization in reset_parameters() of the Embedding class in layers.py differs from the LoRA paper and from other implementations. In this change I initialize lora_A with nn.init.normal_() and lora_B with nn.init.zeros_(). Thanks.
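For illustration, here is a minimal sketch of what the proposed reset_parameters() could look like, assuming the Embedding class in loralib's layers.py keeps its lora_A and lora_B parameters and its existing method structure:

```python
import torch.nn as nn

# Sketch of the proposed reset_parameters() for the LoRA Embedding class
# (assumes self.lora_A and self.lora_B exist, as in loralib/layers.py).
def reset_parameters(self):
    nn.Embedding.reset_parameters(self)
    if hasattr(self, 'lora_A'):
        # Follow the LoRA paper: A is drawn from a Gaussian, B starts at zero,
        # so the low-rank update B @ A is zero at the start of training.
        nn.init.normal_(self.lora_A)
        nn.init.zeros_(self.lora_B)
```

Either way, as long as one of the two matrices is zero-initialized, the adapter contributes nothing at the first forward pass; this change just makes which matrix is which consistent with the paper's convention.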