LoRA
The pretrained layer's parameters get re-initialized
Hi, I use the `from_pretrained` function to load the pretrained model, but I found that the linear layer's parameters get re-initialized when I simply replace `nn.Linear` with `lora.Linear`.
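Roughly what I'm doing, as a minimal sketch (assuming the `loralib` package from microsoft/LoRA and a Hugging Face checkpoint; the model name and layer path are just illustrative), plus the workaround of copying the pretrained weights over:

```python
import loralib as lora
from transformers import AutoModel  # illustrative; any from_pretrained model

model = AutoModel.from_pretrained("bert-base-uncased")
old = model.encoder.layer[0].attention.self.query  # example nn.Linear target

# lora.Linear subclasses nn.Linear, so constructing it runs
# reset_parameters() and the new weight starts from a fresh random init,
# which is presumably why the pretrained values appear to be lost.
new = lora.Linear(old.in_features, old.out_features, r=8)

# Workaround: copy the pretrained parameters into the new layer
# before swapping it in (reloading the state dict after the swap
# should work as well).
new.weight.data.copy_(old.weight.data)
if old.bias is not None:
    new.bias.data.copy_(old.bias.data)
model.encoder.layer[0].attention.self.query = new
```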