LoRA

The pretrained layer's params are re-initialized

CuddleSabe opened this issue 1 year ago · 5 comments

Hi, I use the `from_pretrained` function to load the pretrained model, but I found that the linear layer's parameters are re-initialized when I simply replace `nn.Linear` with `lora.Linear`.
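A minimal sketch of the behavior being described, assuming loralib's `lora.Linear` and a single toy `nn.Linear` standing in for a layer of the real pretrained model; copying the old parameters into the new layer is one way to work around it:

```python
import torch
import torch.nn as nn
import loralib as lora

# Stand-in for one layer of a model loaded with from_pretrained().
pretrained = nn.Linear(16, 16)

# Replacing it with lora.Linear builds a brand-new module, so its
# .weight is freshly initialized and no longer matches the pretrained one.
lora_layer = lora.Linear(16, 16, r=4)
print(torch.allclose(lora_layer.weight, pretrained.weight))  # almost surely False

# Copying the pretrained parameters into the new layer restores them;
# only the low-rank lora_A / lora_B matrices keep their fresh init.
lora_layer.weight.data.copy_(pretrained.weight.data)
if pretrained.bias is not None:
    lora_layer.bias.data.copy_(pretrained.bias.data)
print(torch.allclose(lora_layer.weight, pretrained.weight))  # True
```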

CuddleSabe · Apr 03 '23