Murphy_Juan

4 comments by Murphy_Juan

I have the same issue :(

I think you need to pass `is_trainable=True` when you load the checkpoint with `PeftModel`; otherwise the LoRA weights are loaded frozen (inference mode):

```python
model = PeftModel.from_pretrained(
    model,
    './run_alpaca_lora/alpaca-lora-ckpt',
    torch_dtype=torch.float16,
    device_map={'': 0},
    is_trainable=True,
)
```

> When doing finetuning, Lora-weight is loaded with peft without...
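In peft, `PeftModel.from_pretrained` defaults to `is_trainable=False`, which freezes the loaded adapter weights, so further finetuning silently does nothing to them. A toy sketch in plain Python (no peft or torch dependency; all names here are illustrative, not part of the peft API) of why a frozen load cannot be finetuned:

```python
# Toy illustration: why a checkpoint loaded for *inference*
# cannot be finetuned further. Not real peft code.

class ToyAdapter:
    def __init__(self, weight, is_trainable):
        self.weight = weight
        # Mirrors requires_grad: frozen weights are skipped by the optimizer.
        self.is_trainable = is_trainable

def sgd_step(adapter, grad, lr=0.1):
    """Update the weight only if the adapter was loaded as trainable."""
    if adapter.is_trainable:
        adapter.weight -= lr * grad

frozen = ToyAdapter(weight=1.0, is_trainable=False)    # default-style load
trainable = ToyAdapter(weight=1.0, is_trainable=True)  # is_trainable=True

sgd_step(frozen, grad=1.0)
sgd_step(trainable, grad=1.0)

print(frozen.weight)     # unchanged: no finetuning happens
print(trainable.weight)  # updated: the adapter actually learns
```

The same split shows up in real training: with the default load, the loss may move (the base model's trainable parts) while the LoRA adapter itself never updates.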

I got the same issue.