Murphy_Juan
I have the same issue :(
> When doing finetuning, Lora-weight is loaded with peft without...

I think you need to add `is_trainable=True` when you load the PeftModel: `model = PeftModel.from_pretrained(model, './run_alpaca_lora/alpaca-lora-ckpt', torch_dtype=torch.float16, device_map={'': 0}, is_trainable=True)`
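For context, here is a minimal sketch of how that load fits into a resume-finetuning flow (the base-model checkpoint name is an assumption; substitute whatever base model your adapter was trained on):

```python
# Minimal sketch: resuming LoRA finetuning from a saved adapter checkpoint.
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Assumption: a LLaMA-style base model; use the base your adapter was trained on.
base_model = AutoModelForCausalLM.from_pretrained(
    "decapoda-research/llama-7b-hf",
    torch_dtype=torch.float16,
    device_map={"": 0},
)

# Without is_trainable=True, PeftModel.from_pretrained loads the adapter in
# inference mode and freezes the LoRA weights, so further training updates nothing.
model = PeftModel.from_pretrained(
    base_model,
    "./run_alpaca_lora/alpaca-lora-ckpt",
    torch_dtype=torch.float16,
    device_map={"": 0},
    is_trainable=True,
)

model.print_trainable_parameters()  # should report a non-zero trainable count
```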
I have the same problem too 😞.
I got the same issue.