What to use for `model_id` when loading from a model trained locally and not published?
Do we have to push the model to Hugging Face to be able to load a model trained locally?
Hello @philharmonikerzzy, that isn't necessary; just pass the path to the local folder.
Thank you @pacman100 for the quick answer!
When I completed the training, I also did not use save_pretrained() to save the LoRA weights separately; instead I used trainer.save_model(), which seems to have saved the entire model's weights, including the new LoRA weights, into a directory.
How should I load the PEFT model in this case? Or should I rerun the fine-tuning to make sure I explicitly call save_pretrained()?
You need not retrain; it's just that you could have saved a lot of storage space with save_pretrained(). You can load the whole model weights with from_pretrained() without any issues.
Could you provide an example of how to pass a path to PeftModel.from_pretrained()? When I do PeftModel.from_pretrained(model, [local-path]), I'm getting Repository Not Found for url:
Based on the implementation here, I don't see how I could load the whole model with the LoRA weights using the from_pretrained method.
https://github.com/huggingface/peft/blob/main/src/peft/peft_model.py#L148
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.