[PEFT] Add warning for missing key in LoRA adapter
What does this PR do?
When loading a LoRA adapter, a warning was previously only emitted when the checkpoint contained unexpected keys. Now a warning is also emitted when keys are missing.
This change is consistent with https://github.com/huggingface/peft/pull/2118 in PEFT and the planned PR https://github.com/huggingface/diffusers/pull/9622 in diffusers.
Apart from this change, the error message for unexpected keys was slightly reworded for consistency (it should be more readable now). Also, besides adding a test for the missing keys warning, a test for the unexpected keys warning was added, as there was none so far.
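The core of the change can be sketched as a set comparison between the keys the model's LoRA layers expect and the keys found in the checkpoint. This is a hypothetical standalone helper, not the actual transformers implementation; the function name and warning wording are illustrative:

```python
import logging

logger = logging.getLogger(__name__)


def check_adapter_keys(expected_keys, loaded_keys):
    """Warn about missing and unexpected keys when loading a LoRA adapter.

    expected_keys: parameter names the model's adapter layers expect.
    loaded_keys: parameter names found in the adapter checkpoint.
    Returns the sorted lists of missing and unexpected keys.
    """
    missing = sorted(set(expected_keys) - set(loaded_keys))
    unexpected = sorted(set(loaded_keys) - set(expected_keys))
    if missing:
        logger.warning(
            "Loading adapter weights led to missing keys in the model: "
            + ", ".join(missing)
        )
    if unexpected:
        logger.warning(
            "Loading adapter weights led to unexpected keys in the model: "
            + ", ".join(unexpected)
        )
    return missing, unexpected
```

A caller would pass the model's adapter parameter names and the checkpoint's keys; both warnings can fire independently, matching the behavior this PR adds.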
Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the contributor guideline, Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the forum? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
- [x] Did you write any new necessary tests?
@ArthurZucker Could you please review or suggest a reviewer?
Ping @ArthurZucker
@SunMarc can you take a look as well if you have some bandwidth?
Thanks for the review.
Thanks, I am wondering why we don't init the model with the adapter, load with transformers, and then just do the matching we usually have in transformers, but that's more for me!
I'm not sure what you mean there, but if there is something I can look into, LMK.
Is there anything missing to merge the PR? If not, as I don't have rights, could you merge please :bow:
Don't worry it's for me ! 🤗