transformers
Allow adding a new token as the unk token for the GPT-2 tokenizer
What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/22414
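For context, the intended behavior can be sketched with a toy, dict-based tokenizer (a hypothetical model for illustration, not the actual transformers implementation): when the requested unk token is not already in the vocabulary, it gets added as a new token instead of failing, and out-of-vocabulary tokens then map to its id.

```python
# Toy sketch of the intended behavior (hypothetical, not the real
# transformers code): a new token can be registered and used as the
# unk token, so unknown tokens map to its id.

class ToyTokenizer:
    def __init__(self, vocab, unk_token):
        self.vocab = dict(vocab)
        # The behavior this PR targets: if the unk token is not in the
        # vocab yet, add it as a new token rather than erroring out.
        if unk_token not in self.vocab:
            self.vocab[unk_token] = len(self.vocab)
        self.unk_token = unk_token

    def convert_token_to_id(self, token):
        # Unknown tokens fall back to the id of the unk token.
        return self.vocab.get(token, self.vocab[self.unk_token])


tok = ToyTokenizer({"hello": 0, "world": 1}, unk_token="<custom_unk>")
print(tok.convert_token_to_id("hello"))         # known token -> 0
print(tok.convert_token_to_id("not-in-vocab"))  # falls back to <custom_unk>'s id
```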
Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the contributor guideline, Pull Request section?
- [x] Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
- [ ] Did you write any new necessary tests?
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. @ArthurZucker
I think we should add some tests to clarify what behavior is modified and how.
It could be just for those 4 tokenizers, but the effect of this PR is still not entirely clear from reading it alone.
Also, let's make sure that the tests are all green!
What do you need me to do to help get this merged?
Hey! As mentioned in my last comments, the CI tests need to be all green 😉 Running `make fixup` should help you.
I remember testing a few models and hitting some issues with this update to token addition; I'll have to check once the PR is ready.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.