torchmetrics
Compatibility with the latest tokenizers package
🚀 Feature
It would be nice if torchmetrics were compatible with the latest tokenizers package. Currently, with the latest torchmetrics (0.10.0) and tokenizers (0.13.1) installed, I get the following error traceback (some information is omitted):
File ".../pytorch_lightning/callbacks/callback.py", line 25, in <module>
from pytorch_lightning.utilities.types import STEP_OUTPUT
......
File ".../torchmetrics/functional/text/helper_embedding_metric.py", line 26, in <module>
from transformers import AutoModelForMaskedLM, AutoTokenizer, PreTrainedModel, PreTrainedTokenizerBase
......
File ".../transformers/utils/versions.py", line 49, in _compare_versions
raise ImportError(
ImportError: tokenizers>=0.11.1,!=0.11.3,<0.13 is required for a normal functioning of this module, but found tokenizers==0.13.1.
Try: pip install transformers -U or pip install -e '.[dev]' if you're working with git main
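The error above comes from the version check that transformers runs at import time: the installed tokenizers release (0.13.1) falls outside the pin that this transformers release declares. As a minimal sketch (using the third-party `packaging` library, not transformers' internal `_compare_versions`), the check boils down to:

```python
# Sketch of the version check behind the ImportError above, using `packaging`.
# This is illustrative, not the actual transformers code.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# The pin quoted in the error message.
spec = SpecifierSet(">=0.11.1,!=0.11.3,<0.13")

# tokenizers 0.13.1 is outside the pin, so the import raises ImportError.
print(Version("0.13.1") in spec)  # False
# An in-range release would pass the check.
print(Version("0.12.1") in spec)  # True
```

Upgrading transformers (as the error message suggests) replaces the pin with one that admits the newer tokenizers release.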
Hi! Thanks for your contribution, great first issue!
Hi @function2-llx, thanks for reporting this issue. Could you please provide us with a reproducible piece of code? I used the package versions you mention, tried the import, and everything works fine for me 🤔
I also just tried and everything seems to be in order when I try to import
from transformers import AutoModelForMaskedLM, AutoTokenizer, PreTrainedModel, PreTrainedTokenizerBase
on v0.10.0 or master of torchmetrics.
Closing issue.
Hello,
I am sincerely sorry for my late reply; I haven't been on top of things in recent weeks.
After a closer look, I think this issue is caused by an incompatibility between https://github.com/huggingface/transformers and https://github.com/huggingface/tokenizers. I should have upgraded transformers first, or submitted an issue in one of the Hugging Face repositories. Now that I've upgraded the transformers package, the issue has disappeared.
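For anyone hitting the same error: a quick way to confirm whether the installed transformers and tokenizers releases match is to print both versions. This is just a diagnostic sketch, not part of torchmetrics:

```python
# Print the installed versions of both packages so a transformers/tokenizers
# mismatch is easy to spot before importing torchmetrics text metrics.
from importlib.metadata import PackageNotFoundError, version

for pkg in ("transformers", "tokenizers"):
    try:
        print(f"{pkg}=={version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg} is not installed")
```

If the printed tokenizers version falls outside the range transformers requires, upgrading transformers (e.g. `pip install -U transformers`) resolves it.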
Once again, I'm sorry for taking up your time. I'm really enjoying your work (including pytorch-lightning), and I hope I can be more helpful in the future.