TF to ONNX export fails with CLI using example from docs
System Info
- `transformers` version: 4.21.1
- Platform: Linux-4.15.0-187-generic-x86_64-with-debian-buster-sid
- Python version: 3.7.5
- Huggingface_hub version: 0.8.1
- PyTorch version (GPU?): not installed (NA)
- Tensorflow version (GPU?): 2.7.0 (False)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using GPU in script?: No
- Using distributed or parallel set-up in script?: No
Who can help?
No response
Information
- [X] The official example scripts
- [ ] My own modified scripts
Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
Reproduction
- Save a TF transformers model (from the example at https://huggingface.co/docs/transformers/serialization):

```python
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

# Load tokenizer and TensorFlow weights from the Hub
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
tf_model = TFAutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased")

# Save to disk
tokenizer.save_pretrained("local-tf-checkpoint")
tf_model.save_pretrained("local-tf-checkpoint")
```

- Use the CLI to export to ONNX and observe the failure:

```shell
python -m transformers.onnx --model=local-tf-checkpoint onnx/
```
- Use `--framework` to export successfully:

```shell
python -m transformers.onnx --model=local-tf-checkpoint --framework=tf onnx/
```
Expected behavior
Once the model directory has been provided, the export should detect that a TF model is being used. There should be no dependency on PyTorch (PyTorch is not even installed in this environment). Instead, I get this error:

```
RuntimeError: Cannot export model to ONNX using PyTorch because no PyTorch package was found.
```

Either `transformers` should be updated, or the docs at https://huggingface.co/docs/transformers/serialization should be updated to say that `--framework=tf` is required for TensorFlow models.
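For illustration, the framework detection asked for above could be sketched roughly as follows, keying off the default weight filenames that `save_pretrained` writes (`tf_model.h5` for TensorFlow, `pytorch_model.bin` for PyTorch). This is a minimal sketch, not the actual `transformers.onnx` implementation, and the function name is hypothetical:

```python
from pathlib import Path


def detect_framework(checkpoint_dir: str) -> str:
    """Guess the framework of a local checkpoint from its weight files.

    Sketch only: relies on the default filenames that
    ``save_pretrained`` produces for each framework.
    """
    ckpt = Path(checkpoint_dir)
    if (ckpt / "tf_model.h5").exists():
        return "tf"  # TensorFlow weights present
    if (ckpt / "pytorch_model.bin").exists():
        return "pt"  # PyTorch weights present
    raise FileNotFoundError(
        f"No TensorFlow or PyTorch weights found in {checkpoint_dir}"
    )
```

With a check like this, the CLI could default to the detected framework and only require `--framework` when both (or neither) weight files are present.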
Hmmm that's interesting, indeed!
The docs should be updated, but it would be nice to also support this out of the box. Would you like to try your hand at a PR?
cc @lewtun @michaelbenayoun @JingyaHuang for knowledge
Sure, I can try making a PR for it! Will be doing so from my personal account, @rachthree.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Because PR https://github.com/huggingface/transformers/pull/18615 has been merged, I'm considering this closed.