serve
Update Huggingface_Transformers examples for safetensors
📚 The doc issue
In examples/Huggingface_Transformers/README.md, the torch-model-archiver commands pass --serialized-file Transformer_model/pytorch_model.bin. However, the Download_Transformer_models.py script in the same directory now saves the downloaded model in the safetensors format:
```
$ python Download_Transformer_models.py
...
Save model and tokenizer/ Torchscript model based on the setting from setup_config bert-base-uncased in directory ./Transformer_model
$ ls -1 Transformer_model/
config.json
model.safetensors
special_tokens_map.json
tokenizer_config.json
tokenizer.json
vocab.txt
```
Suggest a potential alternative/fix
Perhaps update the examples to accommodate both formats.
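One way to accommodate both formats would be to detect which checkpoint file the download script actually produced and pass that path to --serialized-file. A minimal sketch (the helper name `pick_serialized_file` is hypothetical, not part of the example scripts):

```python
from pathlib import Path

# Hypothetical helper: return whichever checkpoint file exists in the
# model directory, preferring the newer safetensors format over the
# legacy pickle-based .bin format.
def pick_serialized_file(model_dir: str) -> str:
    for name in ("model.safetensors", "pytorch_model.bin"):
        candidate = Path(model_dir) / name
        if candidate.exists():
            return str(candidate)
    raise FileNotFoundError(f"no model checkpoint found in {model_dir}")
```

The returned path could then be interpolated into the README's archiver command in place of the hard-coded Transformer_model/pytorch_model.bin.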
Makes sense, @HamidShojanazeri @mreso is one of you interested in this?
How did you manage to solve this? I still don't understand how torch-model-archiver works with safetensors.