
How to convert a model (tf_model.h5) with a tokenizer folder to the ONNX format

Open · pradeepdev-1995 opened this issue 1 year ago · 7 comments

Feature request

I have trained a TensorFlow model using the Transformers library and saved the trained model and tokenizer in a folder named MODEL_WITH_TOKENIZER. The model is stored inside the folder in .h5 format as tf_model.h5. Here is the folder structure: [Screenshot from 2024-03-26 16-17-28: contents of the MODEL_WITH_TOKENIZER folder]

I want to convert the model to the .onnx format. Should I convert the entire MODEL_WITH_TOKENIZER folder or only the tf_model.h5 file, and what are the steps?

Motivation

Same as the feature request above.

Your contribution

Same as the feature request above.

pradeepdev-1995 avatar Mar 26 '24 10:03 pradeepdev-1995

@fxmarty any info?

pradeepdev-1995 avatar Mar 28 '24 04:03 pradeepdev-1995

@pradeepdev-1995 Thank you for the request. You can try pip install optimum[exporters-tf], and then

optimum-cli export onnx --model /path/to/your/model --framework tf onnx_output/

should (hopefully) work. You may need to specify the --task argument as well.
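For example, if the checkpoint was fine-tuned for sequence classification, the full command would look something like the line below (the text-classification task value is only an assumption about your model; replace it with the task you actually trained for):

optimum-cli export onnx --model /path/to/your/model --framework tf --task text-classification onnx_output/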

fxmarty avatar Mar 28 '24 09:03 fxmarty

@fxmarty I am using an Ubuntu machine, and after installing with pip install optimum[exporters-tf] I found two issues:

1 - Error: optimum-cli: command not found
2 - In the command

optimum-cli export onnx --model /path/to/your/model --framework tf onnx_output/

should the --model argument point to the tf_model.h5 file or to the MODEL_WITH_TOKENIZER folder (which contains all the model and tokenizer files shown in the screenshot)?

pradeepdev-1995 avatar Mar 28 '24 09:03 pradeepdev-1995

About 1., can you try pip uninstall optimum && pip install optimum[exporters-tf]? What do the install logs say?
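If the reinstall still leaves the command unresolved, a frequent cause is that the optimum-cli console script landed in a different environment or in a directory that is not on PATH. A quick way to check, assuming a standard pip setup, is something like:

pip uninstall -y optimum
pip install "optimum[exporters-tf]"   # quoting the extra avoids bracket expansion in some shells
which optimum-cli                     # should print the path of the installed console script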

About 2., I'll try to reproduce, thank you.

fxmarty avatar Mar 28 '24 09:03 fxmarty

@fxmarty I am using an Ubuntu machine with Python 3.8.16. pip install optimum[exporters-tf] does not install; it shows the following error: [Screenshot from 2024-03-28 16-07-34: pip error output]. So I installed via pip install optimum[all], which didn't raise any error and completed successfully. Must I use pip install optimum[exporters-tf] specifically?

pradeepdev-1995 avatar Mar 28 '24 10:03 pradeepdev-1995
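Whichever extra ends up installed, a quick check along these lines (a sketch, assuming the install itself succeeded) confirms that the CLI and its ONNX export subcommand are available before retrying the export:

pip show optimum                  # reports the installed version and location
optimum-cli export onnx --help    # lists the export options, including --model, --framework and --task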