onnx-tensorflow
How to convert an ONNX model to a SavedModel
I need to convert an ONNX model to a TensorFlow SavedModel. I use the example below, which runs fine, but the output is not a SavedModel:

```python
import onnx
from onnx_tf.backend import prepare

onnx_model = onnx.load(onnx_path)      # load the ONNX model
onnx.checker.check_model(onnx_model)   # validate the model
tf_rep = prepare(onnx_model)           # build the TensorFlow representation
tf_rep.export_graph(tf_path)           # export the model
```
Python version: 3.7, onnx version: 1.7, onnx_tf version: 1.5, tensorflow: 2.2
The graph exported this way is a frozen graph, so you would need to convert it to a SavedModel yourself. See if this one helps.
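For reference, one way to wrap a frozen graph into a SavedModel is to import the GraphDef into a session and export it with the TF 1.x compat APIs. This is only a sketch: the tensor names `input:0` / `output:0` and the paths are placeholders you would replace with your model's actual names.

```python
import tensorflow as tf

def frozen_graph_to_saved_model(frozen_graph_path, export_dir,
                                input_name="input:0", output_name="output:0"):
    # Read the serialized frozen GraphDef from disk.
    with tf.io.gfile.GFile(frozen_graph_path, "rb") as f:
        graph_def = tf.compat.v1.GraphDef()
        graph_def.ParseFromString(f.read())

    with tf.compat.v1.Session(graph=tf.Graph()) as sess:
        # Import the frozen graph into a fresh graph/session,
        # look up the I/O tensors, and export as a SavedModel.
        tf.import_graph_def(graph_def, name="")
        inp = sess.graph.get_tensor_by_name(input_name)
        out = sess.graph.get_tensor_by_name(output_name)
        tf.compat.v1.saved_model.simple_save(
            sess, export_dir, inputs={"input": inp}, outputs={"output": out})
```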
@HaoYang0123 We have updated our code base to produce a TF SavedModel. Both the API `export_graph` and the CLI `convert` will produce a SavedModel for you.
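For anyone landing here: the directory written by `export_graph` (the CLI form is along the lines of `onnx-tf convert -i model.onnx -o saved_model_dir`; check `onnx-tf convert --help` for the exact flags) can be loaded back with standard TensorFlow APIs to confirm it is a SavedModel. A quick sketch, where `saved_model_dir` is a placeholder path:

```python
import tensorflow as tf

# Load the directory written by tf_rep.export_graph(...) / the onnx-tf CLI
# and inspect its serving signatures.
loaded = tf.saved_model.load("saved_model_dir")
print(list(loaded.signatures.keys()))  # e.g. ["serving_default"]
```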
@winnietsang If you update this, you should document it.
@winnietsang And I didn't see this in https://github.com/onnx/onnx-tensorflow/blob/main/doc/API.md#onnx_tfbackend_reptensorflowrepexport_graph. Now I want to know how I can export a frozen graph: do I need to convert the SavedModel to a frozen graph using the TensorFlow API? Glad to have your reply.
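In case it helps, one way to get a frozen graph back out of a TF 2.x SavedModel is to freeze a concrete function yourself. This is a sketch, not onnx-tf functionality; the signature key and paths are placeholders:

```python
import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import convert_variables_to_constants_v2

# Load the SavedModel produced by onnx-tf (path is a placeholder).
loaded = tf.saved_model.load("saved_model_dir")
concrete_func = loaded.signatures["serving_default"]

# Fold the variables into constants, yielding a frozen graph,
# then write the GraphDef to disk as frozen_graph.pb.
frozen_func = convert_variables_to_constants_v2(concrete_func)
tf.io.write_graph(frozen_func.graph.as_graph_def(),
                  logdir=".", name="frozen_graph.pb", as_text=False)
```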