Conversion onnx-tf-openvino does not work
Dear all,
I have a working .onnx model (opset 11) and I want to convert it into a .pb model. I used the following command for the conversion:
onnx-tf convert -i retinanet1.onnx -o .
This command produces an output, but when I use it for the conversion in OpenVINO (version 2020.1), it produces the following error:
[ FRAMEWORK ERROR ] Cannot load input model: TensorFlow cannot read the model file:
"/Users/massimilianodatres/Documents/catchme_mmdet/ONNX_models/retinanet/saved_model.pb" is incorrect TensorFlow model file.
The file should contain one of the following TensorFlow graphs:
- frozen graph in text or binary format
- inference graph for freezing with checkpoint (--input_checkpoint) in text or binary format
- meta graph
It seems that the frozen graph contained in the .pb file is not correct. My environment during conversion is the following:
tensorflow 1.14.0, onnx 1.8.1, onnx-tf 1.7.0
I hope that the problem is well explained. Can you help me?
PS: I know that I can convert the .onnx model directly to OpenVINO, but unfortunately RetinaNet is not supported in OpenVINO version 2020.1, while fizyr/retinanet is. I do not use fizyr/retinanet but the mmdetection one (which is in PyTorch), so I thought to convert it to .onnx and then to .pb. I cannot update the OpenVINO version.
Thanks in advance, mdatres
The current onnx-tf doesn't produce a frozen graph anymore. Instead, it creates a SavedModel that includes a .pb file. However, that .pb file is meant to be consumed by the TensorFlow SavedModel API and therefore cannot be read by OpenVINO.
Thanks for the reply. I have an ONNX opset 12 model. Is there a release that allows me to convert it into a TensorFlow 1.14 frozen .pb graph?
Best regards, mdatres
I think it would be useful to add a parameter that allows exporting the model as a frozen graph, where possible.
Agree. Hopefully someone can look into that option soon. @mdatres You could make a source build using the "tf-1.x" branch. I believe it exports into a frozen graph. I just don't know for sure if it works for tf 1.14.
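A source build from that branch would look roughly like the following (a sketch; the branch name comes from the comment above, everything else is the standard pip editable-install workflow):

```shell
# Hedged sketch: install onnx-tensorflow from the "tf-1.x" branch
git clone --branch tf-1.x https://github.com/onnx/onnx-tensorflow.git
cd onnx-tensorflow
pip install -e .

# then retry the conversion with the same CLI as before
onnx-tf convert -i retinanet1.onnx -o .
```

Whether the resulting .pb is a frozen graph that OpenVINO 2020.1 accepts would still need to be verified against tf 1.14, as noted above.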
I would also be very interested in the ability to export a frozen graph. Installing the current version of https://github.com/onnx/onnx-tensorflow/tree/tf-1.x with tensorflow 1.15 and trying to convert an opset 9 model results in the error: onnx.onnx_cpp2py_export.defs.SchemaError: No schema registered for 'BitShift'!
Is this the case for everyone, or is someone able to export a frozen .pb at the moment?
I got the same error. I guess it is due to the version of onnx. See https://github.com/onnx/onnx-tensorflow/issues/865#issuecomment-782592458