onnx-tensorflow
onnx_tf.backend.prepare for protobuf with External Data > 2GB
Describe the bug
onnx_tf.backend.prepare fails for model files that are smaller than 2GB on disk but reference external data that pushes the in-memory ModelProto beyond 2GB. The cause is https://github.com/onnx/onnx-tensorflow/blob/b13a282f6c404df04a2ae5aaa24ea5ec6d2d4e64/onnx_tf/backend.py#L67 calling https://github.com/onnx/onnx/blob/c24cd429c4ab47d2b057da8174788c39c60f8760/onnx/backend/base.py#L76, which runs onnx.checker.check_model on the loaded ModelProto. To check a model larger than 2GB one must instead pass a path.
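As a quick sanity check on the numbers, a minimal sketch (the limit is protobuf's hard serialization ceiling; the size is the one reported in the traceback later in this report):

```python
# Protobuf refuses to serialize any single message larger than 2**31 - 1
# bytes. Once onnx.load folds the external tensors into the ModelProto,
# the in-memory proto can exceed this even when the .onnx file is < 2GB.
PROTOBUF_MAX_BYTES = 2**31 - 1   # 2147483647 bytes (~2GB)
reported_size = 3071182436       # size from the ValueError in the traceback

print(reported_size > PROTOBUF_MAX_BYTES)  # True -> SerializeToString() raises
```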
To Reproduce
Open a valid ModelProto with onnx.load whose file size is < 2GB but which references external tensor data, then pass it to onnx_tf.backend.prepare. The documentation, onnx-tensorflow/doc/API.md, only shows this method of loading.
Python, ONNX, ONNX-TF, Tensorflow version
- Python version: 3.7.8
- ONNX version: 1.9.0
- ONNX-TF version: 1.8.0
- Tensorflow version: 2.5.0
backtrace

```
  File "./onnxTests.py", line 27, in onnxTfRes
    output = prepare(onnxModel).run(data)
  File "/home/aron/.local/lib/python3.7/site-packages/onnx_tf/backend.py", line 67, in prepare
    super(TensorflowBackend, cls).prepare(model, device, **kwargs)
  File "/home/aron/.local/lib/python3.7/site-packages/onnx/backend/base.py", line 76, in prepare
    onnx.checker.check_model(model)
  File "/home/aron/.local/lib/python3.7/site-packages/onnx/checker.py", line 101, in check_model
    protobuf_string = model.SerializeToString()
ValueError: Message onnx.ModelProto exceeds maximum protobuf size of 2GB: 3071182436
```
An ONNX model with external data is only supported through the command line, as described in https://github.com/onnx/onnx-tensorflow/blob/master/doc/CLI.md. Please use -e or --extdatadir with the onnx-tf command and see if that helps.
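For concreteness, a hedged sketch of that CLI invocation (the -e/--extdatadir flag is the one named above; the model, output, and external-data paths here are illustrative placeholders):

```shell
# Convert via the CLI, pointing -e at the directory holding the
# external tensor data referenced by the model.
onnx-tf convert -i model.onnx -o model_tf -e external_data_dir/
```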
I can confirm that the issue is still present when using the CLI with the extdatadir flag.
Smaller models export OK, but larger Transformer models do not seem to work