onnx-tensorflow
TF Model batch output is invalid
ONNX model info:
model.graph.input: [name: "input_ids" type { tensor_type { elem_type: 6 shape { dim { dim_param: "batch_size" } dim { dim_value: 500 } } } } ]
model.graph.output: [name: "score_embed_outputs" type { tensor_type { elem_type: 1 shape { dim { dim_param: "batch_size" } dim { dim_value: 501 } } } } ]
TF model info:
The given SavedModel SignatureDef contains the following input(s):
  inputs['input_ids'] tensor_info:
      dtype: DT_INT32
      shape: (-1, 500)
      name: serving_default_input_ids:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['output_0'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, -1)
      name: StatefulPartitionedCall:0
Method name is: tensorflow/serving/predict
However, any input batch size other than 8 causes a runtime exception. With batch size 2 (< 8):
{ "error": "assertion failed: [Gather indices are out of bounds, please double check the indices and retry.] [Condition x == y did not hold element-wise:] [x (LogicalAnd_2110:0) = ] [0] [y (assert_equal_4225/y:0) = ] [1]\n\t [[{{node assert_equal_4225/Assert/AssertGuard/Assert}}]]" }
With batch size 9 (> 8):
{ "error": "Input to reshape is a tensor with 4008 values, but the requested shape requires a multiple of 9\n\t [[{{node onnx_tf_prefix_Reshape_77559}}]]" }
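The arithmetic behind the second error is telling: 4008 is exactly 8 × 501, which suggests a Reshape node in the converted graph has the export-time batch size 8 frozen in, regardless of the actual request batch. A minimal numpy sketch of the same failure (numpy stands in for the TF Reshape op; the numbers are taken from the error message):

```python
import numpy as np

# 4008 values is exactly 8 * 501 - an output buffer sized for the
# export-time batch of 8, no matter what batch is actually requested.
values = np.zeros(8 * 501)
print(values.size)  # 4008

# Asking for a leading dimension of 9 cannot divide 4008 evenly,
# which is the same failure the TF runtime reports.
try:
    values.reshape(9, -1)
except ValueError as err:
    print("reshape failed:", err)
```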
Why 8? Because of the following export code (torch -> onnx):
# model_inputs: [1, 500] -> [repeat, 500]
model_inputs = model_inputs.repeat(8, 1)
outputs = torch_model.forward(model_inputs)
torch.onnx.export(torch_model,
                  model_inputs,
                  f="adjust_output_opest_12.onnx",
                  input_names=['input_ids'],
                  output_names=['score_embed_outputs'],
                  dynamic_axes={'input_ids': {0: 'batch_size'},
                                'score_embed_outputs': {0: 'batch_size'}},
                  example_outputs=outputs,
                  opset_version=12)
If I repeat the input batch 8 times (the first line above), the TF model only accepts a batch size of 8. I have tried other repeat counts like 3/4/5, and hit the same issue: only that exact batch size works.
Model files: onnx model, tf model
Python, ONNX, ONNX-TF, Tensorflow version
Python version: 3.6.12 |Anaconda, Inc.| (default, Sep 8 2020, 23:10:56) [GCC 7.3.0]
ONNX version: 1.9.0
ONNX-TF version: 1.8.0
Tensorflow version: 2.4.1
Additional context
Originally, my model's forward method was:
def forward(self, inputs):
    ......
    # outputs: [[1,501], ..., [1,501]] -> [batch, 501]
    outputs = torch.cat(outputs)
    return outputs
But after converting the Torch model to an ONNX model and then to a TF model, the TF model's output shape was fixed at [repeat, 501] (I want [-1, 501]). So now the forward method is:
def forward(self, inputs):
    ......
    # outputs: [[1,501], ..., [1,501]] -> [batch, 501]
    outputs = torch.cat(outputs)
    batch_size = outputs.size(0)
    # return outputs
    return torch.reshape(outputs, [batch_size, -1])
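The reshape itself is likely what pins the batch size: under `torch.onnx.export` tracing, the Python int returned by `.size(0)` is recorded as the constant 8, so the graph's Reshape target becomes [8, -1]. One way to keep the batch dimension dynamic is to pass -1 for the batch axis instead. A hedged sketch; the real per-row computation behind "......" is unknown, so a zero-producing stand-in body is used, and only the final reshape matters:

```python
import torch
import torch.nn as nn

class ScoreEmbed(nn.Module):
    """Hypothetical stand-in for the real model: builds a list of
    [1, 501] rows and concatenates them, like the forward above."""

    def forward(self, inputs):
        # ...... real per-row computation goes here (zeros as stand-in)
        rows = [torch.zeros(1, 501) for _ in range(inputs.size(0))]
        outputs = torch.cat(rows)          # [batch, 501]
        # Use -1 for the batch dimension instead of a Python int taken
        # from .size(0): a traced int is frozen into the ONNX graph as
        # a constant, which is what pins the TF model to batch size 8.
        return outputs.reshape(-1, 501)

model = ScoreEmbed()
print(model(torch.zeros(2, 500)).shape)   # torch.Size([2, 501])
print(model(torch.zeros(9, 500)).shape)   # torch.Size([9, 501])
```

Since the row width 501 is static in this model, hard-coding it while leaving the batch axis as -1 keeps the exported Reshape independent of the tracing batch.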
Model's test input:
{"input_ids":[[101,7555,4680,1399,4917,8038,6935,6962,2356,3719,2399,1277,712,1814,1277,2356,3124,1825,4794,6392,3177,2456,6392,2339,4923,121,119,12893,9411,9137,9373,9559,8717,9131,8839,8020,924,7397,2595,857,2791,6981,1947,7433,3717,5052,5381,8021,3173,100,6662,8020,978,2434,6125,118,100,3777,1298,6662,8021,7433,3717,5052,6887,2339,4923,102,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0]]}}