onnxconverter-common
Resize op fails after conversion to FP16
I have a model exported with tensorflow2onnx; the FP32 model runs successfully.
I then converted it to FP16 with float16_converter.convert_float_to_float16(onnx_model, keep_io_types=True).
But the FP16 model fails to create a session with this error:
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from model_fp16.onnx failed:Node (Resize__846) Op (Resize) [ShapeInferenceError] Either sizes or scales must be provided, but not both of them
The problem is similar to #266.
How can I solve it?