
Resize op fails to convert to FP16

Open nistarlwc opened this issue 1 year ago • 4 comments

I have a model exported with tensorflow2onnx; the FP32 model runs successfully.

Then I use float16_converter.convert_float_to_float16(onnx_model, keep_io_types=True) to convert it to FP16. But the FP16 model fails to create a session with the error: onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from model_fp16.onnx failed:Node (Resize__846) Op (Resize) [ShapeInferenceError] Either sizes or scales must be provided, but not both of them

The problem is similar to #266.
How can it be solved?

nistarlwc · Feb 19 '24