onnxconverter-common
[Error] Load fp16
I successfully converted my model to fp16:
```python
import onnx
from onnxconverter_common import float16

model = onnx.load("tts_quantized.onnx")
model_fp16 = float16.convert_float_to_float16(model)
onnx.save(model_fp16, "tts_quantized_fp16.onnx")
```
But when I load the fp16 model, I get this error:
```
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "/home/tupk/anaconda3/envs/dl/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 397, in _create_inference_session
sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from tts_quantized_fp16.onnx failed:Type Error: Type (tensor(float16)) of output arg (/RandomNormalLike_output_0) of node (/RandomNormalLike) does not match expected type (tensor(float)).
```
Here is my fp16 model.
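My current guess at the cause: RandomNormalLike derives its output type from its `dtype` attribute, which still says float32 after conversion, while the converter rewrote the surrounding value infos to float16, so the declared types no longer match. A possible workaround sketch, not yet verified on this model, and assuming the `op_block_list` parameter and the `DEFAULT_OP_BLOCK_LIST` list exported by `onnxconverter_common.float16`: keep RandomNormalLike in float32 so the converter inserts Cast nodes around it instead of changing its output type.

```python
import onnx
from onnxconverter_common import float16

model = onnx.load("tts_quantized.onnx")

# Keep RandomNormalLike in float32 so its dtype attribute stays consistent
# with its output type; the converter inserts Cast nodes around blocked ops,
# so the rest of the graph still runs in fp16.
model_fp16 = float16.convert_float_to_float16(
    model,
    op_block_list=float16.DEFAULT_OP_BLOCK_LIST + ["RandomNormalLike"],
)
onnx.save(model_fp16, "tts_quantized_fp16.onnx")
```

If RandomNormalLike is acceptable in float32, this keeps the failing node out of the fp16 region while converting everything else.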