TypeError: 'ModelProto' object is not callable
I am currently working on model conversion and deployment. After exporting the model to ONNX, I want to run inference with it. I know that I can use onnxruntime for inference, but when I load the model with onnx and try to run it, I get an error. The following is the code and the error message:

model_onnx = onnx.load(model_file_name + '.onnx')
onnx.checker.check_model(model_onnx)
output2 = model_onnx(dummy_input)
Traceback (most recent call last):
  File "/Users/xxx/PycharmProjects/c3_sinet/main_convert_postopmodel_to_onnx.py", line 27, in
TypeError: 'ModelProto' object is not callable
So I'm wondering whether the onnx package itself can be used to load a model and run inference.
To run inference, you need an inference engine such as onnxruntime (https://github.com/microsoft/onnxruntime).
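For reference, here is a minimal sketch of running the exported model with onnxruntime instead of calling the ModelProto directly. The file name model.onnx, the (1, 3, 224, 224) input shape, and the float32 dtype are placeholders, not taken from the original post; substitute whatever your model actually expects.

```python
import numpy as np
import onnxruntime as ort

# Load the exported ONNX file into an inference session.
session = ort.InferenceSession("model.onnx")

# Look up the name of the model's first input so we can feed it by name.
input_name = session.get_inputs()[0].name

# Placeholder input; match your model's real shape and dtype.
dummy_input = np.random.randn(1, 3, 224, 224).astype(np.float32)

# run() returns a list with one array per model output.
outputs = session.run(None, {input_name: dummy_input})
print(outputs[0].shape)
```

The onnx package itself (onnx.load, onnx.checker.check_model) only loads and validates the graph; the ModelProto it returns is a protobuf object, not a callable function, which is why calling it raises the TypeError above.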
Do I understand correctly that onnx can only be used as an intermediate representation for model conversion, and cannot run inference by itself? @wenbingl