onnx-mxnet

Is it possible to load an onnx model with a non-Python API?

Open · newling opened this issue on Mar 13, 2018 · 5 comments

In particular I'd like to load an .onnx file in C++. Thanks.

newling · Mar 13 '18

@newling Currently we do not have a C++ API to import ONNX models into MXNet.

rajanksin · Mar 13 '18

@newling Can you describe your use case a bit? We may be able to help.

lupesko · Mar 15 '18

I would like to load ONNX models and run them (on CPU), mostly for testing: I would like to compare their outputs against those of my C++ implementations. Not urgent.

newling · Mar 15 '18
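For reference, a minimal sketch of producing such reference outputs with the Python API (the only supported route at the time). It assumes MXNet ≥ 1.3 with the `mxnet.contrib.onnx` importer; the file name `model.onnx`, the input name `data`, and the shape `(1, 3, 224, 224)` are illustrative assumptions, not from this thread:

```python
import numpy as np
import mxnet as mx
from mxnet.contrib import onnx as onnx_mxnet

# Import the ONNX graph into an MXNet symbol plus parameter dicts.
# "model.onnx" is a placeholder path.
sym, arg_params, aux_params = onnx_mxnet.import_model("model.onnx")

# Bind the symbol on CPU with a fixed input shape so a forward pass can be run.
# The input name "data" and the shape are assumptions about the model.
mod = mx.mod.Module(symbol=sym, data_names=["data"], label_names=None,
                    context=mx.cpu())
mod.bind(for_training=False, data_shapes=[("data", (1, 3, 224, 224))])
mod.set_params(arg_params, aux_params, allow_missing=True)

# Feed a deterministic input and print the output, to be used as a
# reference value when checking another (e.g. C++) implementation.
x = np.random.RandomState(0).uniform(size=(1, 3, 224, 224)).astype("float32")
mod.forward(mx.io.DataBatch([mx.nd.array(x)]), is_train=False)
print(mod.get_outputs()[0].asnumpy())
```

The printed array can then be compared element-wise against the output of the C++ implementation on the same fixed input.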

Does anyone know how to do that?

jiankang1991 · Nov 21 '18

In MXNet, at least, loading ONNX models is supported only via the Python API; I am not sure about other frameworks. However, you can write a simple script that converts the ONNX model to an MXNet model, and then load the MXNet model with the other MXNet language APIs such as Scala or C++ (a Java API is coming soon). You can also use MXNet Model Server, which supports ONNX out of the box (no conversion needed).

lupesko · Nov 22 '18
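A minimal sketch of the convert-then-load workflow described in the comment above, assuming MXNet ≥ 1.3 where the ONNX importer lives under `mxnet.contrib.onnx`; the file names are illustrative:

```python
import mxnet as mx
from mxnet.contrib import onnx as onnx_mxnet

# Import the ONNX file once (Python-only step).
sym, arg_params, aux_params = onnx_mxnet.import_model("model.onnx")

# Save it as a regular MXNet checkpoint: this writes
# converted-symbol.json and converted-0000.params.
mx.model.save_checkpoint("converted", 0, sym, arg_params, aux_params)
```

The resulting `converted-symbol.json` / `converted-0000.params` pair is an ordinary MXNet checkpoint, which the non-Python bindings can then load (e.g. `Symbol::Load` and `NDArray::Load` in the C++ cpp-package, or the equivalent Scala APIs).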