onnx-mxnet
MXNetError: Invalid Parameter format for shape expect Shape(tuple) but value='<Symbol identity1>', in operator Reshape(name="", shape="<Symbol identity1>")
I used the example from https://github.com/onnx/tutorials/blob/master/tutorials/PytorchCaffe2SuperResolution.ipynb to export super_resolution.onnx and run it in MXNet, and this error occurred. I ran the program with a GPU in a Jupyter notebook.
Error message (importing the model with MXNet)
MXNetErrorTraceback (most recent call last)
<ipython-input-2-330c0ff72e5d> in <module>()
1 import onnx_mxnet
2 from IPython.core.display import display
----> 3 sym, params = onnx_mxnet.import_model('super_resolution.onnx')
/usr/local/lib/python3.6/dist-packages/onnx_mxnet/__init__.py in import_model(model_file)
33 # loads model file and returns ONNX protobuf object
34 model_proto = onnx.load(model_file)
---> 35 sym, params = graph.from_onnx(model_proto.graph)
36 return sym, params
/usr/local/lib/python3.6/dist-packages/onnx_mxnet/import_onnx.py in from_onnx(self, graph)
139 op = self._fix_squeeze(inputs, mx_attr)
140 else:
--> 141 op = new_op(name=node_name, *inputs, **mx_attr)
142
143 node_output = self._fix_outputs(op_name, node.output)
/usr/local/lib/python3.6/dist-packages/mxnet/symbol/register.py in reshape(data, shape, reverse, target_shape, keep_highest, name, attr, out, **kwargs)
/usr/local/lib/python3.6/dist-packages/mxnet/_ctypes/symbol.py in _symbol_creator(handle, args, kwargs, keys, vals, name)
123 c_str_array(keys),
124 c_str_array([str(v) for v in vals]),
--> 125 ctypes.byref(sym_handle)))
126
127 if args and kwargs:
/usr/local/lib/python3.6/dist-packages/mxnet/base.py in check_call(ret)
144 """
145 if ret != 0:
--> 146 raise MXNetError(py_str(_LIB.MXGetLastError()))
147
148
MXNetError: Invalid Parameter format for shape expect Shape(tuple) but value='<Symbol identity1>', in operator Reshape(name="", shape="<Symbol identity1>")
This is my export model code:
import io
import numpy as np
from torch import nn
from torch.autograd import Variable
import torch.utils.model_zoo as model_zoo
import torch.onnx
import torch.nn as nn
import torch.nn.init as init
class SuperResolutionNet(nn.Module):
    def __init__(self, upscale_factor, inplace=False):
        super(SuperResolutionNet, self).__init__()
        self.relu = nn.ReLU(inplace=inplace)
        self.conv1 = nn.Conv2d(1, 64, (5, 5), (1, 1), (2, 2))
        self.conv2 = nn.Conv2d(64, 64, (3, 3), (1, 1), (1, 1))
        self.conv3 = nn.Conv2d(64, 32, (3, 3), (1, 1), (1, 1))
        self.conv4 = nn.Conv2d(32, upscale_factor ** 2, (3, 3), (1, 1), (1, 1))
        self.pixel_shuffle = nn.PixelShuffle(upscale_factor)
        self._initialize_weights()

    def forward(self, x):
        x = self.relu(self.conv1(x))
        x = self.relu(self.conv2(x))
        x = self.relu(self.conv3(x))
        x = self.pixel_shuffle(self.conv4(x))
        return x

    def _initialize_weights(self):
        init.orthogonal(self.conv1.weight, init.calculate_gain('relu'))
        init.orthogonal(self.conv2.weight, init.calculate_gain('relu'))
        init.orthogonal(self.conv3.weight, init.calculate_gain('relu'))
        init.orthogonal(self.conv4.weight)
torch_model = SuperResolutionNet(upscale_factor=3)
model_url = 'https://s3.amazonaws.com/pytorch/test_data/export/superres_epoch100-44c6958e.pth'
batch_size = 1 # just a random number
torch_model.load_state_dict(model_zoo.load_url(model_url))
torch_model.train(False)
x = Variable(torch.randn(batch_size, 1, 224, 224), requires_grad=True)
torch_out = torch.onnx._export(torch_model, x, "super_resolution.onnx", export_params=True)
This is my import model code:
import onnx_mxnet
from IPython.core.display import display
sym, params = onnx_mxnet.import_model('super_resolution.onnx')
My Python environment:
Python 3.6.3
apt-get install protobuf-compiler libprotoc-dev
pip install git+https://github.com/onnx/onnx.git@master
pip install Pillow
I tried installing onnx-mxnet both with
pip install onnx-mxnet
and with
git clone https://github.com/onnx/onnx-mxnet.git
cd onnx-mxnet
sudo python setup.py install
but both methods produced the same error.
Could you please advise how to solve this error? Thank you.
@spidyDev to take a look
ONNX changed the definition of the Reshape op in the latest master. PyTorch has already incorporated the change in its PyTorch/Caffe2 export, hence the problem. @anirudhacharya is already working on a solution, as the MXNet reshape operator doesn't match the new ONNX definition. There is a related issue on the MXNet GitHub: https://github.com/apache/incubator-mxnet/issues/10789
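To make the change concrete, here is a minimal sketch of the two node layouts built with onnx.helper; the tensor names and shape values are illustrative, not taken from the super_resolution model. Opset 1 carried the target shape as a node attribute, while opset 5 takes it as a second int64 input, which is what recent PyTorch exporters emit.

from onnx import helper, TensorProto

# Reshape as defined in opset 1: the target shape is a node *attribute*.
reshape_v1 = helper.make_node(
    'Reshape',
    inputs=['data'],
    outputs=['reshaped'],
    shape=[1, 1, 224, 224],    # attribute (illustrative values)
)

# Reshape as defined in opset 5: the target shape is a second *input*
# tensor of type int64, typically supplied as a Constant or initializer.
shape_tensor = helper.make_tensor(
    'shape', TensorProto.INT64, dims=[4], vals=[1, 1, 224, 224])
reshape_v5 = helper.make_node(
    'Reshape',
    inputs=['data', 'shape'],  # shape now arrives as an input, not an attribute
    outputs=['reshaped'],
)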
@spidyDev Is it possible for me to roll back ONNX to an older version to bypass this problem?
This is from ONNX:
ONNX_OPERATOR_SET_SCHEMA(
    Reshape,
    5,
    OpSchema()
        .SetDoc(Reshape_ver5_doc)
        .Input(0, "data", "An input tensor.", "T")
        .Input(1, "shape", "Specified shape for output.", "tensor(int64)")  // where is the problem?
        .Output(0, "reshaped", "Reshaped data.", "T")
@glingyan The issue should be resolved in the latest MXNet (please install using: pip install mxnet --pre).
The problem is that "shape" was previously an attribute of the Reshape operator (https://github.com/onnx/onnx/blob/master/docs/Changelog.md#Reshape-1), and MXNet always expects "shape" as an attribute, not an input symbol. This scenario is now handled in the latest MXNet (master).
Rolling back ONNX wouldn't help, because a PyTorch-converted ONNX model will have "shape" as an input, so it has to be handled in the import code. Please try with the latest and let us know if the issue persists.
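To illustrate the kind of handling described above, here is a hedged sketch of the general approach, assuming the shape input is a constant known at import time; convert_reshape, input_syms, and params are placeholder names for illustration, not the actual MXNet importer API.

import mxnet as mx

def convert_reshape(input_syms, params):
    """input_syms: [data_symbol, shape_symbol]; params: dict of name -> mx.nd.NDArray."""
    data_sym, shape_sym = input_syms
    if shape_sym.name in params:
        # The shape input is a stored constant: recover it as the tuple
        # that mx.sym.reshape expects as an attribute.
        target_shape = tuple(int(v) for v in params[shape_sym.name].asnumpy())
        return mx.sym.reshape(data_sym, shape=target_shape)
    # Otherwise the shape is produced by another operator at runtime,
    # which MXNet's symbolic reshape cannot take as an input.
    raise NotImplementedError("dynamic shape input is not handled in this sketch")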
@spidyDev I already use the latest MXNet:
Branches: master, remotes/origin/marcoabreu-patch-1, remotes/origin/master
Follows: utils
Precedes:
minor fixes to example/ssd (#11138)
This is the error I got:
File "/home/lingyan/python3_env/pytorch3/lib/python3.5/site-packages/mxnet-1.3.0-py3.5.egg/mxnet/contrib/onnx/_import/import_onnx.py", line 114, in from_onnx
mxnet_sym = self._convert_operator(node_name, op_name, onnx_attr, inputs)
File "/home/lingyan/python3_env/pytorch3/lib/python3.5/site-packages/mxnet-1.3.0-py3.5.egg/mxnet/contrib/onnx/_import/import_onnx.py", line 66, in _convert_operator
mxnet_sym = new_op(*inputs, **new_attrs)
File "
This is the ONNX output:
%311 : Float(512) = onnx::Norm[axis=1, keepdims=0, ord=2], scope: eNet
%312 : Dynamic = onnx::Constant[value= 512 1 1 [ CPULongTensor{3} ]], scope: eNet
%313 : Float(512, 1, 1) = onnx::Reshape(%311, %312), scope: eNet
Sorry, forgot to mention: as this repo is deprecated, all our fixes go into the import module inside MXNet. Please use http://mxnet.incubator.apache.org/versions/1.2.0/api/python/contrib/onnx.html for reference and usage.
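For reference, usage of the in-tree importer from that page looks roughly like the following ('model.onnx' is a placeholder file name):

import mxnet.contrib.onnx as onnx_mxnet

# Returns the symbol graph plus the argument and auxiliary parameter dicts.
sym, arg_params, aux_params = onnx_mxnet.import_model('model.onnx')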
@spidyDev Thanks, I already use the latest MXNet.
@glingyan Are you using the import module we added inside the latest MXNet, i.e. import mxnet.contrib.onnx as onnx_mxnet?
yes, my code is simple
"""Testing super_resolution model conversion""" from future import absolute_import as _abs from future import print_function from collections import namedtuple import logging import numpy as np
import mxnet as mx from mxnet.test_utils import download import mxnet.contrib.onnx as onnx_mxnet
set up logger
logging.basicConfig() LOGGER = logging.getLogger() LOGGER.setLevel(logging.INFO)
def import_onnx(): LOGGER.info("Converting onnx format to mxnet's symbol and params...") sym, arg_params, aux_params = onnx_mxnet.import_model('enet.onnx') LOGGER.info("Successfully Converted onnx format to mxnet's symbol and params...") return sym, arg_params, aux_params
if name == 'main': import_onnx()
Could you tell me which patch in MXNet is supposed to fix this? I will add a print to double check.
The reshape operator converter (https://github.com/apache/incubator-mxnet/blob/master/python/mxnet/contrib/onnx/_import/op_translations.py#L346) was expected to fix this, but we need to look into what the issue is.
Just checked my code; it seems it does not have this patch. Will update and report back to you.
Still have the problem:
inputs = {list} <class 'list'>: [<Symbol norm0>, <Symbol identity0>]
  0 = {Symbol} <Symbol norm0>
    handle = {c_void_p} c_void_p(220823152)
    name = {str} 'norm0'
  1 = {Symbol} <Symbol identity0>
    handle = {c_void_p} c_void_p(45194928)
    name = {str} 'identity0'
  __len__ = {int} 2
In reshape_shape = list(proto_obj._params[inputs[1].name].asnumpy()), inputs[1].name is identity0, but it should be a number.
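Not a confirmed fix, but one thing that may be worth trying while this is investigated: in the graph above the Reshape shape comes from an onnx::Constant node rather than a graph initializer, so the importer's _params lookup cannot see it. ONNX builds that still ship the onnx.optimizer module have an 'extract_constant_to_initializer' pass that moves such constants into the initializer list before import; a sketch, assuming your ONNX version provides that module and 'enet.onnx' as the model file:

import onnx
from onnx import optimizer

model = onnx.load('enet.onnx')
# Move Constant nodes (such as the one feeding Reshape's shape input)
# into graph.initializer, where the importer can look them up as params.
folded = optimizer.optimize(model, ['extract_constant_to_initializer'])
onnx.save(folded, 'enet_folded.onnx')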
Thanks @glingyan for the details. We are looking into the issue.
Has the problem already been fixed? Can anyone tell me how to solve it? Thank you.
@lxy5513 this repo is deprecated, functionality is now built into mxnet. Please track the issue in MXNet.