onnx-coreml
Error converting mosaic.onnx to CoreML: upsample layer / scales
This is my first ever GitHub post, so apologies if there are any inaccuracies in it.
Hi, I am following tutorials to train a model for fast neural style transfer using PyTorch and to convert it to CoreML.
I have got as far as making the mosaic.onnx file, but I keep getting an error after following these official instructions:
```shell
pip install coremltools==3.0b3
pip install onnx-coreml==1.0b2
```

```python
from onnx_coreml import convert
ml_model = convert(model='mosaic.onnx', disable_coreml_rank5_mapping=True)
```
I get the following error halfway through the conversion:
```
63/94: Converting Node Type Concat
64/94: Converting Node Type Upsample
Traceback (most recent call last):
  File "
```
I have tried the official mosaic.onnx file found in the ONNX model repo, but I still get the same issue. I have tried converting another file, mobilenetv2-1.0.onnx, and that completes fine.
Is this an issue with style transfer models?
System Information
I used PyTorch 1.1.0 to make the model.
Please help! Thanks
@saddif this is a known issue arising from the PyTorch -> ONNX conversion, which produces a pattern in which the scales for Upsample are not statically known; CoreML, at this point, does not support a dynamic scale for the Upsample layer.
But we have a workaround: specify the scales manually. Please follow the instructions in this comment https://github.com/onnx/onnx-coreml/issues/453#issuecomment-525540974
I cannot export from PyTorch to ONNX using the upsample operator with torch version 1.3.1 and onnx version 1.5.0.
MINIMUM CODE TO REPRODUCE:
```python
import torch
import torch.nn as nn
import torch.nn.functional as F
import onnx

print(torch.__version__)
print(onnx.__version__)

class TestModel(nn.Module):
    def __init__(self):
        super(TestModel, self).__init__()

    def forward(self, x):
        x = F.interpolate(x, scale_factor=2, mode='nearest')
        return x

torch_model = TestModel()
dummy_input = torch.randn(1, 3, 256, 256)
torch_out = torch.onnx.export(torch_model, dummy_input, 'model.onnx', verbose=True, opset_version=11)

onnx_model = onnx.load('model.onnx')
print(onnx_model)
onnx.checker.check_model(onnx_model)
```
Produces the following error:
```
Traceback (most recent call last):
  File "/Users/glennjocher/PycharmProjects/iD/upsample_error_reproduce.py", line 26, in <module>
    onnx.checker.check_model(onnx_model)
  File "/Users/glennjocher/.conda/envs/yolov3/lib/python3.7/site-packages/onnx/checker.py", line 86, in check_model
    C.check_model(model.SerializeToString())
onnx.onnx_cpp2py_export.checker.ValidationError: Node () has input size 4 not in range [min=2, max=2].
==> Context: Bad node spec: input: "input" input: "23" input: "23" input: "22" output: "24" op_type: "Resize" attribute { name: "coordinate_transformation_mode" s: "asymmetric" type: STRING } attribute { name: "cubic_coeff_a" f: -0.75 type: FLOAT } attribute { name: "mode" s: "nearest" type: STRING } attribute { name: "nearest_mode" s: "floor" type: STRING }
```
@saddif I'm not sure about your case, but I was able to overcome this error by putting the model through https://github.com/daquexian/onnx-simplifier. Hope it helps!