onnx-coreml
Using PixelShuffle operation causes error of computing minimum sequence length
🐞Describe the bug
As part of my model, I take an input, reshape it, apply PixelShuffle, and multiply the result by the original input. When I try to convert the ONNX model to Core ML I get the following error:
RuntimeWarning: You will not be able to run predict() on this Core ML model. Underlying exception message was: Error compiling model: "compiler error: Error in neural network compiler computing minimum sequence length for the model.". RuntimeWarning)
To make things even weirder, if I omit the final multiplication by the input, the conversion succeeds with no errors.
Trace
graph(%input : Float(1, 1, 28, 28)):
%1 : Tensor = onnx::Constant[value= 1 4 14 14 [ CPULongType{4} ]]()
%2 : Float(1, 4, 14, 14) = onnx::Reshape(%input, %1) # /Users/oavrahami/Dev/research-deep-matting/model_serialization/reproduce_serialization_bug.py:25:0
%3 : Tensor = onnx::Constant[value= -1 1 2 2 14 14 [ CPULongType{6} ]]()
%4 : Tensor = onnx::Reshape(%2, %3)
%5 : Tensor = onnx::Transpose[perm=[0, 1, 4, 2, 5, 3]](%4)
%6 : Tensor = onnx::Constant[value= -1 1 28 28 [ CPULongType{4} ]]()
%7 : Float(1, 1, 28, 28) = onnx::Reshape(%5, %6) # /Users/oavrahami/Dev/research-deep-matting/model_serialization/reproduce_serialization_bug.py:26:0
%output : Float(1, 1, 28, 28) = onnx::Mul(%input, %7) # /Users/oavrahami/Dev/research-deep-matting/model_serialization/reproduce_serialization_bug.py:29:0
return (%output)
1/7: Converting Node Type Reshape
2/7: Converting Node Type Reshape
3/7: Converting Node Type Transpose
4/7: Converting Node Type Reshape
5/7: Converting Node Type Transpose
6/7: Converting Node Type Reshape
7/7: Converting Node Type Mul
Translation to CoreML spec completed. Now compiling the CoreML model.
Model Compilation done.
/Users/user/project/venv/lib/python3.7/site-packages/coremltools/models/model.py:111: RuntimeWarning: You will not be able to run predict() on this Core ML model. Underlying exception message was: Error compiling model: "compiler error: Error in neural network compiler computing minimum sequence length for the model.".
RuntimeWarning)
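For context, the Reshape → Transpose → Reshape chain in the trace is just the standard decomposition of F.pixel_shuffle. Here is a quick sketch (shapes taken from the trace above) that reproduces the same chain by hand and checks it against F.pixel_shuffle:

import torch
import torch.nn.functional as F

# Mirror the exported graph: Reshape -> Transpose(perm=[0, 1, 4, 2, 5, 3]) -> Reshape
x = torch.rand(1, 4, 14, 14)
manual = (x.reshape(-1, 1, 2, 2, 14, 14)
           .permute(0, 1, 4, 2, 5, 3)
           .reshape(-1, 1, 28, 28))
assert torch.equal(manual, F.pixel_shuffle(x, upscale_factor=2))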
To Reproduce
I have created a small dummy network to show the bug:
import os
import torch
import torch.nn as nn
import torch.nn.functional as F
import onnx
import onnx_coreml


class MyModel(nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, input):
        """
        :param input: Input of shape [1, 1, 28, 28]
        """
        x = input.view(1, 1 * 4, 28 // 2, 28 // 2)
        x = F.pixel_shuffle(x, upscale_factor=2)

        # If we return x here instead, conversion succeeds with no error:
        # return x

        # Multiplying by the input triggers the 'minimum sequence length' error
        return input * x


if __name__ == '__main__':
    os.makedirs('output', exist_ok=True)

    # PyTorch to ONNX
    pytorch_model = MyModel()
    dummy_input = torch.rand(size=[1, 1, 28, 28])
    torch.onnx.export(model=pytorch_model,
                      args=dummy_input,
                      f='output/bug_model.onnx',
                      verbose=True,
                      input_names=['input'],
                      output_names=['output'],
                      opset_version=9)

    # ONNX to CoreML
    onnx_model = onnx.load_model('output/bug_model.onnx')
    mlmodel = onnx_coreml.convert(onnx_model)
    mlmodel.save('output/bug_model.mlmodel')
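As the warning states, predict() does not work on the converted model. A minimal check to confirm this (the input name and shape are taken from the export above):

import numpy as np
import coremltools

# Loading the saved model re-emits the RuntimeWarning; predict() then fails
# instead of returning a result, matching the warning above.
mlmodel = coremltools.models.MLModel('output/bug_model.mlmodel')
sample = {'input': np.random.rand(1, 1, 28, 28).astype(np.float32)}
print(mlmodel.predict(sample))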
System environment (please complete the following information):
- coremltools version: 3.3
- onnx-coreml version: 1.2
- OS: macOS
- macOS version: 10.14.6
- How you install python: virtualenv
- python version: 3.7
- torch version: 1.4.0