
PyTorch Tensor Slicing Assignment Bug

Open • lytcherino opened this issue 2 years ago • 3 comments

Hi,

I have two bugs related to tensor slice assignment, both in the forward method of a torch module that uses rank-4 tensors:

  1. Slicing the input tensor and assigning a scalar to the slice.
  2. Slicing the input tensor and assigning a new tensor to the slice.

Trace (case 1, assigning a scalar)

Traceback (most recent call last):
  File "/home/guests/<user>/", line 22, in <module>
    mlmodel = coremltools.converters.convert(
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/_converters_entry.py", line 326, in convert
    mlmodel = mil_convert(
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/converter.py", line 182, in mil_convert
    return _mil_convert(model, convert_from, convert_to, ConverterRegistry, MLModel, compute_units, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/converter.py", line 209, in _mil_convert
    proto, mil_program = mil_convert_to_proto(
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/converter.py", line 300, in mil_convert_to_proto
    prog = frontend_converter(model, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/converter.py", line 104, in __call__
    return load(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/frontend/torch/load.py", line 50, in load
    return _perform_torch_convert(converter, debug)
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/frontend/torch/load.py", line 87, in _perform_torch_convert
    prog = converter.convert()
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/frontend/torch/converter.py", line 239, in convert
    convert_nodes(self.context, self.graph)
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/frontend/torch/ops.py", line 76, in convert_nodes
    add_op(context, node)
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/frontend/torch/ops.py", line 2689, in _internal_tensor_value_assign
    updated_x = mb.torch_tensor_assign(
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/mil/ops/registry.py", line 63, in add_op
    return cls._add_op(op_cls, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/mil/builder.py", line 191, in _add_op
    new_op.type_value_inference()
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/mil/operation.py", line 240, in type_value_inference
    output_types = self.type_inference()
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/frontend/torch/dialect_ops.py", line 220, in type_inference
    raise ValueError("The updates tensor should have shape {}. Got {}".format(expected_updates_shape, self.updates.shape))
ValueError: The updates tensor should have shape (1, 12, 256, 256). Got (1, 12, 128, 256)
Trace (case 2, assigning a tensor)

Traceback (most recent call last):
  File "/home/guests/<user>", line 25, in <module>
    mlmodel = coremltools.converters.convert(
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/_converters_entry.py", line 326, in convert
    mlmodel = mil_convert(
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/converter.py", line 182, in mil_convert
    return _mil_convert(model, convert_from, convert_to, ConverterRegistry, MLModel, compute_units, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/converter.py", line 209, in _mil_convert
    proto, mil_program = mil_convert_to_proto(
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/converter.py", line 300, in mil_convert_to_proto
    prog = frontend_converter(model, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/converter.py", line 104, in __call__
    return load(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/frontend/torch/load.py", line 50, in load
    return _perform_torch_convert(converter, debug)
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/frontend/torch/load.py", line 87, in _perform_torch_convert
    prog = converter.convert()
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/frontend/torch/converter.py", line 239, in convert
    convert_nodes(self.context, self.graph)
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/frontend/torch/ops.py", line 76, in convert_nodes
    add_op(context, node)
  File "/usr/local/lib/python3.9/site-packages/coremltools/converters/mil/frontend/torch/ops.py", line 3441, in zeros
    dtype = inputs[1].val
AttributeError: 'NoneType' object has no attribute 'val'

To Reproduce

Case 1: assigning a scalar to a strided slice

import torch
import coremltools

class Model(torch.nn.Module):

    def __init__(self):
        super().__init__()

    def forward(self, input):
        input[:,:,0::2,:] = 1
        input[:,:,1::2,:] = 2

        return input

if __name__ == "__main__":

    model = Model()
    input = torch.randn((1,12,256,256))

    torchscript_model = torch.jit.script(model)

    mlmodel = coremltools.converters.convert(
        torchscript_model,
        inputs=[coremltools.TensorType(name='input_0', shape=input.shape)],
        minimum_deployment_target=coremltools.target.iOS14,
    )
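Not part of the original report: a possible workaround sketch for case 1 while the converter cannot handle strided slice assignment. It builds the output with torch.where over a row-parity mask instead of assigning into the input in place; the class name ModelNoSliceAssign is made up here, and whether this converts cleanly is an assumption, since it simply avoids the torch_tensor_assign path that fails in the first trace.

import torch

class ModelNoSliceAssign(torch.nn.Module):
    # Hypothetical rewrite of case 1: fill even rows with 1 and odd rows
    # with 2 without in-place strided slice assignment.
    def forward(self, input):
        h = input.shape[2]
        # Row-index parity along the height axis, broadcastable to input.
        row_idx = torch.arange(h, device=input.device).view(1, 1, h, 1)
        even_rows = row_idx % 2 == 0
        # Even rows -> 1, odd rows -> 2 (same result as the two assignments above).
        return torch.where(even_rows,
                           torch.ones_like(input),
                           torch.full_like(input, 2.0))

Converting a traced model (torch.jit.trace) instead of a scripted one may also be worth trying, since the coremltools TorchScript frontend is mainly exercised with traced models; whether that changes the outcome here is untested.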
Case 2: assigning a tensor to a strided slice

import torch
import coremltools

class Model(torch.nn.Module):

    def __init__(self):
        super().__init__()

    def forward(self, input):

        b, c, h, w = input.shape
        xl = torch.zeros((b,c,h//2,w))

        input[:,:,0::2,:] = xl

        return input

if __name__ == "__main__":

    model = Model()
    input = torch.randn((1,12,256,256))

    torchscript_model = torch.jit.script(model)

    mlmodel = coremltools.converters.convert(
        torchscript_model,
        inputs=[coremltools.TensorType(name='input_0', shape=input.shape)],
        minimum_deployment_target=coremltools.target.iOS14,
    )
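Also not from the original report: a similar workaround sketch for case 2, assuming the intent is just to zero out the even rows. Multiplying by a 0/1 row mask avoids both the torch.zeros call that raises the AttributeError in the second trace and the slice assignment itself; the class name is again hypothetical.

import torch

class ModelNoTensorSliceAssign(torch.nn.Module):
    # Hypothetical rewrite of case 2: zero rows 0, 2, 4, ... along the
    # height axis without assigning a tensor into a strided slice.
    def forward(self, input):
        h = input.shape[2]
        # Mask that is 0 on even rows and 1 on odd rows, broadcastable to input.
        row_idx = torch.arange(h, device=input.device).view(1, 1, h, 1)
        keep = (row_idx % 2 == 1).to(input.dtype)
        # Equivalent to input[:, :, 0::2, :] = torch.zeros((b, c, h // 2, w)).
        return input * keep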

System environment:

  • coremltools==5.1.0
  • torch==1.11
  • OS: Linux
  • How Python was installed: from source
  • Python version: 3.9.7

lytcherino, Mar 10 '22 15:03

Using your code, I can reproduce both issues with coremltools 5.2.

TobyRoseman, Mar 11 '22 21:03

I encountered the same issue using coremltools 5.2.

JierunChen, Jun 12 '22 13:06

I encountered the same problem when deploying yolox-tiny. I tried converting the torch model to ONNX and then the ONNX model to a Core ML model. The conversion succeeded, but the resulting model doesn't seem to work in Xcode. Do you know what causes the bug?

noobpeng99, Jun 13 '22 09:06