coremltools
Failure to export torch.nn.functional.pad with int32 tensor
Conversion does not seem to support torch pad with an int32 input type:
ValueError: In op, of type pad, named val, the named input `constant_val` must have the same data type as the named input `x`. However, constant_val has dtype fp32 whereas x has dtype int32.
venv/lib/python3.9/site-packages/coremltools/converters/mil/mil/input_type.py:137: ValueError
import coremltools as ct
import torch

class Example(torch.nn.Module):
    def forward(self, x):
        x = torch.nn.functional.pad(x, (0, 2), value=0)
        return x

module = Example().eval()
traced = torch.jit.trace(module, (torch.IntTensor([[1, 2], [3, 4]]),))
coremlmodel = ct.convert(
    traced,
    source='pytorch',
    convert_to='mlprogram',
    inputs=[ct.TensorType(name="x", dtype=ct.converters.mil.mil.types.int32, shape=(2, 2))],
    outputs=[ct.TensorType(name="y", dtype=ct.converters.mil.mil.types.int32)],
    compute_units=ct.ComputeUnit.ALL,
    compute_precision=ct.precision.FLOAT16,
)
System environment (please complete the following information):
coremltools version: 6.2
macOS: Ventura
torch: 1.13.1
Python: 3.9
@gsigms Thanks for filing this issue. However, I would strongly recommend against using the int32 input data type, since most Core ML ops only accept float inputs. Please change the input type to float32; I think that should solve the issue. Thanks
My desired use case is using `pad` to populate a tensor that will later be used as the `reps` argument for `tile`. The `reps` parameter requires `int32`. I guess I'll cast back and forth instead.
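For reference, a minimal sketch of that cast-back-and-forth workaround, using the toy module from the repro above (the `.to()` casts are my addition, not something confirmed in this thread as the official fix): pad in float32 so the converted pad op sees matching dtypes, then cast the result back to int32 before it would be consumed as `reps`.

```python
import torch

class Example(torch.nn.Module):
    def forward(self, x):
        # Cast to float32 before padding so the traced pad op and its
        # constant_val share a dtype, then cast back for int consumers
        # such as tile's reps argument.
        x = torch.nn.functional.pad(x.to(torch.float32), (0, 2), value=0)
        return x.to(torch.int32)

module = Example().eval()
out = module(torch.tensor([[1, 2], [3, 4]], dtype=torch.int32))
```

Whether the extra casts survive conversion cleanly is untested here, but they keep the pad itself in a dtype the converter accepts.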