Unable to map torch_upsample_nearest_neighbor to core upsample when using flexible input shapes during conversion
🐞Describing the bug
I get the error `Unable to map torch_upsample_nearest_neighbor to core upsample` when I try to convert the DETR PyTorch model. Digging into the package, I found that the issue arises in the `_try_get_upsample_factor` function, where `op.op_type` is `gather` but the conditional checks for `cast`.
Stack Trace

```
Traceback (most recent call last):
  File "test.py", line 19, in
```
Python code snippet
```python
from transformers import DetrFeatureExtractor, DetrForObjectDetection
import torch
from PIL import Image
import requests
import coremltools as ct

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

image_processor = DetrFeatureExtractor.from_pretrained("facebook/detr-resnet-50")
model = DetrForObjectDetection.from_pretrained("facebook/detr-resnet-50", return_dict=False)

inputs = image_processor(images=image, return_tensors="pt")
outputs = model(**inputs)

traced_model = torch.jit.trace(model, example_inputs=inputs["pixel_values"])

model = ct.convert(
    traced_model,
    convert_to="mlprogram",
    inputs=[ct.ImageType(shape=(1, 3, ct.RangeDim(256, 3072), ct.RangeDim(256, 3072)))],
)
```
System environment:
- coremltools version: 6.1
- OS: Ubuntu 20.04.4 LTS
- PyTorch version: 1.10.0
- transformers version: 4.19.3
I can reproduce this issue.
This works if you don't use flexible-shape inputs, i.e. the following converts successfully:

```python
model = ct.convert(
    traced_model,
    convert_to="mlprogram",
    inputs=[ct.ImageType(shape=inputs["pixel_values"].shape)],
)
```
When flexible shapes are used, both `scales_h` and `scales_w` get set to `None` because their op type is `gather`.
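The `gather` ops appear to come from the traced graph itself: DETR computes the upsample target size from another tensor's shape at runtime, so tracing records `aten::size`/indexing ops instead of constants, and with `RangeDim` inputs the converter cannot fold them away. A minimal sketch of the pattern (the `Up` module below is illustrative, not DETR's actual code):

```python
import torch
import torch.nn as nn

class Up(nn.Module):
    def forward(self, x, ref):
        # Resize x to ref's spatial size, the same shape-dependent
        # pattern DETR uses for its mask/feature upsampling.
        return nn.functional.interpolate(x, size=ref.shape[-2:], mode="nearest")

traced = torch.jit.trace(Up(), (torch.rand(1, 3, 16, 16), torch.rand(1, 3, 32, 32)))

# The target size is derived from aten::size ops rather than baked-in
# constants; with a fixed input shape the converter can fold these,
# but with RangeDim they stay dynamic and surface as gather ops.
print(traced.graph)
```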
Resizing inputs to a standard size gives bad results for some reason, even with padding. Is there any other way to keep the flexible sizing?
Hi, has anyone found a solution yet? I want to convert a DETR transformer and ran into the same issue.
Same issue here. Is there any way to use flexible shapes?
```
/coreML/lib/python3.11/site-packages/coremltools/converters/mil/frontend/torch/ssa_passes/torch_upsample_to_core_upsample.py", line 47, in _torch_upsample_to_core_upsample_block
    raise ValueError("Unable to map {} to core upsample".format(op.op_type))
ValueError: Unable to map torch_upsample_bilinear to core upsample
```

```
Input target_size_height must be const at compile time', 'target_size_height', 'gather_0')
```