Toby Roseman
This is fixed by #1653. The [example code](https://github.com/apple/coremltools/issues/1359#issuecomment-1155123776) now runs without error.
Our support for TorchScript models is experimental. If at all possible, [trace](https://pytorch.org/docs/stable/generated/torch.jit.trace.html) your model prior to conversion.
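For example, here is a minimal sketch of the trace-then-convert workflow (the toy two-layer model, the `"input"` name, and the 224×224 input shape are just placeholders, not anything specific to your model):

```python
import torch
import coremltools as ct

# A stand-in model; any eager-mode PyTorch module works the same way.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 8, kernel_size=3, padding=1),
    torch.nn.ReLU(),
).eval()

# Trace with a representative input so the graph is fully concrete.
example_input = torch.rand(1, 3, 224, 224)
traced_model = torch.jit.trace(model, example_input)

# Convert the traced model rather than the raw nn.Module or a scripted model.
mlmodel = ct.convert(
    traced_model,
    inputs=[ct.TensorType(name="input", shape=example_input.shape)],
)
```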
@JRGit4UE - Thanks for the feedback. I agree it would be good to make this clearer. Users do get [a warning when they try to convert a TorchScript...
This is a duplicate of #1504. We have an open pull request (#1657). Closing this issue as a duplicate.
Looks like this is fixed by #1653. The [minimal example](https://github.com/apple/coremltools/issues/1537#issuecomment-1179557125) now runs without error.
Since we have not received steps to reproduce this problem, I'm going to close this issue. If we receive reproduction steps, I will reopen it.
As a temporary workaround, you can disable the problematic `reduce_transposes` graph pass: delete or comment out [this line](https://github.com/apple/coremltools/blob/20b83527ec7c060cde1251c8a5bd73c6d6ac8619/coremltools/converters/mil/mil/passes/apply_common_pass_pipeline.py#L60) and [this line](https://github.com/apple/coremltools/blob/20b83527ec7c060cde1251c8a5bd73c6d6ac8619/coremltools/converters/mil/mil/passes/apply_common_pass_pipeline.py#L97).
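As an alternative to editing the source, later coremltools releases (7.0 and newer) expose a pass-pipeline API for this. A sketch of skipping the pass that way, assuming that API is available and with `traced_model` and the input shape standing in for your own model:

```python
import coremltools as ct

# Start from the default pass pipeline and drop the problematic pass.
pipeline = ct.PassPipeline.DEFAULT
pipeline.remove_passes({"common::reduce_transposes"})

# `traced_model` is a placeholder for the TorchScript model being converted.
mlmodel = ct.convert(
    traced_model,
    inputs=[ct.TensorType(shape=(1, 3, 224, 224))],
    pass_pipeline=pipeline,
)
```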
A fix for this bug was included in our [6.1 release](https://github.com/apple/coremltools/releases/tag/6.1). With coremltools 6.1, the model now converts without error.
@kir486680 - which of the three models you shared is generating this error? How are you calling `coremltools.convert`?
The reproduction code contains an implementation of `upsample_bilinear2d`. Coremltools 6.0 now supports this layer type. Our implementation looks quite a bit different from yours. @Siq1982 - Does the model work...
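For reference, a small sketch of a model that exercises `upsample_bilinear2d` and converts with coremltools 6.0+ (the channel count and input size here are arbitrary, not taken from the reported model):

```python
import torch
import coremltools as ct

class Upsampler(torch.nn.Module):
    def forward(self, x):
        # Dispatches to aten::upsample_bilinear2d under the hood.
        return torch.nn.functional.interpolate(
            x, scale_factor=2.0, mode="bilinear", align_corners=False
        )

model = Upsampler().eval()
example_input = torch.rand(1, 3, 64, 64)
traced = torch.jit.trace(model, example_input)

mlmodel = ct.convert(
    traced,
    inputs=[ct.TensorType(shape=example_input.shape)],
)
```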