aten.alias.default operation is not supported by Coremltools converter.
🌱 Describe your Feature Request
aten.alias.default operation is not supported by Coremltools converter.
How can this feature be used?
Trying to convert a complex Torch model to a CoreML model.
Describe alternatives you've considered
No alternative.
Additional context
Stack trace:
File "/opt/anaconda3/envs/uat/lib/python3.11/site-packages/coremltools/converters/mil/frontend/torch/converter.py", line 633, in __init__
self.graph = InternalTorchIRGraph.from_exir(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/uat/lib/python3.11/site-packages/coremltools/converters/mil/frontend/torch/internal_graph.py", line 590, in from_exir
nodes.append(InternalTorchIRNode.from_exir_node(node=node))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/uat/lib/python3.11/site-packages/coremltools/converters/mil/frontend/torch/internal_graph.py", line 289, in from_exir_node
raise ValueError(f"Unsupported fx node {str(node)}, kind {kind}")
ValueError: Unsupported fx node alias, kind alias
@himalayjor - can you please give us a model stub which uses this op?
Dummy Model Code:
def forward(self, x):
    y = torch.ops.aten.alias(x)
    y = y + 5
    return y
Is there an ETA for this? Work is blocked because of this.
Complete code to reproduce:
import torch
import torch.nn as nn
import coremltools as ct

class Model(torch.nn.Module):
    def forward(self, x):
        y = torch.ops.aten.alias(x)
        y = y + 5
        return y

exported_model = torch.export.export(Model(), (torch.rand(2, 32),))
ct.convert(exported_model)
Any ideas @YifanShenSZ?
@himalayjor - you could edit your model to remove the alias usage. Seems like that shouldn't be difficult to do.
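For example, here is a rough sketch of one way to strip those nodes from the exported FX graph (assuming the exported_model from the repro above, and that an in-place graph rewrite is acceptable; this is not an official API):

import torch

gm = exported_model.graph_module  # the ExportedProgram's underlying fx.GraphModule
for node in list(gm.graph.nodes):
    # aten.alias is a no-op view, so reroute its users to its input and drop it
    if node.op == "call_function" and node.target == torch.ops.aten.alias.default:
        node.replace_all_uses_with(node.args[0])
        gm.graph.erase_node(node)
gm.graph.lint()
gm.recompile()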
Some models exported from torch.export.export include this op and sometimes are not easily editable.
@himalayjor Does this composite op work for you?
# imports needed for this snippet
from coremltools.converters.mil import Builder as mb
from coremltools.converters.mil.frontend.torch.ops import _get_inputs
from coremltools.converters.mil.frontend.torch.torch_op_registry import _TORCH_OPS_REGISTRY, register_torch_op

if 'alias' not in _TORCH_OPS_REGISTRY:
    @register_torch_op
    def alias(context, node):
        # aten.alias is a no-op view, so lower it to a MIL identity op
        x = _get_inputs(context, node, expected=1)[0]
        x_identity = mb.identity(x=x, name=node.name)
        context.add(x_identity)
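If that registration works for your model, re-running the conversion from the repro above in the same session should get past the alias node, since it is now lowered to an identity op (a hedged usage sketch, assuming the registration code has already been executed):

mlmodel = ct.convert(exported_model)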