
Operator torch._ops.aten.linalg_vector_norm.default is not Aten Canonical


I am trying to create an ExecuTorch program using the steps mentioned in Setting Up ExecuTorch. I made the changes for my model successfully, but when I execute the export script I get this error:

torch._export.verifier.SpecViolationError: Operator torch._ops.aten.linalg_vector_norm.default is not Aten Canonical

Looking back at my model, the only operator that could have caused this seems to be q = torch.nn.functional.normalize(q, dim=-1), but I am not sure how to deal with this error and am stuck at this step.
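For context, a minimal sketch along these lines reproduces the error (the module and input shape here are illustrative, not my actual model):

import torch
from torch.export import export

class Net(torch.nn.Module):
    def forward(self, q):
        # F.normalize lowers to aten.linalg_vector_norm under export
        return torch.nn.functional.normalize(q, dim=-1)

exported = export(Net(), (torch.randn(2, 8),))
# Lowering `exported` with ExecuTorch's to_edge() then raises:
#   SpecViolationError: Operator torch._ops.aten.linalg_vector_norm.default is not Aten Canonical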

What could be a probable workaround for this case? Any suggestions/help is appreciated!

Regards!

nbansal90 commented May 09 '24 21:05

It's supposed to be an ATen core op. This pull request should help: https://github.com/pytorch/pytorch/pull/125789 - we're trying to land it.

cccclai commented May 10 '24 01:05

@cccclai that's great to know! I would appreciate it if this update could be pushed in, as this is blocking my efforts to benchmark my model on an edge device.

Thank you for your prompt response.

nbansal90 commented May 10 '24 17:05

Until the PR is merged, a workaround is to call the following ops manually. The same idea will be applied in https://github.com/pytorch/pytorch/pull/125789:

import torch

def decomp_linalg_vector_norm(a, order):
    # Compute the absolute values of the elements of 'a'
    abs_a = torch.abs(a)

    # Sum the absolute values raised to the power of 'order'
    # (torch.pow takes the exponent positionally; it has no `ord` keyword)
    sum_p = torch.sum(torch.pow(abs_a, order))

    # Compute the order-th root of the sum
    norm_value = torch.pow(sum_p, 1 / order)
    return norm_value
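If you need the F.normalize(q, dim=-1) behavior from the original post, a dim-aware sketch like the following should work (the dim/keepdim handling and the eps guard are assumptions beyond the decomposition above):

import torch

def normalize_decomposed(q, order=2.0, dim=-1, eps=1e-12):
    # Per-dim norm so the result matches F.normalize(q, dim=dim)
    abs_q = torch.abs(q)
    sum_p = torch.sum(torch.pow(abs_q, order), dim=dim, keepdim=True)
    norm_value = torch.pow(sum_p, 1.0 / order)
    # Clamp mirrors F.normalize's eps guard against division by zero
    return q / torch.clamp(norm_value, min=eps)

# Drop-in replacement for: q = torch.nn.functional.normalize(q, dim=-1)
q = normalize_decomposed(torch.randn(4, 16))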

cccclai commented Jun 03 '24 16:06

Sure! I will give it a try.

nbansal90 commented Jun 06 '24 21:06

def decomp_linalg_vector_norm(a, order=2):
    abs_a = torch.abs(a)
    sum_p = torch.sum(torch.pow(abs_a, order))
    norm_value = torch.pow(sum_p, 1 / order)
    # Divide the input by its norm, mirroring F.normalize
    out = torch.div(a, norm_value)
    return out
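A quick sanity check against F.normalize (a minimal sketch, assuming a 1-D input so the all-element reduction matches dim=-1):

import torch

q = torch.randn(8)
print(torch.allclose(decomp_linalg_vector_norm(q),
                     torch.nn.functional.normalize(q, dim=-1)))  # True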

This resolves the issue for me. Since there is already a PR looking into adding this as a core ATen operator, I am closing this.

nbansal90 commented Jun 11 '24 18:06