
Utils - ERROR - Functional ops that are not marked as math invariant were found in the model. AIMET features will not work properly for such ops.

SEMLLYCAT opened this issue 2 years ago • 4 comments

It seems that unsupported ops can be handled with:

    from aimet_torch.meta.connectedgraph import ConnectedGraph
    ConnectedGraph.math_invariant_types.add(...)

but I don't know how to start. Is there a relevant tutorial or example? The warning I get is:

    The following functional ops were found. The parent module is named for ease of locating the ops within the model definition.
    constant_pad_nd4 parent module: GraphModule
    Add_72 parent module: GraphModule

Thanks!

SEMLLYCAT avatar May 04 '23 12:05 SEMLLYCAT

Hello @SEMLLYCAT. Math invariant ops don't require quantization and can remain functional in the PyTorch model definition without affecting the outcome of AIMET features. https://github.com/quic/aimet/blob/develop/TrainingExtensions/torch/src/python/aimet_torch/meta/connectedgraph.py#L137

Now, if a functional op is math invariant, its type can simply be registered via ConnectedGraph.math_invariant_types.add('constant_pad') to avoid the missing-modules error for such functional ops.
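To illustrate the mechanism behind that suggestion, here is a minimal, self-contained sketch of how a math-invariant registry suppresses the missing-modules check. The set contents and the find_missing_modules helper are illustrative stand-ins, not AIMET's actual implementation; in real code you would call ConnectedGraph.math_invariant_types.add('constant_pad') as described above.

```python
# Conceptual sketch (pure Python, no aimet install needed) of how a
# math-invariant registry suppresses the missing-modules warning.
# The real call is:
#   from aimet_torch.meta.connectedgraph import ConnectedGraph
#   ConnectedGraph.math_invariant_types.add('constant_pad')

math_invariant_types = {'size', 'view', 'reshape'}  # illustrative subset

def find_missing_modules(functional_op_types):
    """Return functional op types that are not registered as math invariant."""
    return [op for op in functional_op_types if op not in math_invariant_types]

ops = ['constant_pad', 'add', 'view']
print(find_missing_modules(ops))  # ['constant_pad', 'add'] are flagged

math_invariant_types.add('constant_pad')  # register pad as math invariant
print(find_missing_modules(ops))  # only ['add'] remains flagged
```

After registration, only ops that genuinely need quantization (like the add) are still reported.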

But it seems that in your case the missing-modules check is also triggered for the Add_72 functional op, which I believe shouldn't happen if prepare_model had handled it correctly, since Add is not a math invariant op.
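For context, here is a hedged sketch of the rewrite that prepare_model (from aimet_torch.model_preparer) is expected to perform for a functional add: replacing `x + y` with a dedicated module so AIMET can attach a quantizer to it. The Add class below is a hypothetical stand-in for aimet_torch.elementwise_ops.Add, and the two Block classes are invented for illustration; the real rewrite is done automatically.

```python
# Sketch of the functional-add -> module-add rewrite that prepare_model
# performs. Add stands in for aimet_torch.elementwise_ops.Add (assumption).
import torch
import torch.nn as nn

class Add(nn.Module):
    """Elementwise add as a module, so it shows up in named_modules()."""
    def forward(self, x, y):
        return x + y

class BlockFunctional(nn.Module):
    """Before: the add is a functional op, invisible to AIMET's quantizers."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 3, 1)
    def forward(self, x):
        return self.conv(x) + x

class BlockPrepared(nn.Module):
    """After: the add is a module, so a quantizer can be attached to it."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 3, 1)
        self.add = Add()
    def forward(self, x):
        return self.add(self.conv(x), x)

x = torch.randn(1, 3, 4, 4)
before, after = BlockFunctional(), BlockPrepared()
after.conv.load_state_dict(before.conv.state_dict())  # share weights
assert torch.allclose(before(x), after(x))  # same math, different graph
```

The two models compute identical outputs; only the graph structure differs, which is exactly why a leftover functional Add after prepare_model suggests something went wrong in the preparation step.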

Could you please share more details on the model?

quic-hitameht avatar May 08 '23 13:05 quic-hitameht

Thank you for your reply. I have been trying to quantize the model recently. I hope you can provide valuable comments; I will forward the model to you by email.


SEMLLYCAT avatar May 25 '23 03:05 SEMLLYCAT

Would it be convenient for you to provide an email address? Thank you.

SEMLLYCAT avatar May 25 '23 03:05 SEMLLYCAT

I hope you can help answer this question in your spare time. Thank you!

SEMLLYCAT avatar May 30 '23 08:05 SEMLLYCAT