SimplifiedLayerNormalization is unimplemented
SimplifiedLayerNormalization, which is relatively common in transformer models, is unimplemented. To reproduce, run tract on the uploaded model.
You should get the following error:
```
┣┻ 213 MatMulInference /lm_head/MatMul
 ━━━ ..,F32
[2025-07-03T21:52:27.457619892Z ERROR tract] Error at stage "analyse"

Caused by:
    0: ModelBuildingError
    1: #52 "/h.0/ln_1/Mul/SimplifiedLayerNormFusion/" Unimplemented(SimplifiedLayerNormalization) has incomplete typing
```
Hey, I can't find much information about this operator — it isn't in the ONNX spec. I assume it is a Microsoft / onnxruntime extension? Do you know if we have a spec and ideally some unit tests somewhere?
As best I can tell, here are some tests from the onnxruntime repo:
https://github.com/microsoft/onnxruntime/blob/2f878c60296de169a8a523e692d3d65893f7c133/onnxruntime/test/python/transformers/test_simplified_layernorm_fusion.py#L34
and the only spec I can find is the doc string here:
https://github.com/microsoft/onnxruntime/blob/2f878c60296de169a8a523e692d3d65893f7c133/onnxruntime/core/optimizer/layer_norm_fusion.cc#L518
Do you think it's worth not fusing it at all and instead representing it as the individual ops?
```
X --> Pow --> ReduceMean --> Add --> Sqrt --> Div --> Mul
|                                              ^
|                                              |
+----------------------------------------------+
```
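For reference, here is a NumPy sketch of that op chain (a hedged reading of the fusion pattern; `eps` and `scale` stand in for the Add constant and the trailing Mul weight, which come from the model):

```python
import numpy as np

def simplified_layer_norm(x, scale, eps=1e-5, axis=-1):
    # X --> Pow(2) --> ReduceMean --> Add(eps) --> Sqrt
    mean_sq = np.mean(np.power(x, 2.0), axis=axis, keepdims=True)
    denom = np.sqrt(mean_sq + eps)
    # Div takes the original X as its other input; Mul applies the weight
    return (x / denom) * scale

x = np.random.default_rng(0).standard_normal((2, 4)).astype(np.float32)
scale = np.ones(4, dtype=np.float32)
y = simplified_layer_norm(x, scale)
```

With a unit `scale` this is exactly `X / sqrt(mean(X^2) + eps)`, i.e. RMS normalization.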
Maybe consider using https://onnx.ai/onnx/operators/onnx__RMSNormalization.html, which is a standard ONNX operator?