
LayerNorm Op missing?

Open · nabsabraham opened this issue · 3 comments

Ask a Question

Question

I have a bert-base model trained with some linear layers on top and adapter layers in the backbone. I export the model to ONNX like so:

import torch

# model is the trained bert-base model; ids and mask are its tokenized inputs.
torch.onnx.export(
    model,
    (ids, mask),
    "model.onnx",
    opset_version=10,
    input_names=["ids", "mask"],
    output_names=["output"],
    export_params=True,
    dynamic_axes={
        "ids": {0: "batch_size"},
        "mask": {0: "batch_size"},
        "output": {0: "batch_size"},
    },
)

However, when I try to run an inference session, I see this warning/error pop up:

Execution will fail if ORT does not have a specialized kernel for this op
2022-06-01 03:27:22.058423804 [W:onnxruntime:, graph.cc:2676 InitFunctionBodyForNode] Function body initialization failed for node 'LayerNormalization_token_28' optype LayerNormalization. Error message /onnxruntime_src/onnxruntime/core/graph/function.cc:788 onnxruntime::FunctionImpl::FunctionImpl(onnxruntime::Graph&, const NodeIndex&, const onnx::FunctionProto&, const std::unordered_map<std::basic_string<char>, const onnx::FunctionProto*>&, std::vector<std::unique_ptr<onnxruntime::Function> >&, const onnxruntime::logging::Logger&, bool) status.IsOK() was false. Resolve subgraph failed:Node (0x5a5b2e0) Op (Flatten) [ShapeInferenceError] Invalid value(-1) for attribute 'axis'

Can someone suggest a custom op solution for this? Do I even need one? My understanding is that, in the absence of the operator, values will be replaced with constants; what are the implications of that? I can run a sample through the model, but I'm worried that the warnings will lead to a long-term issue.
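For context, the inference session is created roughly like this (a minimal sketch; the dummy inputs are placeholders with assumed shapes and dtypes, not real tokenizer output):

import numpy as np
import onnxruntime as ort

# Creating the session is where the warning above is printed.
sess = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

# Dummy inputs matching the exported input names; shapes/dtypes are assumptions.
ids = np.random.randint(0, 30000, size=(1, 128), dtype=np.int64)
mask = np.ones((1, 128), dtype=np.int64)
out = sess.run(["output"], {"ids": ids, "mask": mask})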

nabsabraham · Jun 01 '22 04:06

LayerNormalization will be included in the official ONNX 1.12 release: https://github.com/onnx/onnx/pull/4076. Even now, IIUC, LayerNormalization is an existing contrib op in ONNX Runtime, so a model that has this LayerNorm op should be runnable. Does this warning/error block your inference? If so, I would suggest raising this issue in the ONNX Runtime repo to let the runtime experts take a closer look.
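If the warning does turn out to block inference, one possible workaround (a sketch only, assuming a PyTorch version whose exporter supports opset 17) is to re-export at opset 17, where LayerNormalization is an official ONNX op and no function-body expansion is needed:

import torch

# model, ids, mask are the same objects used in the original export above.
torch.onnx.export(
    model,
    (ids, mask),
    "model_opset17.onnx",   # hypothetical output path
    opset_version=17,       # LayerNormalization is official from opset 17 (ONNX 1.12)
    input_names=["ids", "mask"],
    output_names=["output"],
    export_params=True,
    dynamic_axes={
        "ids": {0: "batch_size"},
        "mask": {0: "batch_size"},
        "output": {0: "batch_size"},
    },
)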

jcwchen · Jun 09 '22 20:06

Hey @nabsabraham, were you able to solve the issue?

sanjay23singh · Sep 29 '22 12:09

I think at least the latest torch-nightly should have covered LayerNorm conversion: https://github.com/pytorch/pytorch/pull/84293.
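A quick way to check whether the exported graph actually contains a single LayerNormalization node (rather than a decomposed ReduceMean/Sub/Mul/Add pattern) is to inspect the op types with the onnx Python package - a minimal sketch:

import onnx

# Load the exported model and collect the op types that appear in its graph.
model_proto = onnx.load("model.onnx")
op_types = {node.op_type for node in model_proto.graph.node}
print("LayerNormalization" in op_types)
print(sorted(op_types))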

jcwchen · Sep 29 '22 13:09