candle-onnx: Implement layer normalization operator
Added Layer Normalization operator with tests. Related issue: https://github.com/huggingface/candle/issues/2849
Why not use LayerNorm from candle-nn?
Or is that a different thing? (I'm not that familiar with ML things.)
Thank you for your comment. Honestly, I didn't see the implementation in candle-nn; maybe it can be used in this case. However, I do see a difference in the ONNX version of LayerNorm: it has an additional `axis` parameter. I will mark this PR as a draft until I clear this up.
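To illustrate the difference: per the ONNX spec, `LayerNormalization` computes the mean and variance over all dimensions from `axis` to the end of the shape, whereas candle-nn's `LayerNorm` normalizes over the last dimension only. Below is a minimal sketch of the axis semantics on flat data (not candle's actual API; the `layer_norm` function here is hypothetical and omits the scale/bias inputs for brevity):

```rust
/// Sketch of ONNX LayerNormalization axis semantics: mean/variance are
/// computed jointly over dims [axis, rank), so each contiguous chunk of
/// `inner` elements is normalized together. `axis = -1` (the ONNX default)
/// matches candle-nn's last-dimension LayerNorm.
fn layer_norm(data: &[f32], shape: &[usize], axis: isize, eps: f32) -> Vec<f32> {
    let rank = shape.len() as isize;
    // Negative axes count from the end, as in ONNX.
    let axis = if axis < 0 { axis + rank } else { axis } as usize;
    let inner: usize = shape[axis..].iter().product();
    let mut out = Vec::with_capacity(data.len());
    for chunk in data.chunks(inner) {
        let mean = chunk.iter().sum::<f32>() / inner as f32;
        let var = chunk.iter().map(|x| (x - mean) * (x - mean)).sum::<f32>() / inner as f32;
        let denom = (var + eps).sqrt();
        out.extend(chunk.iter().map(|x| (x - mean) / denom));
    }
    out
}

fn main() {
    // Shape [2, 2, 2] with axis = 1: each block of 4 elements is normalized
    // together; a last-dim-only LayerNorm would instead use groups of 2.
    let data = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0];
    let out = layer_norm(&data, &[2, 2, 2], 1, 1e-5);
    println!("{:?}", out);
}
```

So the two coincide when `axis` points at the last dimension, and the extra work in the ONNX operator is only needed for other axis values.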
I have changed the implementation to use the built-in candle-nn layer normalization. All tests are passing, so I think everything should be alright with this approach.
Curious if there are any updates here; I just ran into this issue when trying to integrate candle-onnx into my application.