
Support for ai.onnx.ml opset

Open redseal798 opened this issue 2 years ago • 12 comments

Is your feature request related to a problem? Please describe.

Support ONNX ml operators: https://github.com/onnx/onnx/blob/main/docs/Operators-ml.md

Describe the solution you'd like

Models that use ONNX ML operators should work, for example ONNX models exported from sklearn pipelines.

Describe alternatives you've considered

NA

Additional context

redseal798 avatar Apr 02 '23 09:04 redseal798

While we'd encourage anyone interested to work on this (and I'd be happy to review PRs), this is unfortunately not currently in scope, as it is quite a bit of work.

I'm not sure our architecture fits this either. The ML operators appear to be quite a bit more high-level than the ONNX operators we currently implement, and it may not be feasible or practical to implement them in the current architecture (where each operator is translated 1:1 to a GPU shader, and the shaders are executed in a predefined order with no room for CPU work in between).

pixelspark avatar Apr 02 '23 18:04 pixelspark

Thanks for the feedback!

redseal798 avatar Apr 08 '23 02:04 redseal798

FWIW tf2onnx seems to add an ai.onnx.ml opset import regardless of whether those operators are actually used by the network; I've had to manually patch the generated networks to remove the import to get them to work. It's possible that other frameworks have the same problem.

SludgePhD avatar Jun 04 '23 13:06 SludgePhD

I've had to manually patch the generated networks to remove the import to get them to work.

@SludgePhD what method did you use to do this?

schell avatar Jul 12 '23 22:07 schell

@schell There is no good tooling for this (in fact there is no official tooling for manipulating ONNX files at all as far as I can tell), so I had to write a Python script:

import onnx

original_model = onnx.load(model_path)

# assumes the ai.onnx.ml import is the second (and last) of exactly two opset imports
if len(original_model.opset_import) == 2:
    print("opset imports before: ", original_model.opset_import)
    original_model.opset_import.pop()
    print("opset imports after: ", original_model.opset_import)
    # write the patched model back
    onnx.save(original_model, model_path)

(it's not very robust)

SludgePhD avatar Jul 14 '23 18:07 SludgePhD

Thank you nonetheless, @SludgePhD :)

schell avatar Jul 16 '23 21:07 schell

I am not sure from the above whether wonnx throws an error when just the import is present. If so, we could make wonnx a bit more lenient and have it ignore the import as long as no ops from that set are used (unknown ops lead to an error anyway, even from the 'basic' set of ops).
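A minimal sketch of what such a check could look like, written in Python against the ONNX protobuf (the `uses_ml_opset` helper name is mine, not an existing wonnx or onnx API; a complete check would also walk the subgraphs attached to control-flow nodes like If/Loop/Scan):

```python
# Decide whether a model really needs the ai.onnx.ml opset: in the ONNX
# protobuf, each NodeProto carries a `domain` field, which is "ai.onnx.ml"
# for the ML operators and "" (or "ai.onnx") for the basic ones.
def uses_ml_opset(model, domain="ai.onnx.ml"):
    # only inspects the top-level graph, not nested subgraphs
    return any(node.domain == domain for node in model.graph.node)
```

If this returns False, the stray opset import could be safely ignored (or stripped) rather than treated as an error.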

pixelspark avatar Jul 18 '23 17:07 pixelspark

That would be nice @pixelspark :)

schell avatar Jul 21 '23 23:07 schell

@pixelspark how would one figure out whether any ai.onnx.ml ops are used in a model? I'm currently writing a CLI tool that would do this and remove the opset if possible.

schell avatar Sep 26 '23 21:09 schell

@pixelspark sorry, nevermind - I figured it out between the wonnx source and protobuf :) 👍

schell avatar Sep 26 '23 21:09 schell

I'm currently writing a cli tool that would do this and remove the opset if possible.

Did that end up anywhere?

Mek101 avatar Dec 11 '23 16:12 Mek101

Sorry @Mek101 - no. As it was part of my day job and just a quick spike, I didn't push the code up anywhere. But it did end up "working": the CLI tool traverses the graph looking for uses of the opset and, if none are found, removes it and saves the graph.
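For anyone wanting to reconstruct it, the core of such a tool is a small amount of logic over the model proto. A hedged sketch (my own code, not the tool described above; it only inspects the top-level graph, while a thorough traversal would also visit subgraphs of control-flow nodes):

```python
ML_DOMAIN = "ai.onnx.ml"

def strip_unused_ml_opset(model):
    """Remove the ai.onnx.ml opset import if no node actually uses it."""
    if any(node.domain == ML_DOMAIN for node in model.graph.node):
        # the opset is genuinely used; leave the import alone
        return model
    kept = [op for op in model.opset_import if op.domain != ML_DOMAIN]
    # repeated protobuf fields cannot be reassigned, so clear and extend in place
    del model.opset_import[:]
    model.opset_import.extend(kept)
    return model
```

Wrapped with `onnx.load`/`onnx.save` around it, this is essentially the "traverse, then remove and save" flow described above.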

schell avatar Dec 12 '23 07:12 schell