tflite2onnx
Implementing compound operators: HardSwish
HardSwish does not have a direct equivalent in ONNX. I intend to implement it as a hardsigmoid followed by mul. My question is with the current architecture how do you define an operator that doesn't map to a built-in ONNX operator? What will @property type(self) of the operator definition return?
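For reference, the math does work out: TFLite's HardSwish computes `x * relu6(x + 3) / 6`, which equals `x` multiplied by ONNX's HardSigmoid with `alpha = 1/6` and `beta = 0.5`, so a HardSigmoid followed by Mul reproduces it exactly. A quick self-contained check (plain Python, not tflite2onnx code):

```python
def relu6(x):
    return min(max(x, 0.0), 6.0)

def hardswish_tflite(x):
    # TFLite semantics: x * relu6(x + 3) / 6
    return x * relu6(x + 3.0) / 6.0

def hardsigmoid_onnx(x, alpha=1.0 / 6.0, beta=0.5):
    # ONNX HardSigmoid: max(0, min(1, alpha * x + beta))
    # (note ONNX's default alpha is 0.2; we need 1/6 here)
    return min(max(alpha * x + beta, 0.0), 1.0)

def hardswish_decomposed(x):
    # The proposed lowering: HardSigmoid followed by Mul
    return x * hardsigmoid_onnx(x)

# The two agree across the whole range (check a grid of points)
for i in range(-60, 61):
    x = i / 10.0
    assert abs(hardswish_tflite(x) - hardswish_decomposed(x)) < 1e-9
```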
Hi @ashikns, that is a good question; I have encountered similar issues before. In general, we don't have a generic solution (I mean design-level support) for mapping one TFLite operator to multiple ONNX operators. Instead, we need to do it manually, which in practice includes the following (what we are doing now doesn't necessarily need to be what we will do in the future):
- Choose a core ONNX operator among the ones (we may call them a subgraph) that implement the TFLite semantic, and build the operator converter class `C` around it.
- The ONNX operators in the subgraph prior to the core operator go to `C.pre`; the others go to `C.post`. This can all be done in `C.parse()`.
- When building the subgraph we need dedicated operator converter class instances (which go to `C.pre` and `C.post`), but we build them manually rather than parsing them from the TFLite model.
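The steps above could look roughly like the following sketch. The class and method names mirror the `C` / `C.pre` / `C.post` / `C.parse()` notation from the list, but this is an illustrative mock-up, not the actual tflite2onnx converter API:

```python
class HardSigmoidConverter:
    """Hypothetical converter for the manually-built pre operator."""

    def __init__(self, alpha, beta):
        self.alpha = alpha
        self.beta = beta

    @property
    def type(self):
        return 'HardSigmoid'


class HardSwishConverter:
    """Hypothetical converter: Mul is the core op, HardSigmoid sits in pre."""

    def __init__(self):
        self.pre = []   # converters that must be emitted before the core op
        self.post = []  # converters that must be emitted after the core op

    @property
    def type(self):
        # The core ONNX operator of the subgraph
        return 'Mul'

    def parse(self):
        # Build the HardSigmoid converter by hand (alpha=1/6, beta=0.5
        # matches TFLite's HardSwish) instead of parsing it from TFLite.
        self.pre.append(HardSigmoidConverter(alpha=1.0 / 6.0, beta=0.5))
```

With this shape, `type` answers the original question: it returns the core operator (`Mul`), while the extra `HardSigmoid` node lives in `pre` and is emitted when the subgraph is serialized.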
That should do it. `fakeBroadcast` of `Binary` and `handleFusedActivation` could be examples that you may be interested in.
Off topic: we have not taken this issue very seriously yet, as tflite2onnx has only just been open-sourced. But eventually we need to get it working decently. Before then, maybe we can try out different solutions to see where the practice leads us. For example, we have not tried RNN or control-flow operators yet, and I am not sure whether we need to introduce a subgraph concept. We can try out any ideas - any that can make a test pass - and then decide.
Btw, I removed the `help wanted` label. It seems to me that label means the issue needs help from the community, e.g. that we need more effort to write more operators. Please correct me if I got it wrong.
Open sourcing of this library came at the right time, I'd say. There are a lot of trained models being released into the wild. I have already had success using this library to convert a variation of MobileNetV3 with some manual tweaks here and there. Found it pretty intuitive to work with too, thanks!
PS: The `help wanted` label came on automatically. It might be because I initially chose the "request new operator" issue template.
Oh I see, I have even forgotten that :)
I am very happy that this tool works for you
@ashikns you may be interested in this issue https://github.com/jackwish/tflite2onnx/issues/6, which covers the fused activation function only.
Hi @ashikns, I happened to notice that you have implemented some new operators in your fork. Would you mind upstreaming your great work so it can help more people?
Hey! Yes indeed, I added a couple for converting a MobileNetV3 network from Google. I haven't sent a PR yet, though, for two reasons:
- Most of the changes I've made are specific to the model I was converting. I don't remember the specific operators, but for example one operator might have an attribute that changes its behavior, and my implementation only handles one specific case.
- I am quite busy at the moment due to upcoming vacations, so I haven't had time to create test models. I just did a bit of trial and error until it worked.
That said, I do plan to submit the changes back in the not-too-distant future. This tool has been very helpful to me, and I think it can be very helpful for bringing ML to Windows and other platforms ONNX supports. I saw that in a couple of other PRs/issues there are discussions about Reshape, ConvTranspose, etc. When those get merged, I'll rebase mine, clean up, and submit back.
Thanks for your reply! I didn't mean to push, just to share joint efforts. Getting things to work should always come first, so feel free to come back anytime. And enjoy your vacation! :)
Hi @ashikns, I am planning a `v0.3.1` release by the end of this year. Please help to open PRs if you are interested in getting your ops into this release :)