
Implement the ConvTranspose operation

Open mayjs opened this issue 1 year ago • 7 comments

I have a first draft of a working ConvTranspose implementation. I still want to add support for padding and maybe groups before we merge this. We should also handle some more edge cases for unsupported attribute values, so that we at least throw an informative error.

mayjs · Aug 02 '23 20:08

Okay, I think we have the most important features of ConvTranspose supported now.

We are still missing support for the 1D and 3D variants, output_padding, auto_pad, dilations, and group. I'm also not sure whether this would work with anything other than f32, so I added a check for the data type as well.
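For context on the missing attributes: the ONNX spec derives the output spatial size from the strides, dilations, pads, and output_padding. A minimal sketch of that formula for the explicit-pads case (auto_pad=NOTSET); the function name is just for illustration:

```python
def convtranspose_output_size(in_size, kernel, stride=1, dilation=1,
                              pad_begin=0, pad_end=0, output_padding=0):
    # Per the ONNX ConvTranspose spec (auto_pad=NOTSET):
    # out = stride * (in - 1) + output_padding
    #       + ((kernel - 1) * dilation + 1) - pad_begin - pad_end
    return (stride * (in_size - 1) + output_padding
            + ((kernel - 1) * dilation + 1) - pad_begin - pad_end)

# Example: a 2x upsampling layer as found in UNet decoders
# (kernel 2, stride 2, no padding) maps 16 -> 32.
assert convtranspose_output_size(16, kernel=2, stride=2) == 32
```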

@pixelspark I didn't run the ONNX backend tests locally, so we'll have to see if the CI check passes.

Do you want me to annotate the limitations in the support table in the README?

mayjs · Aug 12 '23 19:08

Also, as I mentioned before, this was the missing piece I needed to run UNet. I have successfully used it to run a pretrained nind-denoise model. We could add a test for UNet to the repository, as you already suggested in !183, but I'm not sure how we would handle the rather large ONNX file.

Creating a randomized graph might be a nice way to test the functionality, but it wouldn't be of any further use to anyone.
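That said, even a tiny hand-built graph could serve as a cheap regression test alongside the backend tests. A sketch using onnx.helper; the shapes and file name are illustrative and not tied to anything in the repository:

```python
import numpy as np
import onnx
from onnx import TensorProto, helper

# Minimal ConvTranspose graph: 1x1x3x3 input, 2x2 kernel of ones, stride 2 -> 1x1x6x6 output.
X = helper.make_tensor_value_info("X", TensorProto.FLOAT, [1, 1, 3, 3])
Y = helper.make_tensor_value_info("Y", TensorProto.FLOAT, [1, 1, 6, 6])
W = helper.make_tensor(
    "W", TensorProto.FLOAT, [1, 1, 2, 2],
    np.ones((1, 1, 2, 2), dtype=np.float32).flatten().tolist(),
)
node = helper.make_node(
    "ConvTranspose", inputs=["X", "W"], outputs=["Y"],
    kernel_shape=[2, 2], strides=[2, 2],
)
graph = helper.make_graph([node], "convtranspose_test", [X], [Y], initializer=[W])
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 13)])
onnx.checker.check_model(model)
onnx.save(model, "convtranspose_test.onnx")
```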

mayjs · Aug 12 '23 19:08

@mayjs the backend test appears to have passed?

Re the UNet sample: how large would the download size be for running the example/test? I might be able to host the files (I have a VPS with some traffic budget).

pixelspark · Aug 13 '23 09:08

@pixelspark Hm, looking through the log files (https://github.com/webonnx/wonnx/actions/runs/5843079870/job/15844866120?pr=182), ConvTranspose is not listed among the loaded / supported operations. I'm not sure where the operations are registered, but it looks like the ConvTranspose backend tests are not running. I'll look into that.

The model I'm using is about 57 MB. I plan to ship it as part of NeuraTable in the future, so I'll need to figure out a deployment solution anyway. My current plan is to create a fork of nind-denoise and upload the pretrained ONNX file as a release artifact, so a test runner for wonnx could perhaps just pull it from there.

Another issue is the license: I'm not sure what the implications would be if a wonnx test loads a GPL-licensed model during testing. Strictly speaking, the model is not integrated into the software, but I'm really unsure about the legal situation here.
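Setting the license question aside, if the file does end up as a release artifact, the test could fetch and cache it on first run. A rough sketch; the URL and paths below are placeholders, not a real location:

```python
import os
import urllib.request

# Placeholder URL: would point at the release artifact of the hypothetical nind-denoise fork.
MODEL_URL = "https://example.com/releases/download/v0.1/nind-denoise-unet.onnx"
MODEL_PATH = os.path.join("target", "test-models", "nind-denoise-unet.onnx")

def fetch_model() -> str:
    """Download the ~57 MB model once and cache it for subsequent test runs."""
    if not os.path.exists(MODEL_PATH):
        os.makedirs(os.path.dirname(MODEL_PATH), exist_ok=True)
        urllib.request.urlretrieve(MODEL_URL, MODEL_PATH)
    return MODEL_PATH
```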

mayjs · Aug 13 '23 11:08

Hm, maybe I'm misunderstanding something. I thought that adding the test cases to test_onnx_backend.py would let them run in CI. I took the test names from here: https://github.com/onnx/onnx/blob/rel-1.13.1/onnx/backend/test/case/node/convtranspose.py

Any idea why the test is not executed? Or am I just not reading the output correctly?
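For reference, when the stock onnx.backend.test harness is used with include patterns, only case names matching those patterns are run, so it may simply be that the pattern doesn't match the convtranspose cases. A sketch of how the registration usually looks, assuming test_onnx_backend.py follows the standard harness layout (I haven't checked the exact names used there; WonnxBackend below is a placeholder, not the real class):

```python
import onnx.backend.base
import onnx.backend.test

class WonnxBackend(onnx.backend.base.Backend):
    """Placeholder for the Backend subclass that test_onnx_backend.py actually defines."""
    @classmethod
    def supports_device(cls, device: str) -> bool:
        return device == "CPU"

backend_test = onnx.backend.test.BackendTest(WonnxBackend, __name__)

# Case names come from onnx/backend/test/case/node/convtranspose.py,
# e.g. test_convtranspose, test_convtranspose_1d, test_convtranspose_3d.
backend_test.include(r"test_convtranspose")

# Expose the generated unittest cases to pytest/unittest discovery.
globals().update(backend_test.enable_report().test_cases)
```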

mayjs · Aug 13 '23 15:08

Probably the easiest way to fix this would be to run the tests locally... I can try later if necessary.

pixelspark · Aug 15 '23 15:08

> Probably the easiest way to fix this would be to run the tests locally... I can try later if necessary.

Yeah, I guess that would be easier. I didn't have time to set up the required Python environment, and I suspect it might take a bit of work for me since I'm on NixOS, so if you have time to take a look at this, that would be great :)

mayjs · Aug 15 '23 20:08