Yaman Umuroglu
Thanks for providing the example models; I can reproduce the issue and I'm looking into a clean solution for this. The origin of the `fix_float64` (which indiscriminately replaces `float64` initializers...
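As an aside, the hazard of indiscriminately downcasting `float64` initializers can be sketched without ONNX at all. The snippet below is a minimal stand-in (assuming the replacement downcasts values to single precision, which is not stated explicitly in the comment): it round-trips a Python float through IEEE-754 `float32` via `struct` and shows the stored value silently changes.

```python
import struct

def to_float32(x):
    # Round-trip a Python float (64-bit) through IEEE-754 single precision,
    # mimicking what a blanket float64 -> float32 initializer conversion
    # would do to each stored value.
    return struct.unpack("f", struct.pack("f", x))[0]

val = 0.1  # not exactly representable in either precision
print(val == to_float32(val))  # → False: the downcast changes the value
print(0.5 == to_float32(0.5))  # → True: exactly representable, unchanged
```

Values that are exactly representable in 32 bits survive the conversion, which is why such a replacement can appear harmless on simple test models and still perturb others.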
I think I found the answer to the question I asked @HenniOVP above: it's actually not the case that ONNX only accepts 64-bit floats as attributes, but rather,...
Thanks for flagging this, Jovan. I had a quick look at the testcase, and it looks like the problem is not from inside `finn-base` but rather from `onnx.shape_inference.infer_shapes`, which we...
I had only used Netron to check that the shapes appeared for the intermediate tensors, but if I use `qonnx-exec` I actually see the same problem. The root of this...
I haven't been able to find out why the ONNX PR#2901 does not solve this issue, so I just added a workaround in `ModelWrapper` to do a fix for this...
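The general shape of such a workaround (filling in tensor shapes that a standard inference pass left missing) can be sketched with a stdlib-only mock. Everything here is illustrative: the function name, the graph representation, and the shape-preserving assumption are hypothetical stand-ins, not the actual `ModelWrapper` fix.

```python
# Hypothetical sketch of a shape-inference fallback: after the main
# inference pass, any tensor still missing a shape is given the shape of
# its producing node's input. Here every op is assumed shape-preserving,
# which stands in for the easy fallback cases only.

def infer_missing_shapes(shapes, nodes):
    # shapes: dict mapping tensor name -> shape tuple, or None when missing
    # nodes: topologically sorted list of (op, input_name, output_name)
    for op, inp, out in nodes:
        if shapes.get(out) is None and shapes.get(inp) is not None:
            shapes[out] = shapes[inp]
    return shapes

graph = [("Relu", "x", "y"), ("Identity", "y", "z")]
shapes = {"x": (1, 3, 224, 224), "y": None, "z": None}
print(infer_missing_shapes(shapes, graph))
```

Processing nodes in topological order lets a shape recovered for one tensor propagate to its consumers in a single pass.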
Hi Jacob, thanks for pointing this out. This repo (`finn-base`) is scheduled to be deprecated soon, and should be replaced in any dependent projects with the new QONNX (https://github.com/fastmachinelearning/qonnx) repo....
Hi Christoph, Thanks for the PR. I agree that this would be a useful cleanup transformation, but one remark after a brief review: since this gets called as part of...
LGTM, ready to merge. Thanks @iksnagreb !
This may be related to on-the-fly modifications of the `model.graph.node` container; I've previously observed some weird behavior too when nodes are being added to or removed from the container while some transformation iterates...
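The iterate-while-mutating pitfall described above can be reproduced with a plain Python list as a stand-in for the `model.graph.node` protobuf container (the function names below are illustrative): removing elements mid-iteration shifts the remainder, so the iterator silently skips entries, while iterating over a snapshot is stable.

```python
# Sketch of the iterate-while-mutating pitfall, using a plain Python list
# as a stand-in for the model.graph.node container.

def remove_even_buggy(nodes):
    # Removing items from the list we are iterating over shifts the
    # remaining elements, so the iterator silently skips some of them.
    for n in nodes:
        if n % 2 == 0:
            nodes.remove(n)
    return nodes

def remove_even_safe(nodes):
    # Iterating over a snapshot (list(nodes)) keeps the iterator stable
    # while the original container is modified.
    for n in list(nodes):
        if n % 2 == 0:
            nodes.remove(n)
    return nodes

print(remove_even_buggy([2, 4, 6, 8]))  # → [4, 8]: consecutive evens skipped
print(remove_even_safe([2, 4, 6, 8]))   # → []: all evens removed
```

The same symptom appears with repeated protobuf fields, which is why transformations that add or remove nodes mid-pass can behave unpredictably.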
What is meant by "tensors are not in-place anymore" here? What is the expected behavior with regard to tensor names?