onnx-tensorflow
BackendIsNotSupposedToImplementIt: Unsqueeze version 13 is not implemented.
Right now I am trying to port the silero_vad model from ONNX format to TensorFlow with onnx_tf.
However, after .export_graph the following error occurs:
BackendIsNotSupposedToImplementIt: in user code:
File "c:\users\rtm51\downloads\onnx-tensorflow\onnx_tf\backend_tf_module.py", line 99, in __call__ *
output_ops = self.backend._onnx_node_to_tensorflow_op(onnx_node,
File "c:\users\rtm51\downloads\onnx-tensorflow\onnx_tf\backend.py", line 347, in _onnx_node_to_tensorflow_op *
return handler.handle(node, tensor_dict=tensor_dict, strict=strict)
File "c:\users\rtm51\downloads\onnx-tensorflow\onnx_tf\handlers\handler.py", line 61, in handle *
raise BackendIsNotSupposedToImplementIt("{} version {} is not implemented.".format(node.op_type, cls.SINCE_VERSION))
BackendIsNotSupposedToImplementIt: Unsqueeze version 13 is not implemented.
You can get the model here
Hello,
I am also getting the same error while converting a YOLOv5 ONNX model to TensorFlow format: BackendIsNotSupposedToImplementIt: Unsqueeze version 13 is not implemented.
Can you please suggest how I can fix this error?
Thank you
Unfortunately the spec change for Unsqueeze version 13 has not been implemented. Contributions are certainly welcome! In the meantime, you could export the model with ONNX opset 12, and the conversion should work.
Do you mean that firstly I have to port my model to Onnx format via onnx opset 12 and then use current version of onnx-tensorflow?
Yes.
I can't export my PyTorch model except with opset_version=14, and when I try export_graph I get the same error: "BackendIsNotSupposedToImplementIt: Squeeze version 13 is not implemented."
Currently I'm on onnx-tf 1.9.0, onnx 1.10.2, tensorflow 2.7.0.
So I won't be able to go via ONNX opset 12. Is there any workaround possible for me?
I also have the same issue with yolov3.onnx model (from https://github.com/ultralytics/yolov3)
I have the same issue with fairseq transformer translate model.
When I try to export my PyTorch translation model to ONNX with opset_version=12, an error is raised: RuntimeError: Exporting the operator triu to ONNX opset version 12 is not supported. Support for this operator was added in version 14, try exporting with this version. So I have to export the translation model with opset_version=14. However, when I then try to convert the ONNX model into a TensorFlow pb checkpoint, it raises the same error as the author's.
I experienced the same issue. The model I'm trying to convert to TF must use opset 13 when converted from PT -> ONNX. Thanks so much to @krishnanNuance for working on it! See here: adding squeeze/unsqueeze for opset 13 👍
I think you can close this issue @chinhuang007 😊
Hello, I am facing the same issue. Originally I had an ONNX model with opset 14, then I downgraded it to both 12 and 13, but I keep getting the same error. What do I have to do to add squeeze/unsqueeze to the opsets?
You need to add the changes from this branch: https://github.com/onnx/onnx-tensorflow/pull/1022
I was in a similar situation. I had to use opset-16 for the pt->onnx conversion because the "grid_sample" node is only supported in opset-16. How did you solve the problem?
Currently, I'm on onnx-tf 1.10.0, onnx 1.13.0, tensorflow 2.6.0.
What should we do to solve this issue? https://stackoverflow.com/questions/75969364/backendisnotsupposedtoimplementit-error-converting-onnx-to-tensorflow "BackendIsNotSupposedToImplementIt Error: Converting ONNX to Tensorflow"
I could see a comment in the link below saying this was fixed on the onnx developer side. I'm running in Google Colab, but the issue still persists. I think I can't build onnx_tf from source in Colab, as opposed to just running 'pip install onnx_tf'. https://github.com/onnx/onnx-tensorflow/pull/1022
Unfortunately the issue persists, even though I'm already using release v1.10.0, where pull request #1022 should have been merged.
Same here, using onnx-tf 1.10.0 but still seeing this.
I was in a similar situation. I had to use opset-16 for the pt->onnx conversion, because the node "grid_sample" is only supported in opset-16. Then the error BackendIsNotSupposedToImplementIt: Unsqueeze version 13 is not implemented. was raised. How can I solve this?
Still facing the same issue using onnx-tf 1.10.0. I tried with opsets 13, 14, and 16, but that does not solve the problem. I could not try with opset 12 or lower because I get this error: RuntimeError: D:\a\onnx\onnx\onnx\onnx/version_converter/BaseConverter.h:70: adapter_lookup: Assertion false
failed: No Adapter To Version $12 for Relu
What can I do?