tensorflow-onnx
Unable to convert node FakeQuantWithMinMaxArgs with num_bits=10?
Ask a Question
Hello all, when I convert a TensorFlow QAT model to ONNX with tf2onnx, I get an error:
raise ValueError("make_sure failure: " + error_msg % args)
ValueError: make_sure failure: Unable to convert node FakeQuantWithMinMaxArgs with num_bits=10
So I wonder: is this bit width simply not supported yet, or did I make a mistake? Thanks!
Currently, only num_bits=8 is supported.
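For context, here is a minimal pure-Python sketch (an illustration, not tf2onnx's or TensorFlow's actual implementation) of what `FakeQuantWithMinMaxArgs` computes: inputs are clamped to `[min, max]` and snapped onto a uniform grid of `2**num_bits` levels. The `num_bits` value only changes the grid resolution, which is why the converter's restriction to 8 bits is a limitation of the ONNX quantization mapping, not of the math itself:

```python
def fake_quant(x, minv=-6.0, maxv=6.0, num_bits=8):
    """Sketch of fake quantization: clamp x to [minv, maxv], then
    round it onto a uniform grid of 2**num_bits levels."""
    levels = 2 ** num_bits - 1          # e.g. 255 steps for num_bits=8
    scale = (maxv - minv) / levels      # width of one quantization step
    x = min(max(x, minv), maxv)         # clamp into the representable range
    return round((x - minv) / scale) * scale + minv
```

With `num_bits=8` the step size over `[-6, 6]` is about 0.047, so any input is reproduced to within roughly half a step; a larger `num_bits` would shrink that error but has no ONNX `QuantizeLinear`/`DequantizeLinear` counterpart in the 8-bit integer types the converter targets.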
Is there a plan to support other bit widths? Some networks require them.