
Failed quantization of dilated convolution layers: tensorflow or tensorflow-model-optimization bug?

Open · Ebanflo42 opened this issue 1 year ago · 0 comments

Describe the bug

TensorFlow Model Optimization fails to quantize dilated convolution layers.

System information

TensorFlow version (installed from source or binary): source

TensorFlow Model Optimization version (installed from source or binary): source

Python version: 3.10.12

Describe the expected behavior

Quantizing a dilated convolution layer should work essentially the same as quantizing any other convolution layer.

Describe the current behavior

Either tf or tfmot is silently failing. The following very old issue describes exactly this problem:

https://github.com/tensorflow/tensorflow/issues/26797

There is a slightly newer open issue showing that this was never resolved:

https://github.com/tensorflow/tensorflow/issues/53025

I am not 100% certain, but it seems like these issues are misplaced and should be designated as model-optimization issues.

There seems to be a workaround using tf.nn.conv2d instead of tf.keras.layers.Conv2D, but as far as I can tell this would require subclassing a layer, and based on other issues, quantization of subclassed layers is still buggy. A rough sketch of what I mean is shown below.
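To illustrate the workaround I have in mind (not verified to survive quantization; the layer name DilatedConv2D and its arguments are my own placeholders, not part of either library):

```python
import tensorflow as tf

class DilatedConv2D(tf.keras.layers.Layer):
    """Sketch of a subclassed layer wrapping tf.nn.conv2d with dilations."""

    def __init__(self, filters, kernel_size, dilation_rate=1, **kwargs):
        super().__init__(**kwargs)
        self.filters = filters
        self.kernel_size = kernel_size
        self.dilation_rate = dilation_rate

    def build(self, input_shape):
        in_channels = int(input_shape[-1])
        # HWIO kernel layout expected by tf.nn.conv2d.
        self.kernel = self.add_weight(
            name="kernel",
            shape=(self.kernel_size, self.kernel_size, in_channels, self.filters),
            initializer="glorot_uniform",
            trainable=True,
        )

    def call(self, inputs):
        # Dilation is passed directly to the low-level op instead of
        # relying on Conv2D's dilation_rate handling.
        return tf.nn.conv2d(
            inputs,
            self.kernel,
            strides=1,
            padding="SAME",
            dilations=self.dilation_rate,
        )
```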

Code to reproduce the issue

See the aforementioned issues. A minimal sketch of the kind of setup they describe is included below.
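This is only a minimal sketch distilled from the linked issues, not their exact code; the shapes and layer sizes are arbitrary, and the only relevant detail is the dilation_rate=2 on the second Conv2D:

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Tiny model whose only unusual property is a dilated Conv2D.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(8, 3, padding="same", activation="relu"),
    tf.keras.layers.Conv2D(8, 3, padding="same", dilation_rate=2, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10),
])

# Quantization-aware annotation; the dilated layer should be handled
# like any other Conv2D here.
quantized_model = tfmot.quantization.keras.quantize_model(model)
quantized_model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)

# Convert to TFLite; per the linked issues, quantization of the dilated
# convolution goes wrong somewhere along this path.
converter = tf.lite.TFLiteConverter.from_keras_model(quantized_model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
```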

Ebanflo42 · May 13 '24 14:05