ReLU6 fails to fuse after QAT when using keras.layers.Activation('relu6')
Describe the bug
ReLU6 cannot be fused into Conv2D (Conv2D+BN+ReLU6) after QAT when the activation is specified via keras.layers.Activation('relu6').
The following work fine:
- keras.layers.ReLU(6)
- keras.layers.Activation('relu')
- keras.layers.ReLU()

So the workaround is to stick to keras.layers.ReLU(6); see the sketch below.
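A minimal sketch of the workaround, assuming a typical Conv2D+BN block (the surrounding layers and shapes are illustrative, not taken from the Colab):

```python
import tensorflow as tf

inputs = tf.keras.Input(shape=(32, 32, 3))
x = tf.keras.layers.Conv2D(16, 3, padding='same', use_bias=False)(inputs)
x = tf.keras.layers.BatchNormalization()(x)
# x = tf.keras.layers.Activation('relu6')(x)  # not fused after QAT (this bug)
x = tf.keras.layers.ReLU(6.)(x)               # fused into Conv2D+BN as expected
model = tf.keras.Model(inputs, x)
```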
System information
TensorFlow version (installed from source or binary): 2.9.2 (colab default)
TensorFlow Model Optimization version (installed from source or binary): 0.7.3 (pip default)
Python version: 3.8 (colab default)
Describe the expected behavior
keras.layers.Activation('relu6') should be fused into Conv2D, the same as keras.layers.Activation('relu').
Describe the current behavior
keras.layers.Activation('relu6') fails to be fused into Conv2D.
Code to reproduce the issue
Colab: https://colab.research.google.com/drive/1tuGvsuBsUiWUdks_i9glXgdUoUcFSXqI
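In case the Colab becomes unavailable, a reproduction sketch follows; the exact notebook contents (training step, converter flags) are assumptions, but it builds both variants, applies QAT, and converts to TFLite so the fused/unfused graphs can be compared:

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

def make_model(activation_layer):
    inputs = tf.keras.Input(shape=(32, 32, 3))
    x = tf.keras.layers.Conv2D(16, 3, padding='same', use_bias=False)(inputs)
    x = tf.keras.layers.BatchNormalization()(x)
    x = activation_layer(x)
    return tf.keras.Model(inputs, x)

variants = {
    'activation_relu6': tf.keras.layers.Activation('relu6'),  # not fused (bug)
    'relu_layer_6': tf.keras.layers.ReLU(6.),                 # fused (expected)
}

for name, act in variants.items():
    # Apply quantization-aware training wrappers.
    qat_model = tfmot.quantization.keras.quantize_model(make_model(act))
    converter = tf.lite.TFLiteConverter.from_keras_model(qat_model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    with open(f'{name}.tflite', 'wb') as f:
        f.write(converter.convert())

# Inspecting the two .tflite graphs (e.g. with Netron) shows a standalone
# ReLU6 op only in the Activation('relu6') variant.
```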
Screenshots
keras.layers.Activation('relu6') (screenshot)
keras.layers.ReLU(6) (screenshot)
Additional context
It is convenient for users to configure a model's activations via keras.layers.Activation(...). It may take some time to notice this unexpected behavior when keras.layers.Activation('relu6') is used.
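Until this is fixed, one way to keep the convenience of string-configured activations is a small helper that special-cases 'relu6' (a hypothetical helper, not part of Keras or tfmot):

```python
import tensorflow as tf

def activation_layer(name):
    # Hypothetical helper: route 'relu6' to the fusable ReLU(6.) layer,
    # fall back to Activation(name) for everything else.
    if name == 'relu6':
        return tf.keras.layers.ReLU(6.)
    return tf.keras.layers.Activation(name)

# Usage: x = activation_layer(config['activation'])(x)
```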
Thanks!
Thanks for reporting the bug. @Xhark Could you take a look?