Need more specific padding behavior for convolution layers
Describe the feature and the current behavior/state.
Convolution layers in the Keras API only support "same", "valid", "causal", and "full" padding. However, these padding modes differ only shape-wise; numerically they are all zero-padding. If we need "reflect" or "symmetric" padding, as 'tf.pad()' provides, it is very inconvenient.
So, I want to make a wrapper called "ConvPadConcretization", where "ConvPad" names the concerned object and "Concretization" the operation. Users could then specify more specific padding behavior for convolution layers, such as:
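For reference, these are the padding modes that `tf.pad()` already supports but Keras convolution layers cannot express:

```python
import tensorflow as tf

x = tf.constant([[1., 2., 3.]])

# Zero padding -- the only behavior Keras conv layers offer today.
print(tf.pad(x, [[0, 0], [1, 1]], mode="CONSTANT").numpy())   # [[0. 1. 2. 3. 0.]]
# Reflect padding (mirror without repeating the edge element).
print(tf.pad(x, [[0, 0], [1, 1]], mode="REFLECT").numpy())    # [[2. 1. 2. 3. 2.]]
# Symmetric padding (mirror including the edge element).
print(tf.pad(x, [[0, 0], [1, 1]], mode="SYMMETRIC").numpy())  # [[1. 1. 2. 3. 3.]]
```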
x = tf.constant([1., 2., 3., 4., 5.], shape=[1, 5, 1])
conv1d_ = tf.keras.layers.Conv1D(filters=1, kernel_size=[2,]*1, strides=(1,)*1, padding="same")
conv1d = ConvPadConcretization(conv1d_, padding_mode='constant', padding_constant=1)
y = conv1d(x)
Relevant information
- Are you willing to contribute it (yes/no): yes
- Are you willing to maintain it going forward? (yes/no): yes
Which API type would this fall under (layer, metric, optimizer, etc.)
layer
Who will benefit with this feature?
Those who often use Conv1D, Conv2D, or Conv3D in tf.keras.layers
Principle.
Consider a tf.keras.layers.Conv1D layer with 'same' padding. The wrapper mimics this layer's procedure in two steps:
- First, reproduce the layer's padding behavior with tf.pad() and pad the inputs
- Second, run a convolution with 'valid' padding on the padded inputs
So the wrapped layer's output will equal the original layer's output shape-wise.
More specific padding behavior can then be injected into the tf.pad() call of the first step.
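The two steps can be checked numerically. TensorFlow's 'same' padding adds a total of max((ceil(in/stride) - 1) * stride + kernel - in, 0) elements along a dimension, with the extra element (when the total is odd) on the right; a sketch verifying that explicit tf.pad() plus a 'valid' convolution reproduces the 'same' convolution:

```python
import numpy as np
import tensorflow as tf

x = tf.random.normal([1, 5, 1])

conv_same = tf.keras.layers.Conv1D(1, kernel_size=2, strides=1, padding="same")
y_same = conv_same(x)  # calling the layer builds its kernel

# Step 1: pad the input ourselves, following TF's 'same' rule.
in_len, k, s = 5, 2, 1
out_len = -(-in_len // s)                        # ceil(in_len / s)
pad_total = max((out_len - 1) * s + k - in_len, 0)
pad_left, pad_right = pad_total // 2, pad_total - pad_total // 2
x_padded = tf.pad(x, [[0, 0], [pad_left, pad_right], [0, 0]], mode="CONSTANT")

# Step 2: run the same kernel with 'valid' padding on the padded input.
conv_valid = tf.keras.layers.Conv1D(1, kernel_size=2, strides=1, padding="valid")
conv_valid.build(x_padded.shape)
conv_valid.set_weights(conv_same.get_weights())
y_valid = conv_valid(x_padded)

print(np.allclose(y_same.numpy(), y_valid.numpy()))  # True: shapes and values match
```

Replacing mode="CONSTANT" in step 1 with "REFLECT" or "SYMMETRIC" is exactly the injection point the wrapper exposes.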
The first step is difficult because Keras's padding behavior is hidden inside generated C++ operations such as Conv2DBackpropInput.
Luckily, I finally worked out the rule behind Keras's padding behavior and was able to reproduce it.
I have implemented this wrapper carefully and tested it thoroughly: it preserves the original layer's shape-wise behavior and only changes the numerical behavior. So, if one specifies a conv layer with "constant" padding of value 0, the output will equal the original layer's.
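A minimal sketch of what the proposed wrapper could look like for the 1-D, 'same'-padding case (the class name and arguments follow the example above; the actual contribution would also need to handle Conv2D/Conv3D, dilation, and dynamic input lengths):

```python
import tensorflow as tf

class ConvPadConcretization(tf.keras.layers.Layer):
    """Sketch: wrap a Conv1D built with 'same' padding, replacing its
    implicit zero padding with an explicit tf.pad() in the chosen mode."""

    def __init__(self, conv_layer, padding_mode="constant", padding_constant=0):
        super().__init__()
        self.padding_mode = padding_mode.upper()
        self.padding_constant = padding_constant
        # Clone the layer's config, but run the convolution itself as 'valid'.
        config = conv_layer.get_config()
        config["padding"] = "valid"
        self.conv = conv_layer.__class__.from_config(config)
        self.kernel_size = config["kernel_size"][0]
        self.strides = config["strides"][0]

    def call(self, inputs):
        # Reproduce TF's 'same' padding amounts (extra element on the right).
        in_len = int(inputs.shape[1])
        out_len = -(-in_len // self.strides)  # ceil(in_len / strides)
        pad_total = max((out_len - 1) * self.strides + self.kernel_size - in_len, 0)
        paddings = [[0, 0], [pad_total // 2, pad_total - pad_total // 2], [0, 0]]
        if self.padding_mode == "CONSTANT":
            padded = tf.pad(inputs, paddings, "CONSTANT",
                            constant_values=self.padding_constant)
        else:  # 'REFLECT' or 'SYMMETRIC'
            padded = tf.pad(inputs, paddings, self.padding_mode)
        return self.conv(padded)

x = tf.constant([1., 2., 3., 4., 5.], shape=[1, 5, 1])
conv1d_ = tf.keras.layers.Conv1D(filters=1, kernel_size=2, strides=1, padding="same")
conv1d = ConvPadConcretization(conv1d_, padding_mode="constant", padding_constant=1)
y = conv1d(x)  # same output shape as conv1d_(x): (1, 5, 1)
```

Note this sketch clones the inner layer's config rather than reusing its weights; the real wrapper would presumably share the wrapped layer's variables directly.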