densenet-tensorflow
add_transition layer
I found this in your code:

```python
def add_transition(name, l):
    shape = l.get_shape().as_list()
    in_channel = shape[3]
    with tf.variable_scope(name) as scope:
        l = BatchNorm('bn1', l)
        l = tf.nn.relu(l)
        l = Conv2D('conv1', l, in_channel, 1, stride=1, use_bias=False, nl=tf.nn.relu)
        l = AvgPooling('pool', l, 2)
    return l
```
After BN and ReLU there is a 1×1 conv layer, but you also pass `nl=tf.nn.relu`. Do you mean that another ReLU is still needed after the conv layer? This differs from the configuration in DenseNet (Caffe version). Can you explain this to me? Thanks.
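For comparison, here is a minimal sketch of the transition layer as the DenseNet paper describes it (BN → ReLU → 1×1 conv → 2×2 average pooling, with no activation after the conv). I am assuming the same tensorpack API as the snippet above, and that passing `nl=tf.identity` is how the activation is disabled in this version:

```python
import tensorflow as tf
from tensorpack import BatchNorm, Conv2D, AvgPooling

def add_transition_paper(name, l):
    # Transition layer per the DenseNet paper: BN -> ReLU -> 1x1 conv -> avg pool
    in_channel = l.get_shape().as_list()[3]
    with tf.variable_scope(name):
        l = BatchNorm('bn1', l)
        l = tf.nn.relu(l)
        # nl=tf.identity (assumed): no extra ReLU after the 1x1 conv,
        # unlike nl=tf.nn.relu in the snippet above
        l = Conv2D('conv1', l, in_channel, 1, stride=1,
                   use_bias=False, nl=tf.identity)
        l = AvgPooling('pool', l, 2)
    return l
```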
@shirleychangyuanyuan Hello, I have the same question as you. Did you ever find the answer? Thanks in advance.