
add_transition layer

Open shirleychangyuanyuan opened this issue 7 years ago • 1 comments

I found that in your code:

```python
def add_transition(name, l):
    shape = l.get_shape().as_list()
    in_channel = shape[3]
    with tf.variable_scope(name) as scope:
        l = BatchNorm('bn1', l)
        l = tf.nn.relu(l)
        l = Conv2D('conv1', l, in_channel, 1, stride=1, use_bias=False, nl=tf.nn.relu)
        l = AvgPooling('pool', l, 2)
    return l
```

After BN and ReLU, there is a 1x1 conv layer. However, you also pass nl=tf.nn.relu, so do you mean that another ReLU is needed after the conv layer? This differs from the configuration in DenseNet (Caffe version). Can you explain it to me? Thanks.
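For comparison, here is a minimal sketch of what I understand the paper / Caffe-style transition layer to be: BN → ReLU → 1x1 conv → 2x2 average pooling, with no second ReLU after the convolution. It reuses the tensorpack-style layers from the snippet above; passing nl=tf.identity to drop the extra nonlinearity is my own assumption of how one would change it, not code from this repo:

```python
import tensorflow as tf
from tensorpack import BatchNorm, Conv2D, AvgPooling  # layer helpers used in the repo

def add_transition_paper(name, l):
    # Transition layer as described in the DenseNet paper:
    # BN -> ReLU -> 1x1 conv -> 2x2 avg pooling.
    in_channel = l.get_shape().as_list()[3]
    with tf.variable_scope(name):
        l = BatchNorm('bn1', l)
        l = tf.nn.relu(l)
        # nl=tf.identity: no extra ReLU after the convolution
        # (assumed Caffe-style configuration, differs from the repo's nl=tf.nn.relu).
        l = Conv2D('conv1', l, in_channel, 1, stride=1, use_bias=False, nl=tf.identity)
        l = AvgPooling('pool', l, 2)
    return l
```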

shirleychangyuanyuan avatar Jan 23 '18 07:01 shirleychangyuanyuan

@shirleychangyuanyuan Hello, I have the same question as you. Have you found the answer since then? Thanks in advance.

Sirius083 avatar Mar 04 '19 07:03 Sirius083