text-classification-demos

problem about tf.layers.batch_normalization

Open xljhtq opened this issue 5 years ago • 4 comments

Hi, in the DPCNN code, I see that `tf.layers.batch_normalization` is used when training the model, but the `training` parameter is not set to True, e.g. it should be `conv3 = tf.layers.batch_normalization(conv3, training=True)`. So I want to know whether I have misunderstood something, or whether this is a coding error?
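Why the `training` flag matters: in training mode batch norm normalizes with the current batch's statistics and updates its moving averages; in inference mode it uses the stored moving averages. A minimal NumPy sketch of that behavior (not the repo's actual code, just an illustration of the semantics):

```python
import numpy as np

def batch_norm(x, moving_mean, moving_var, training, momentum=0.9, eps=1e-3):
    """Normalize x per feature. In training mode, use batch statistics and
    update the moving averages in place (what TF's update ops do); in
    inference mode, use the stored moving averages instead."""
    if training:
        mean = x.mean(axis=0)
        var = x.var(axis=0)
        moving_mean[:] = momentum * moving_mean + (1 - momentum) * mean
        moving_var[:] = momentum * moving_var + (1 - momentum) * var
    else:
        mean, var = moving_mean, moving_var
    return (x - mean) / np.sqrt(var + eps)

# Training step: batch statistics are used, moving averages get updated.
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=(64, 3))
moving_mean = np.zeros(3)
moving_var = np.ones(3)
y_train = batch_norm(x, moving_mean, moving_var, training=True)

# Inference step: the (partially warmed-up) moving averages are used,
# so the result differs from the training-mode output on the same batch.
y_infer = batch_norm(x, moving_mean, moving_var, training=False)
```

Note that in TF 1.x, setting `training=True` is not enough by itself: the moving-average update ops live in `tf.GraphKeys.UPDATE_OPS` and must be run alongside the train op, e.g. via `tf.control_dependencies(tf.get_collection(tf.GraphKeys.UPDATE_OPS))`.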

xljhtq avatar Mar 06 '19 14:03 xljhtq

Thanks for the reminder, the way `tf.layers.batch_normalization` is used in my code is wrong. I will correct it as soon as possible, and I am sorry for that, thanks again!!!

liyibo avatar Mar 07 '19 01:03 liyibo

@liyibo Hi, there is another problem I want to ask about. The paper says that "pre-activation refers to activation being done before weighting instead of after, as is typically done", but your code differs from that idea, e.g. `conv3 = tf.layers.conv2d(activation=tf.nn.relu)`. Here, is the activation inside `tf.layers.conv2d` done before or after the weighting? I think it is done after the weighting, which is the problem. Hope to get your reply!

xljhtq avatar Mar 07 '19 03:03 xljhtq

You are right, the correct order should be as follows: pre-activate the region_embedding (apply the relu first) and then do the conv without its own relu; after two layers of conv, add the region_embedding and the conv result together, then continue with the following steps. Thanks again, and please feel free to point out any other errors in the code!!!
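The order described above can be sketched with a toy 1-D, single-channel NumPy example (a simplification of the DPCNN block, not the repo's code): relu is applied before each convolution, the convolutions themselves carry no activation, and the shortcut adds the unactivated input back at the end.

```python
import numpy as np

def conv1d(x, w):
    """'Same'-padded 1-D convolution of sequence x (seq_len,) with kernel w (k,)."""
    k = len(w)
    pad = k // 2
    xp = np.pad(x, pad)
    return np.array([xp[i:i + k] @ w for i in range(len(x))])

def relu(x):
    return np.maximum(x, 0.0)

def pre_activation_block(region_emb, w1, w2):
    """Pre-activation block: relu comes BEFORE each weighting (conv),
    and the shortcut adds the original input back afterwards."""
    h = conv1d(relu(region_emb), w1)   # pre-activation: relu, then conv
    h = conv1d(relu(h), w2)            # second conv, again pre-activated
    return region_emb + h              # shortcut connection

rng = np.random.default_rng(1)
x = rng.normal(size=16)                # toy region embedding, one channel
w1 = rng.normal(size=3)
w2 = rng.normal(size=3)
out = pre_activation_block(x, w1, w2)
```

By contrast, `tf.layers.conv2d(..., activation=tf.nn.relu)` computes `relu(conv(x))`, i.e. activation after weighting, which is why the code as written did not match the paper's pre-activation design.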

liyibo avatar Mar 07 '19 10:03 liyibo

I want to know which TensorFlow version this code requires.

dengxiaotian123 avatar Jan 13 '20 14:01 dengxiaotian123