DenseNetCaffe
Why not use BatchNorm in-place, any concern?
Hi @WenzhMicrosoft, sorry, what do you mean by in-place BatchNorm? We know Torch supports in-place ReLU, but we're not aware of an in-place BatchNorm layer.
Sorry, I thought the issue was opened on our Torch repo. Please ignore the comment above.
The reason is that an in-place BatchNorm layer would overwrite the incoming feature maps, which are still needed by later BatchNorm layers (DenseNet concatenates all earlier feature maps as input to subsequent layers).
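For illustration, this is what the distinction looks like in a Caffe prototxt. The layer and blob names (`bn_2`, `concat_1`) are made up for this sketch and are not taken from the generated model; the two variants contrast alternative definitions of the same layer, differing only in the `top` field:

```
# Out-of-place BatchNorm: a new output blob "bn_2" is allocated,
# so the concatenated features in "concat_1" stay intact and can
# still be read by later layers in the dense block.
layer {
  name: "bn_2"
  type: "BatchNorm"
  bottom: "concat_1"
  top: "bn_2"
}

# In-place BatchNorm (top == bottom): the normalized values are
# written back into "concat_1", overwriting the original features
# that later BatchNorm/Concat layers still need.
layer {
  name: "bn_2"
  type: "BatchNorm"
  bottom: "concat_1"
  top: "concat_1"
}
```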
Thanks for the quick reply! That makes sense. I wonder if I can still apply in-place BatchNorm in the transition layers and at the first BatchNorm layer (right after the data layer and the first conv layer).
@WenzhMicrosoft: I can answer the first point. In-place BatchNorm cannot be applied in the transition layers because the layer before it is a Concat layer. For the second point, I think we can use in-place BatchNorm there to save memory, but I am not sure about this point.
@John1231983, I don't understand why in-place BatchNorm can't be applied after a Concat layer; could you explain a little bit?
@WenzhMicrosoft: I am using Caffe and I confirm that it cannot. I do not know the reason; maybe Caffe does not support it.
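As a rough sketch of the second point asked above (the first BatchNorm right after the initial convolution), in-place use would look like the ResNet-style pattern below; the names (`conv1`, `bn_conv1`, `scale_conv1`) are illustrative. Whether it is actually safe depends on the generated prototxt: if the first convolution's output is also a bottom of a later Concat layer (as it is for a dense block's input), the same overwrite problem described above applies.

```
# Hypothetical in-place BatchNorm + Scale on the first convolution
# output. This only saves memory safely if "conv1" is not consumed
# in its original form by any other layer (e.g. a later Concat).
layer {
  name: "bn_conv1"
  type: "BatchNorm"
  bottom: "conv1"
  top: "conv1"            # in-place: reuse the conv1 blob
}
layer {
  name: "scale_conv1"
  type: "Scale"
  bottom: "conv1"
  top: "conv1"            # Caffe BatchNorm is usually paired with a Scale layer
  scale_param { bias_term: true }
}
```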