DenseNetCaffe

Why not use BatchNorm in-place, any concern?

WenzhMicrosoft opened this issue 8 years ago · 6 comments

WenzhMicrosoft · Jul 27 '17

Hi @WenzhMicrosoft, sorry, what do you mean by in-place BatchNorm? We know Torch supports in-place ReLU, but we're not aware of an in-place BatchNorm layer.

liuzhuang13 · Jul 27 '17

Sorry, I thought the issue was opened on our Torch repo. Please ignore the comment above.

The reason is that in-place BatchNorm layers would overwrite the incoming feature maps, which are reused as inputs by later BatchNorm layers in the dense block.
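The overwriting problem can be sketched with NumPy (an illustration of the aliasing issue, not Caffe's actual memory layout): in a dense block, later layers read the concatenation of all earlier feature maps, so normalizing one of those maps in place destroys an input a later layer still needs.

```python
import numpy as np

def batchnorm(x, out=None):
    """Toy per-map batch normalization; writes into `out` if given."""
    if out is None:
        out = np.empty_like(x)
    out[...] = (x - x.mean()) / (x.std() + 1e-5)
    return out

# Two feature maps produced by earlier layers in a dense block.
f1 = np.random.randn(4, 4)
f2 = np.random.randn(4, 4)
saved = f1.copy()

# One layer normalizes f1 *in place*, overwriting the stored map.
batchnorm(f1, out=f1)

# A later layer concatenates the earlier maps -- but the raw f1 is gone.
concat = np.concatenate([f1, f2])
print(np.allclose(f1, saved))  # False: the original features were destroyed
```

With `out=None` (out-of-place, as in the Caffe prototxt here), `f1` would survive for reuse, at the cost of extra memory.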

liuzhuang13 · Jul 27 '17

Thanks for the quick reply! That makes sense. I wonder if I can apply in-place BatchNorm in the transition layers and at the first BatchNorm layer (right after the data layer and first conv layer)?

WenzhMicrosoft · Jul 27 '17

@WenzhMicrosoft: I can answer the first point. In-place BatchNorm cannot be applied in the transition layer, because it is preceded by a Concat layer. For the second point, I think we can make that layer in-place to save memory, but I am not sure about this.
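For reference, Caffe expresses in-place computation by giving a layer the same blob name for `bottom` and `top`, so the layer writes its output over its input. A transition-layer sketch (layer and blob names here are illustrative, not taken from the repo's prototxt) would look like:

```
# A Concat layer followed by a BatchNorm layer declared in-place:
# the BatchNorm output overwrites the concatenated blob's memory.
layer {
  name: "concat_transition"
  type: "Concat"
  bottom: "conv1"
  bottom: "conv2"
  top: "concat_out"
}
layer {
  name: "bn_transition"
  type: "BatchNorm"
  bottom: "concat_out"
  top: "concat_out"   # top == bottom -> in-place; reported to fail after Concat
}
```

The thread reports that this combination does not work in Caffe; making the BatchNorm out-of-place (a distinct `top` name) avoids the problem.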

John1231983 · Jul 27 '17

@John1231983, I don't understand why in-place BatchNorm can't be applied after a Concat layer; could you explain a little?

WenzhMicrosoft · Jul 27 '17

@WenzhMicrosoft: I am using Caffe and I confirm that it cannot. I do not know the reason; maybe Caffe does not support it.

John1231983 · Jul 27 '17