
freezing batch normalization layer

Open f-grimaldi opened this issue 4 years ago • 1 comment

Hi, I've just started working on semantic segmentation with TensorFlow and this package has helped so much, so thanks a lot for the great work!

There is just one thing that I don't understand and find potentially wrong: the model does not freeze the BatchNormalization layers of the encoder when we freeze the encoder.

For example, I'm using Unet with a pre-trained backbone as a feature extractor, so what I do is the following: `model = Unet(backbone, input_shape=shape, classes=n_classes, activation=mode, encoder_freeze=True)`. The problem is that setting `encoder_freeze=True` calls `models._utils.freeze_model`, which does not freeze the running mean and variance, but instead does something very similar to setting `layer.trainable = False` for every layer in `model.layers` that is not an instance of `layers.BatchNormalization`.
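
In other words, as far as I can tell the freezing logic amounts to something like this sketch (my paraphrase of the observed behavior, not the library's exact code):

```python
from tensorflow.keras import layers

def freeze_model(model):
    # Rough sketch: every layer except BatchNormalization gets
    # trainable=False, so the BN moving mean/variance keep updating
    # while the encoder is supposedly frozen.
    for layer in model.layers:
        if not isinstance(layer, layers.BatchNormalization):
            layer.trainable = False
```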

So, is this the intended behavior? According to the [Keras transfer learning guide](https://keras.io/guides/transfer_learning/), we would also want to freeze the BN running mean and variance when using the encoder purely as a feature extractor (with its weights frozen). So when the encoder is frozen, I'd expect something similar to this:

features = encoder.predict(x)  # inference mode
out = decoder(features)        # training mode
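
For what it's worth, a possible workaround I'm considering (assuming the tf.keras backend, where setting `trainable = False` on a `BatchNormalization` layer also makes it run in inference mode) is to freeze the BN layers manually after building the model. The backbone, shape, and class values below are hypothetical placeholders:

```python
import tensorflow as tf
from segmentation_models import Unet

# Example values; substitute your own backbone/shape/classes/activation.
model = Unet('resnet34', input_shape=(256, 256, 3), classes=2,
             activation='softmax', encoder_freeze=True)

# Also freeze every BatchNormalization layer so its moving mean/variance
# stop updating. In TF2 tf.keras, trainable=False on a BN layer makes it
# run in inference mode even while the rest of the model is training.
# Note: this loop touches decoder BN layers too; restrict it (e.g. by
# layer name) if only the encoder's BN layers should be frozen.
for layer in model.layers:
    if isinstance(layer, tf.keras.layers.BatchNormalization):
        layer.trainable = False

model.compile(optimizer='adam', loss='categorical_crossentropy')
```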

Thank you for your time!

f-grimaldi · Feb 03 '21 15:02

I'm also wondering this!

murphp30 · Jun 29 '22 13:06