UNetPlusPlus
Deep supervision
Hi @MrGiovanni, thank you for sharing your code. I have a question about the implementation of the deep supervision structure. In the paper you say that the final output of the model is the average of the four branch outputs. I computed the averaged output feature map as shown below, but got the error that follows:
nestnet_output_1 = Conv2D(num_class, (1, 1), activation='sigmoid', name='output_1', kernel_initializer = 'he_normal', padding='same', kernel_regularizer=l2(1e-4))(conv1_2)
nestnet_output_2 = Conv2D(num_class, (1, 1), activation='sigmoid', name='output_2', kernel_initializer = 'he_normal', padding='same', kernel_regularizer=l2(1e-4))(conv1_3)
nestnet_output_3 = Conv2D(num_class, (1, 1), activation='sigmoid', name='output_3', kernel_initializer = 'he_normal', padding='same', kernel_regularizer=l2(1e-4))(conv1_4)
nestnet_output_4 = Conv2D(num_class, (1, 1), activation='sigmoid', name='output_4', kernel_initializer = 'he_normal', padding='same', kernel_regularizer=l2(1e-4))(conv1_5)
nestnet_output_all = (nestnet_output_1+nestnet_output_2+nestnet_output_3+nestnet_output_4)/4
if deep_supervision:
    # model = Model(input=img_input, output=[nestnet_output_1,
    #                                        nestnet_output_2,
    #                                        nestnet_output_3,
    #                                        nestnet_output_4])
    model = Model(input=img_input, output=[nestnet_output_all])
else:
    model = Model(input=img_input, output=[nestnet_output_4])
Using TensorFlow backend.
D:\code\unet-master\revisedModel.py:116: UserWarning: Update your `Model` call to the Keras 2 API: `Model(outputs=[<tf.Tenso..., inputs=Tensor("ma...)`
  model = Model(input=img_input, output=[nestnet_output_all])
Traceback (most recent call last):
  File "D:/code/unet-master/revisedModelTrain.py", line 16, in ...
ValueError: Output tensors to a Model must be the output of a Keras `Layer` (thus holding past layer metadata). Found: Tensor("truediv:0", shape=(?, 256, 256, 1), dtype=float32)
Can you help with this? Thanks!
Hello, I have the same problem as you. How did you solve it? @twentyfiveYang
Try this. keras.layers.Average is a proper Keras layer, so the averaged tensor carries layer metadata and is a valid Model output, unlike the raw (a + b + c + d) / 4 expression:
nestnet_output_all = keras.layers.Average()([nestnet_output_1,nestnet_output_2,nestnet_output_3,nestnet_output_4])
if deep_supervision:
model = Model(inputs=[img_input], outputs=[nestnet_output_all])
else:
model = Model(inputs=img_input, outputs=[nestnet_output_4])
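For reference, here is a minimal, self-contained sketch of the same idea (standalone Keras 2 API; the input shape and layer names are placeholders, not the repository's code), showing why the Average layer works where plain tensor arithmetic does not:

from keras.layers import Input, Conv2D, Average
from keras.models import Model

# Placeholder input and four 1x1-conv side outputs, standing in for the
# conv1_2 .. conv1_5 feature maps of the real network.
img_input = Input(shape=(256, 256, 1))
side_outputs = [
    Conv2D(1, (1, 1), activation='sigmoid', name='output_%d' % (i + 1))(img_input)
    for i in range(4)
]

# Average() is a Keras layer, so its result can be used as a Model output.
# Plain arithmetic such as (o1 + o2 + o3 + o4) / 4 yields a raw backend
# tensor (the "truediv:0" in the traceback above), which Model() rejects.
averaged = Average()(side_outputs)
model = Model(inputs=img_input, outputs=[averaged])
model.summary()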
In helper_functions.py he writes it as:
if deep_supervision:
    model = Model(input=img_input, output=[nestnet_output_1, nestnet_output_2, nestnet_output_3, nestnet_output_4])
else:
    model = Model(input=img_input, output=[nestnet_output_4])
I have the same question.
Hi @Einshowstank
The deep supervision code is below (you don't need to average the outputs before computing the loss):
% Architecture definition
def UNetPlusPlus(img_rows, img_cols, color_type=1, num_class=1, connection='concatenation', deep_supervision=False):
    ... ...
    nestnet_output_1 = Conv2D(num_class, (1, 1), activation='sigmoid', name='output_1', kernel_initializer='he_normal', padding='same', kernel_regularizer=l2(1e-4))(conv1_2)
    nestnet_output_2 = Conv2D(num_class, (1, 1), activation='sigmoid', name='output_2', kernel_initializer='he_normal', padding='same', kernel_regularizer=l2(1e-4))(conv1_3)
    nestnet_output_3 = Conv2D(num_class, (1, 1), activation='sigmoid', name='output_3', kernel_initializer='he_normal', padding='same', kernel_regularizer=l2(1e-4))(conv1_4)
    nestnet_output_4 = Conv2D(num_class, (1, 1), activation='sigmoid', name='output_4', kernel_initializer='he_normal', padding='same', kernel_regularizer=l2(1e-4))(conv1_5)
    if deep_supervision:
        model = Model(input=img_input, output=[nestnet_output_1,
                                               nestnet_output_2,
                                               nestnet_output_3,
                                               nestnet_output_4])
    else:
        model = Model(input=img_input, output=[nestnet_output_4])
    return model
% Model compiling
if config.deep_supervision:
    model.compile(optimizer='Adam',
                  loss={'output_1': bce_dice_loss, 'output_2': bce_dice_loss,
                        'output_3': bce_dice_loss, 'output_4': bce_dice_loss},
                  metrics={'output_1': dice_coef, 'output_2': dice_coef,
                           'output_3': dice_coef, 'output_4': dice_coef},
                  loss_weights={'output_1': 1., 'output_2': 1., 'output_3': 1., 'output_4': 1.})
else:
    model.compile(optimizer='Adam',
                  loss=bce_dice_loss,
                  metrics=['binary_crossentropy', mean_iou, dice_coef])
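The compile step refers to bce_dice_loss, dice_coef, and mean_iou, which are defined elsewhere in the repository. As a rough sketch only (not the repository's exact definitions), the loss and Dice metric commonly look like this:

from keras import backend as K
from keras.losses import binary_crossentropy

def dice_coef(y_true, y_pred, smooth=1.0):
    # Soft Dice coefficient computed over the flattened masks.
    y_true_f = K.flatten(y_true)
    y_pred_f = K.flatten(y_pred)
    intersection = K.sum(y_true_f * y_pred_f)
    return (2.0 * intersection + smooth) / (K.sum(y_true_f) + K.sum(y_pred_f) + smooth)

def bce_dice_loss(y_true, y_pred):
    # Binary cross-entropy plus (1 - Dice), a common combination for
    # binary segmentation.
    return binary_crossentropy(y_true, y_pred) + (1.0 - dice_coef(y_true, y_pred))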
Thanks,
Zongwei
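To complete the picture, a hedged sketch of how such a multi-output model could be trained and used: the same ground-truth mask is fed to every named output, so each branch is supervised against the full-resolution labels. X_train, Y_train, X_test, and the batch settings are placeholders, not values from the repository:

# Hypothetical arrays: X_train/X_test hold the images, Y_train the masks.
history = model.fit(X_train,
                    {'output_1': Y_train, 'output_2': Y_train,
                     'output_3': Y_train, 'output_4': Y_train},
                    batch_size=8,
                    epochs=100,
                    validation_split=0.2)

# At inference the model returns one prediction per branch; keep output_4
# alone, or average all four as described in the question above.
p1, p2, p3, p4 = model.predict(X_test)
pred = (p1 + p2 + p3 + p4) / 4.0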