
When training the resnet34 + Xnet configuration, the default decoder_filters parameters seem not to work

Open brb-chen opened this issue 5 years ago • 0 comments

I changed the main training script configuration to:

```python
model = Xnet(backbone_name=config.backbone,
             input_shape=(config.input_deps, config.input_rows, config.input_cols),
             n_upsample_blocks=4,
             decoder_filters=(64, 64, 128, 256, 512),
             encoder_weights=config.weights,
             decoder_block_type=config.decoder_block_type,
             classes=config.nb_class,
             activation=config.activation)
```

and the Xnet part of builder.py to:

```python
    if downterm[i+1] is not None:
        #interm[(n_upsample_blocks+1)*i+j+1] = up_block(decoder_filters[n_upsample_blocks-i-2],
        interm[(n_upsample_blocks+1)*i+j+1] = up_block(decoder_filters[i],
                                                       i+1, j+1, upsample_rate=upsample_rate,
                                                       skip=interm[(n_upsample_blocks+1)*i+j],
                                                       use_batchnorm=use_batchnorm)(downterm[i+1])
    else:
        interm[(n_upsample_blocks+1)*i+j+1] = None
    # print("\n{} = {} + {}\n".format(interm[(n_upsample_blocks+1)*i+j+1],
    #                                 interm[(n_upsample_blocks+1)*i+j],
    #                                 downterm[i+1]))
else:
    #interm[(n_upsample_blocks+1)*i+j+1] = up_block(decoder_filters[n_upsample_blocks-i-2],
    interm[(n_upsample_blocks+1)*i+j+1] = up_block(decoder_filters[i],
                                                   i+1, j+1, upsample_rate=upsample_rate,
                                                   skip=interm[(n_upsample_blocks+1)*i : (n_upsample_blocks+1)*i+j+1],
                                                   use_batchnorm=use_batchnorm)(interm[(n_upsample_blocks+1)*(i+1)+j])
    # print("\n{} = {} + {}\n".format(interm[(n_upsample_blocks+1)*i+j+1],
    #                                 interm[(n_upsample_blocks+1)*i : (n_upsample_blocks+1)*i+j+1],
    #                                 interm[(n_upsample_blocks+1)*(i+1)+j]))
```
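For reference, here is a quick standalone sketch (not part of the repo, just the values from this issue: decoder_filters=(64, 64, 128, 256, 512) and n_upsample_blocks=4) that compares which filter count the original indexing `n_upsample_blocks-i-2` and the changed indexing `i` pick at each first-column decoder stage:

```python
# Standalone comparison of the two decoder_filters indexing schemes used above.
# Values mirror the configuration in this issue; nothing here touches the library.
decoder_filters = (64, 64, 128, 256, 512)
n_upsample_blocks = 4

for i in range(n_upsample_blocks):
    default_idx = n_upsample_blocks - i - 2  # indexing in the original builder.py line (commented out above)
    changed_idx = i                          # indexing after the change above
    # Note: at the deepest stage default_idx becomes -1, which Python wraps to the last entry.
    print(f"stage i={i}: "
          f"original decoder_filters[{default_idx}] = {decoder_filters[default_idx]}, "
          f"changed decoder_filters[{changed_idx}] = {decoder_filters[changed_idx]}")
```

With these values the original indexing selects 128, 64, 64 and then 512 (via negative-index wraparound) across stages i=0..3, while the changed indexing selects 64, 64, 128, 256.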

I used resnet34 as the backbone to train the Xnet model on my own data (512×512×3). After checking the downsampling layers, the skip-connection layers, and the up-block (transpose at the moment) in detail, it seems the default decoder filter parameters do not work: the concatenate operation in the up-block requires its inputs to have matching dimensions, so after going through the network and skip-connection configuration I changed the decoder filters as shown above. Has anyone met the same situation? I just want to confirm. Thanks!
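For anyone hitting the same thing, here is a small inspection sketch, not a fix: it assumes the Keras Xnet from this repo is importable as below, and the classes/activation values are placeholders, so adjust them to your setup. It builds the model and prints the input shapes of every Concatenate layer, which makes it easy to spot the stage where the up-block output and the skip tensor disagree:

```python
from keras.layers import Concatenate
from segmentation_models import Xnet  # adjust the import path to however Xnet is exposed in your checkout

# Build the model with the configuration discussed in this issue
# (512x512x3 input, resnet34 backbone, transpose up-blocks).
model = Xnet(backbone_name='resnet34',
             input_shape=(512, 512, 3),
             n_upsample_blocks=4,
             decoder_filters=(64, 64, 128, 256, 512),
             decoder_block_type='transpose',
             classes=1,                # placeholder
             activation='sigmoid')     # placeholder

# For every Concatenate layer, print the shapes of its inputs; any stage where the
# shapes do not line up is where the decoder filter / skip configuration breaks down.
for layer in model.layers:
    if isinstance(layer, Concatenate):
        print(layer.name, layer.input_shape)
```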

brb-chen · Jan 18 '20, 14:01