NiftyNet
Layers should allow more options than just 'batch norm' and 'group norm' (instance norm?)
Layers have a with_bn option that is overridden for group normalization by having a positive group size. Instead of this flag, we should have a bn_type string that determines which type of normalization to apply.
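One possible shape for such a flag, sketched below; resolve_norm_type and the accepted values are hypothetical names for illustration, not NiftyNet API:

```python
# Hypothetical sketch: a single string flag replacing the boolean
# with_bn plus the positive-group_size override described above.
VALID_NORM_TYPES = {'batch', 'group', 'instance', 'none'}

def resolve_norm_type(norm_type, group_size=None):
    """Validate a normalization flag and its required parameters."""
    if norm_type not in VALID_NORM_TYPES:
        raise ValueError(
            'unknown normalization type %r, expected one of %s'
            % (norm_type, sorted(VALID_NORM_TYPES)))
    # group norm is the only type that needs an extra parameter
    if norm_type == 'group' and not (group_size and group_size > 0):
        raise ValueError(
            'group normalization requires a positive group_size')
    return norm_type
```

This keeps the group_size parameter meaningful only when it is actually used, instead of letting a positive value silently override with_bn.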
I find the variable name bn_type confusing if you end up not using BN... Maybe replace it with featnorm_type or something along these lines?
Should we take the opportunity of this PR to address #285 at the same time?
In the spirit of TF, we could also maybe go for feature_normalization as a flag name (see discussions in https://github.com/NifTK/NiftyNet/pull/282).
Fixing #285 will break some of the model zoo items because of the variable name scopes, so we need another PR for #285, probably updating the model zoo items as well.
To my understanding, Instance Normalization is a special case of Group Normalization when group_size is equal to 1. So one can already use Instance Normalization in the current setting.
Along the same lines, maybe the class InstanceNormLayer in niftynet.layer.bn could be removed, or at least marked as deprecated.
I agree the flags could be made clearer, though.
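For what it's worth, the group_size=1 equivalence can be checked numerically outside of TensorFlow. The helpers below are illustrative NumPy re-implementations (no learned scale/offset), not NiftyNet code:

```python
import numpy as np

def group_norm(x, group_size, eps=1e-5):
    # x has shape (N, H, W, C); normalize per sample, per group of
    # `group_size` channels, over the spatial axes and the group.
    n, h, w, c = x.shape
    g = c // group_size
    x = x.reshape(n, h, w, g, group_size)
    mean = x.mean(axis=(1, 2, 4), keepdims=True)
    var = x.var(axis=(1, 2, 4), keepdims=True)
    x = (x - mean) / np.sqrt(var + eps)
    return x.reshape(n, h, w, c)

def instance_norm(x, eps=1e-5):
    # Normalize per sample, per channel, over the spatial axes only.
    mean = x.mean(axis=(1, 2), keepdims=True)
    var = x.var(axis=(1, 2), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

x = np.random.randn(2, 4, 4, 8)
# group_size=1 makes each channel its own group: instance norm.
print(np.allclose(group_norm(x, group_size=1), instance_norm(x)))  # True
```

With group_size equal to the full channel count the same function instead reduces to layer-style normalization over all channels, which is why a single string flag plus a group_size parameter covers several cases at once.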