get_norm_layer default to affine=True
https://github.com/Project-MONAI/MONAI/issues/5222
Sets the default normalization parameter `affine` to True.
Most normalization layers in PyTorch estimate affine parameters by default (e.g. BatchNorm, GroupNorm).
One notable exception is InstanceNorm, whose default is affine=False; if we don't specify it manually, the layer won't estimate these parameters. For example, this has no learnable parameters:
`get_norm_layer(name="instance", spatial_dims=2)`
and one needs to use the verbose form:
`get_norm_layer(name=("instance", {"affine": True}), spatial_dims=2)`
This change makes the affine parameters default to True for all normalizations, unless affine is explicitly set to False. This is the preferred behaviour and should be the default for all segmentation/classification tasks.
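For illustration, a minimal sketch of how the two forms differ in learnable parameters today (assuming `get_norm_layer` from `monai.networks.layers.utils` and its `channels` argument):

```python
# Sketch: compare learnable parameters of the default vs. verbose form.
# Assumes monai.networks.layers.utils.get_norm_layer and its `channels` argument.
from monai.networks.layers.utils import get_norm_layer

# Current default: InstanceNorm2d with affine=False -> no learnable parameters.
norm_default = get_norm_layer(name="instance", spatial_dims=2, channels=8)
print(sum(p.numel() for p in norm_default.parameters()))  # 0

# Verbose form: affine=True adds a per-channel weight and bias.
norm_affine = get_norm_layer(name=("instance", {"affine": True}), spatial_dims=2, channels=8)
print(sum(p.numel() for p in norm_affine.parameters()))  # 2 * 8 = 16
```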
Types of changes
- [x] Non-breaking change (fix or new feature that would not break existing functionality).
- [ ] Breaking change (fix or new feature that would cause existing functionality to change).
- [ ] New tests added to cover the changes.
- [ ] Integration tests passed locally by running `./runtests.sh -f -u --net --coverage`.
- [ ] Quick tests passed locally by running `./runtests.sh --quick --unittests --disttests`.
- [ ] In-line docstrings updated.
- [ ] Documentation updated, tested `make html` command in the `docs/` folder.
Looks like it failed a few tests. One of them checks the number of parameters for UNet.
Looks like our UNet implementation has InstanceNorm as the default normalization https://github.com/Project-MONAI/MONAI/blob/dev/monai/networks/nets/unet.py#L124
as do many other places (e.g. the Convolution helper) https://github.com/Project-MONAI/MONAI/blob/ba670c9ba78f9f4d76eb12b4d33f245b5774d462/monai/networks/blocks/acti_norm.py#L105
but this seems wrong: InstanceNorm in UNet should have learnable (affine) parameters, and the current implementation does not.
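A possible workaround under the current defaults, as a sketch (assuming the `spatial_dims`/`channels`/`strides` signature of `monai.networks.nets.UNet` and that its `norm` argument accepts the `(name, kwargs)` form):

```python
# Sketch of a workaround: explicitly request affine InstanceNorm when building a UNet,
# rather than relying on the default norm.
from monai.networks.nets import UNet

net = UNet(
    spatial_dims=2,
    in_channels=1,
    out_channels=2,
    channels=(16, 32, 64),
    strides=(2, 2),
    norm=("instance", {"affine": True}),  # override the affine=False default
)
# The parameter count now includes the InstanceNorm weights and biases.
print(sum(p.numel() for p in net.parameters()))
```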
Alright, it looks like this isn't supported for legacy reasons, so I'm going to close this.