FastFlow

Unmentioned but critical LayerNorm

Open gathierry opened this issue 3 years ago • 5 comments

To achieve results comparable to the original paper, LayerNorm is applied to the features before the NF (normalizing flow). This is never mentioned in the paper and the usage is quite tricky (but it is the only way that works for me); see the sketch after the list:

  • resnet18 and wide-resnet-50: use trainable LayerNorm
  • CaiT and DeiT: use the final norm from the pre-trained model and fix its affine parameters
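
A minimal sketch of what the two cases could look like in PyTorch (the shapes, module names, and the frozen-norm handling are my assumptions for illustration, not the repo's actual code):

```python
import torch
import torch.nn as nn

# Case 1: resnet18 / wide-resnet-50 -- a trainable LayerNorm on the feature map.
# Shapes are illustrative; this LayerNorm normalizes over (C, H, W).
feat = torch.randn(8, 256, 32, 32)                             # B, C, H, W from the CNN backbone
norm = nn.LayerNorm(feat.shape[1:], elementwise_affine=True)   # learnable gamma/beta
nf_input = norm(feat)                                           # fed to the normalizing flow

# Case 2: CaiT / DeiT -- reuse the pre-trained model's final norm and freeze it.
# `vit.norm` stands in for the final LayerNorm of a timm CaiT/DeiT model (assumption):
# vit = timm.create_model("deit_base_distilled_patch16_384", pretrained=True)
# norm = vit.norm
# for p in norm.parameters():
#     p.requires_grad = False             # fix its affine parameters
# tokens = vit.forward_features(images)   # B, N, C token features
# nf_input = norm(tokens)
```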

gathierry avatar Mar 18 '22 12:03 gathierry

I measured the performance of the models without the LayerNorm parts. For both resnet18 and wide-resnet50, AUROC was quite similar, sometimes even better than with the original setup. DeiT also showed comparable performance (lower by about 0.03~0.05). However, with CaiT the loss was extremely high and AUROC was 0.5! I can't understand why these models show such different results depending on Layer Normalization.

cytotoxicity8 avatar Apr 16 '22 07:04 cytotoxicity8

[attached image: training curves] The red curve is the run w/o elementwise affine. I am experimenting to improve FastFlow; discussion is always open.

cytotoxicity8 avatar May 28 '22 08:05 cytotoxicity8

Use `x = x.flatten(2).transpose(1, 2)` to reshape the feature map from B,C,H,W to B,N,C; that way the LayerNorm does not depend on the input size.
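
A small sketch of that reshape (shapes assumed for illustration); LayerNorm then normalizes only over the channel dimension, so the same module works for any H and W:

```python
import torch
import torch.nn as nn

x = torch.randn(8, 256, 32, 32)          # B, C, H, W feature map
b, c, h, w = x.shape

tokens = x.flatten(2).transpose(1, 2)    # B, C, H*W -> B, N, C  (N = H*W)
norm = nn.LayerNorm(c)                   # normalized_shape is C only, independent of H, W
tokens = norm(tokens)

x = tokens.transpose(1, 2).reshape(b, c, h, w)  # back to B, C, H, W for the flow
```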

AncientRemember avatar Sep 23 '22 11:09 AncientRemember

Maybe using BN after the conv2d would also work.
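
A purely illustrative sketch of that idea (the 1x1 conv and channel count are assumptions, not anything from the repo); BatchNorm normalizes per channel over the batch, so it is also independent of the spatial size:

```python
import torch
import torch.nn as nn

# Hypothetical projection before the flow: conv2d followed by BatchNorm2d
# instead of LayerNorm.
proj = nn.Sequential(
    nn.Conv2d(256, 256, kernel_size=1),  # channel count is an assumption
    nn.BatchNorm2d(256),
)

feat = torch.randn(8, 256, 32, 32)       # B, C, H, W
nf_input = proj(feat)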

AncientRemember avatar Sep 23 '22 11:09 AncientRemember

Well, after learning more about transformers, I realize that adding LayerNorm to intermediate output feature maps is very common, for example when using transformers as the backbone for semantic segmentation (https://github.com/SwinTransformer/Swin-Transformer-Semantic-Segmentation/blob/87e6f90577435c94f3e92c7db1d36edc234d91f6/mmseg/models/backbones/swin_transformer.py#L620). So I guess that's why the paper never mentioned it.
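
For reference, the pattern in that kind of backbone is roughly the following (a simplified paraphrase with assumed shapes, not the linked code): each stage's token output gets its own LayerNorm before being reshaped back to a B, C, H, W feature map.

```python
import torch
import torch.nn as nn

tokens = torch.randn(8, 56 * 56, 96)     # B, N, C tokens from one stage (shapes assumed)
stage_norm = nn.LayerNorm(96)            # one norm layer per output stage

out = stage_norm(tokens)
out = out.view(8, 56, 56, 96).permute(0, 3, 1, 2).contiguous()  # B, C, H, W feature map
```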

And for resnet, maybe LayerNorm is not necessary, as pointed out by @cytotoxicity8.

gathierry avatar Sep 23 '22 12:09 gathierry