PT0X0E
This could be a wrong normalization layer. Transformer backbones use LayerNorm, so don't set `cfg.model.backbone.norm_cfg` to BatchNorm.
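A minimal sketch of what this looks like in an mmdetection-style config, assuming the common mmcv `norm_cfg=dict(type=...)` convention (the backbone type and keys here are illustrative, not taken from the original thread):

```python
# Hypothetical mmdetection-style config fragment.
# For a transformer backbone, keep LayerNorm ('LN'); do not swap in
# BatchNorm ('BN'), which mismatches the pretrained weights.
model = dict(
    backbone=dict(
        type='SwinTransformer',        # illustrative transformer backbone
        norm_cfg=dict(type='LN'),      # correct: LayerNorm
        # norm_cfg=dict(type='BN'),    # wrong for transformer backbones
    ),
)
```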