Hi @bobo0810 , (1) yes, we simply replace ReLU with FReLU; (2) you do not have to replace every ReLU; you can flexibly modify the network and convert only some of them. Specifically, on...
> > > Hi @qbTrible, applying FReLU in the backbone is enough to obtain improvements. For the dataset question, besides the **open datasets** in our experiments, I have assisted others...
@qbTrible , moreover, ResNet50-FReLU even outperforms ResNet-101-ReLU in our experiments; if you run into overfitting on your task and dataset, you could also try a shallower ResNet with FReLU.
We use Gaussian initialization with std=0.01. Simply replacing ReLU with FReLU already shows a slight improvement (0.1~0.3). Note that MobileNetV3 is a NAS-searched optimal CNN architecture, once...
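The drop-in replacement discussed above can be sketched in NumPy. FReLU computes max(x, T(x)), where T is a per-channel (depthwise) 3×3 convolution over the spatial window; in the paper T also includes a BatchNorm, which this hypothetical minimal sketch omits for brevity:

```python
import numpy as np

def depthwise_conv3x3(x, w):
    """Depthwise 3x3 convolution with zero 'same' padding.
    x: (C, H, W) feature map, w: (C, 3, 3) per-channel kernels."""
    C, H, W = x.shape
    xp = np.pad(x, ((0, 0), (1, 1), (1, 1)))  # zero-pad H and W by 1
    out = np.zeros_like(x)
    for c in range(C):
        for i in range(H):
            for j in range(W):
                out[c, i, j] = np.sum(xp[c, i:i + 3, j:j + 3] * w[c])
    return out

def frelu(x, w):
    """Funnel activation: max(x, T(x)), T = depthwise 3x3 conv."""
    return np.maximum(x, depthwise_conv3x3(x, w))
```

With the funnel kernel initialized to zero, T(x) = 0 and FReLU reduces exactly to ReLU, which is why it can replace ReLU without disturbing a pretrained network at initialization.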
@IzouGend It is ok to use the ImageNet pretrained model because ImageNet is a multi-class classification task.
@zimenglan-sysu-512 yes it does
@fanchunpeng Its purpose is to align with our internal closed-source training framework at the time; it is not a trick for boosting accuracy.
> After enabling adjust_bn_momentum here, the inference results differ from the run without it, and after converting to ONNX the difference grows. What is the best way to handle this? > Thanks in advance.

In that case, just remove adjust_bn_momentum.
The code is exactly the same as in the paper (channel shuffle and ReLU). Please note that relu+avgpool+relu = relu+avgpool.
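The identity relu+avgpool+relu = relu+avgpool holds because average pooling of non-negative values stays non-negative, so the second ReLU is a no-op. A quick NumPy check (helper names are mine, not from the repo):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0)

def avgpool2x2(x):
    """Non-overlapping 2x2 average pooling; x: (H, W) with even dims."""
    H, W = x.shape
    return x.reshape(H // 2, 2, W // 2, 2).mean(axis=(1, 3))

x = np.random.randn(4, 4)
# ReLU output is >= 0, averaging keeps it >= 0, so the outer ReLU changes nothing.
assert np.allclose(relu(avgpool2x2(relu(x))), avgpool2x2(relu(x)))
```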
The last block is followed by a fully connected layer, so the additional channel shuffle has no effect and can be omitted.
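One way to see why a channel shuffle right before a fully connected layer is redundant: the shuffle is a fixed permutation of channels, and the FC layer can absorb any fixed permutation into its weight columns, so training reaches the same functions either way. A small NumPy illustration (shapes chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(8)        # pooled features (8 channels)
W = rng.standard_normal((4, 8))   # fully connected weights

perm = rng.permutation(8)         # channel shuffle = fixed permutation
inv = np.argsort(perm)            # inverse permutation

# FC applied after the shuffle equals a column-reindexed FC applied
# without it, so the shuffle adds no representational power.
assert np.allclose(W @ x[perm], W[:, inv] @ x)
```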