pytorch-cifar100
Densenet: wrong structure of transition layer
According to the original DenseNet implementation, the transition layer should be BN-ReLU-Conv-Pool, but the code in this repository implements BN-Conv-Pool. The ReLU is missing, which may hurt the accuracy of the model.
the original DenseNet (from the paper authors): https://github.com/liuzhuang13/DenseNet/blob/cf511e4add35a7d7a921901101ce7fa8f704aee2/models/densenet.lua#L37-L52
this repo: https://github.com/weiaicunzai/pytorch-cifar100/blob/2149cb57f517c6e5fa7262f958652227225d125b/models/densenet.py#L47-L58
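For reference, a minimal sketch of a transition layer in the BN-ReLU-Conv-Pool order used by the original implementation (class and attribute names here are illustrative, not the exact ones from this repo):

```python
import torch
import torch.nn as nn

class Transition(nn.Module):
    """Transition layer in BN-ReLU-Conv-Pool order, as in the
    authors' original Torch code. This is a sketch, not the
    repo's actual module."""
    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.down_sample = nn.Sequential(
            nn.BatchNorm2d(in_channels),
            nn.ReLU(inplace=True),  # the activation missing from this repo's version
            nn.Conv2d(in_channels, out_channels, kernel_size=1, bias=False),
            nn.AvgPool2d(kernel_size=2, stride=2),
        )

    def forward(self, x):
        return self.down_sample(x)
```

A quick shape check: `Transition(128, 64)(torch.randn(1, 128, 16, 16))` halves the spatial size and reduces the channels, giving a `(1, 64, 8, 8)` tensor.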
by the way, maybe the description in the paper is misleading:
> The transition layers used in our experiments consist of a batch normalization layer and an 1×1 convolutional layer followed by a 2×2 average pooling layer.