densenet.pytorch
Help needed reproducing the reported performance on CIFAR-100
I used the default settings (which I think correspond to DenseNet-BC-12 with data augmentation) on CIFAR-100 (by just changing the name of the dataset class and the nClasses variable). The training curve looks like this:
Though training has not finished yet, from training curves of other networks on CIFAR-100 I can tell there will be no more major changes in accuracy. The highest accuracy so far is 75.59%, which only matches the reported performance of DenseNet-12 (depth 40) with data augmentation.
Has anyone tested this repo on CIFAR-100 yet?
No changes in the end.
Hi @wishforgood, I have tried another reimplementation and met the same problem. The error rate on CIFAR-10 with DenseNet-40 (non-BC) is only 6.0% (vs. 5.24% reported in the repo), but when I test it in TensorFlow it is about 5.4%. I think it is caused by PyTorch rather than the model. Have you solved it yet?
Not yet; in the end I decided to try other models like Wide-ResNet.