
3.8% and 18.3% error on CIFAR-10 and CIFAR-100

24 wide-residual-networks issues

I ran with the default parameters and I'm getting a `Can't pickle` error. Could this be a version problem? Number of model parameters: 36479194 Traceback (most recent call last): File "train.py",...
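A common source of `Can't pickle` errors in Python training scripts is an object (often a model or data loader) that closes over a lambda or locally defined function, which the `pickle` module cannot serialize. A minimal demonstration of the difference (illustrative only; the actual cause in `train.py` may differ):

```python
import pickle

# Lambdas cannot be pickled by reference, so this fails with a
# "Can't pickle <function <lambda> ...>" style error.
try:
    pickle.dumps(lambda x: x)
    lambda_ok = True
except Exception as e:
    lambda_ok = False
    print("pickling lambda failed:", e)

# A module-level function pickles fine, because it can be
# looked up by its qualified name on unpickling.
def identity(x):
    return x

restored = pickle.loads(pickle.dumps(identity))
print("module-level function round-trips:", restored is identity)
```

Replacing any lambdas (e.g. in data-augmentation transforms) with module-level functions is the usual fix.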

Hi, I recently read the articles about wide residual networks and found that the latest version's top-1 error is slightly better than the first version's: 4.00% vs 4.17%, 4.27% vs 4.81%, 4.53%...

I think this result is overstated. I have tried WRN-16-8 with dropout on SVHN (without using extra.mat) and only get accuracy of around 96.2%.

In contrast to the CIFAR code, for ImageNet the number of blocks per group is not calculated automatically but provided manually, with the third group getting the most residual blocks, as...
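The two conventions can be sketched side by side: the CIFAR models derive the per-group block count n from the depth via depth = 6n + 4, while the ImageNet models take an explicit list (the [3, 4, 6, 3] split below is the standard ResNet-50 layout, used here as an assumed example):

```python
def cifar_blocks(depth):
    # CIFAR WRN: depth = 6*n + 4, so each of the 3 groups gets n blocks.
    assert (depth - 4) % 6 == 0, "depth should be of the form 6n + 4"
    n = (depth - 4) // 6
    return [n, n, n]

# ImageNet models instead specify the split by hand, e.g. the
# ResNet-50-style layout where the third group is the largest:
imagenet_blocks = [3, 4, 6, 3]

print(cifar_blocks(28))   # [4, 4, 4] for WRN-28-k
print(imagenet_blocks)
```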

How come there are 53 convolution layers in the 50-2 network? I expected to see 50 layers. What am I missing? Thanks.
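If WRN-50-2 follows the standard ResNet-50 bottleneck layout, the extra layers are likely the 1x1 projection-shortcut convolutions, which count as convolutions but are not included in the nominal depth of 50. A rough tally under that assumption:

```python
# Assuming the ResNet-50 bottleneck layout: one stem conv, then groups of
# [3, 4, 6, 3] bottleneck blocks with 3 convs each, plus one 1x1
# projection-shortcut conv at the start of each of the 4 groups.
blocks = [3, 4, 6, 3]
stem = 1
body = 3 * sum(blocks)       # 3 convs per bottleneck block -> 48
projections = len(blocks)    # 4 projection shortcuts, one per group
total = stem + body + projections
print(total)  # 53
```

The nominal "50" counts the stem, the 48 body convolutions, and the final fully connected layer, while omitting the shortcuts.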

Hi, szagoruyko. In your paper I see that dropout is useful for ResNet, but you don't mention how to set the dropout rate. Can you tell me the principles of...

I'm running this command: `model=wide-resnet widen_factor=4 depth=40 dropout=0.3 ./scripts/debug_cifar.sh`. Most of the time (80%+), the program reaches the point where it prints: `Network has 40 convolutions` `Will save...

Could you offer the cifar100_wrn weights, please?

It looks like WRN needs too much GPU memory. I tried an image size of 224x224, and even with depth 10 (1x6 + 4) it ran out of GPU memory. I wonder...
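Activation memory grows with spatial resolution, so 224x224 inputs are far heavier than CIFAR's 32x32, independent of depth. A back-of-the-envelope estimate of a single feature map's size (float32 activations, illustrative numbers only):

```python
def activation_mib(batch, channels, height, width, bytes_per=4):
    # float32 activations take N * C * H * W * 4 bytes; convert to MiB.
    return batch * channels * height * width * bytes_per / 2**20

# One 64-channel feature map, batch of 32, at 224x224 vs CIFAR's 32x32:
print(activation_mib(32, 64, 224, 224))  # 392.0 MiB
print(activation_mib(32, 64, 32, 32))    # 8.0 MiB
```

Since a wide network stores many such maps (for the backward pass) per layer, the 49x blow-up from 32x32 to 224x224 quickly exhausts GPU memory even at modest depth.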

Hi. Thank you for sharing your work. I was trying to run your training code (`python main.py --save ./logs/resnet_$RANDOM$RANDOM --depth 28 --width 10`) but I keep getting the following error:...