pytorch-vgg-cifar10
What model is in the model_best.pth file?
I need the VGG16 pretrained model, but I get an error like this:
RuntimeError: Error(s) in loading state_dict for DataParallel: Missing key(s) in state_dict: "module.features.0.bias", "module.features.0.weight", "module.features.2.bias", "module.features.2.weight", "module.features.5.bias", "module.features.5.weight", "module.features.7.bias", "module.features.7.weight", "module.features.10.bias", "module.features.10.weight", "module.features.12.bias", "module.features.12.weight", "module.features.14.bias", "module.features.14.weight", "module.features.17.bias", "module.features.17.weight", "module.features.19.bias", "module.features.19.weight", "module.features.21.bias", "module.features.21.weight", "module.features.24.bias", "module.features.24.weight", "module.features.26.bias", "module.features.26.weight", "module.features.28.bias", "module.features.28.weight", "module.classifier.1.bias", "module.classifier.1.weight", "module.classifier.4.bias", "module.classifier.4.weight", "module.classifier.6.bias", "module.classifier.6.weight". Unexpected key(s) in state_dict: "state_dict", "best_prec1", "epoch".
What is the exact model?
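For context, the unexpected keys ("state_dict", "best_prec1", "epoch") suggest the file is a full training checkpoint dictionary rather than a bare state_dict, and (judging by the fix below) the weights inside it carry a "module." prefix from DataParallel. A quick way to confirm this is to inspect the checkpoint before loading it (a minimal sketch; the file path is just the name from the issue title):

import torch

# Inspect the checkpoint structure before trying to load it into a model.
ckpt = torch.load('model_best.pth', map_location='cpu')
print(ckpt.keys())                          # expected: 'epoch', 'state_dict', 'best_prec1'
print(list(ckpt['state_dict'].keys())[:5])  # parameter names should start with 'module.'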
I am running into the same problem.
The model is vgg19! I just got it working on my end (though I did not use DataParallel). I had to tweak the checkpoint so its keys match the VGG model definition in vgg.py. This is what I did:
import torch
from vgg import vgg19  # assuming vgg.py in this repo provides a vgg19() constructor

model = vgg19()
checkpoint_orig = torch.load('<checkpoint_file_pth.tar>')  # path to the downloaded checkpoint
checkpoint = {'state_dict': {}}
for k in checkpoint_orig['state_dict'].keys():
    # strip the 'module.' prefix that DataParallel adds to every parameter name
    checkpoint['state_dict'][k.replace('module.', '')] = checkpoint_orig['state_dict'][k]
checkpoint['epoch'] = checkpoint_orig['epoch']  # optional
checkpoint['best_prec1'] = checkpoint_orig['best_prec1']  # optional
del checkpoint_orig  # free memory
model.load_state_dict(checkpoint['state_dict'])
I was able to replicate 92.43% accuracy on the CIFAR-10 test set.
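For reference, an evaluation loop along these lines should give a comparable number (the normalization constants, batch size, and CPU-only loop here are illustrative assumptions, not necessarily the exact settings in the repo's main.py; model is the network with the converted state_dict loaded as above):

import torch
import torchvision
import torchvision.transforms as transforms

# Illustrative preprocessing; check main.py for the normalization actually used during training.
transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
testset = torchvision.datasets.CIFAR10(root='./data', train=False, download=True, transform=transform)
testloader = torch.utils.data.DataLoader(testset, batch_size=128, shuffle=False)

model.eval()
correct, total = 0, 0
with torch.no_grad():
    for images, labels in testloader:
        outputs = model(images)
        _, predicted = outputs.max(1)   # index of the highest logit per sample
        total += labels.size(0)
        correct += (predicted == labels).sum().item()
print('Test accuracy: %.2f%%' % (100.0 * correct / total))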
Could you please share your test.py? Thanks a lot.