pytorch-segmentation-detection
Error(s) in loading state_dict for Resnet18_8s
@warmspringwinds I am getting the following error with resnet_18_8s_59.pth:
RuntimeError: Error(s) in loading state_dict for Resnet18_8s:
size mismatch for resnet18_8s.fc.bias: copying a param with shape torch.Size([2]) from checkpoint, the shape in current model is torch.Size([21]).
size mismatch for resnet18_8s.fc.weight: copying a param with shape torch.Size([2, 512, 1, 1]) from checkpoint, the shape in current model is torch.Size([21, 512, 1, 1]).
If I change num_classes=21 to num_classes=2, the weights load, but the output has no segmentation at all (just a purple screen).
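(For context, the size mismatch means the fc head in resnet_18_8s_59.pth was saved with 2 output channels, so the model has to be constructed with a matching num_classes before load_state_dict. A minimal sketch is below; the import path follows the repo's demo notebooks and is an assumption here, not something confirmed in this thread.)

```python
import torch
from pytorch_segmentation_detection.models.resnet_dilated import Resnet18_8s  # assumed import path

# The checkpoint's fc layer has shape [2, 512, 1, 1], so num_classes must be 2
# for load_state_dict to succeed without a size mismatch.
model = Resnet18_8s(num_classes=2)
model.load_state_dict(torch.load('resnet_18_8s_59.pth', map_location='cpu'))
```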
Hi,
I might have messed up that model -- I will have a look at it. Could you try the resnet 34 8s instead? That should work.
Thank you.
@warmspringwinds Tried that as well; I am getting the error below.
RuntimeError: Error(s) in loading state_dict for Resnet34_8s: Missing key(s) in state_dict: "resnet34_8s.conv1.weight", "resnet34_8s.bn1.running_var", "resnet34_8s.bn1.bias", "resnet34_8s.bn1.weight", "resnet34_8s.bn1.running_mean", "resnet34_8s.layer1.0.conv1.weight", "resnet34_8s.layer1.0.bn1.running_var", "resnet34_8s.layer1.0.bn1.bias", "resnet34_8s.layer1.0.bn1.weight", "resnet34_8s.layer1.0.bn1.running_mean", "resnet34_8s.layer1.0.conv2.weight", "resnet34_8s.layer1.0.bn2.running_var", "resnet34_8s.layer1.0.bn2.bias", "resnet34_8s.layer1.0.bn2.weight", "resnet34_8s.layer1.0.bn2.running_mean", "resnet34_8s.layer1.1.conv1.weight", "resnet34_8s.layer1.1.bn1.running_var", "resnet34_8s.layer1.1.bn1.bias", "resnet34_8s.layer1.1.bn1.weight", "resnet34_8s.layer1.1.bn1.running_mean", "resnet34_8s.layer1.1.conv2.weight", "resnet34_8s.layer1.1.bn2.running_var", "resnet34_8s.layer1.1.bn2.bias", "resnet34_8s.layer1.1.bn2.weight", "resnet34_8s.layer1.1.bn2.running_mean", "resnet34_8s.layer1.2.conv1.weight", "resnet34_8s.layer1.2.bn1.running_var", "resnet34_8s.layer1.2.bn1.bias", "resnet34_8s.layer1.2.bn1.weight", "resnet34_8s.layer1.2.bn1.running_mean", "resnet34_8s.layer1.2.conv2.weight", "resnet34_8s.layer1.2.bn2.running_var", "resnet34_8s.layer1.2.bn2.bias", "resnet34_8s.layer1.2.bn2.weight", "resnet34_8s.layer1.2.bn2.running_mean", "resnet34_8s.layer2.0.conv1.weight", "resnet34_8s.layer2.0.bn1.running_var", "resnet34_8s.layer2.0.bn1.bias", "resnet34_8s.layer2.0.bn1.weight", "resnet34_8s.layer2.0.bn1.running_mean", "resnet34_8s.layer2.0.c... Unexpected key(s) in state_dict: "resnet18_8s.conv1.weight", "resnet18_8s.bn1.weight", "resnet18_8s.bn1.bias", "resnet18_8s.bn1.running_mean", "resnet18_8s.bn1.running_var", "resnet18_8s.bn1.num_batches_tracked", "resnet18_8s.layer1.0.conv1.weight", "resnet18_8s.layer1.0.bn1.weight", "resnet18_8s.layer1.0.bn1.bias", "resnet18_8s.layer1.0.bn1.running_mean", "resnet18_8s.layer1.0.bn1.running_var", "resnet18_8s.layer1.0.bn1.num_batches_tracked", "resnet18_8s.layer1.0.conv2.weight", "resnet18_8s.layer1.0.bn2.weight", "resnet18_8s.layer1.0.bn2.bias", "resnet18_8s.layer1.0.bn2.running_mean", "resnet18_8s.layer1.0.bn2.running_var", "resnet18_8s.layer1.0.bn2.num_batches_tracked", "resnet18_8s.layer1.1.conv1.weight", "resnet18_8s.layer1.1.bn1.weight", "resnet18_8s.layer1.1.bn1.bias", "resnet18_8s.layer1.1.bn1.running_mean", "resnet18_8s.layer1.1.bn1.running_var", "resnet18_8s.layer1.1.bn1.num_batches_tracked", "resnet18_8s.layer1.1.conv2.weight", "resnet18_8s.layer1.1.bn2.weight", "resnet18_8s.layer1.1.bn2.bias", "resnet18_8s.layer1.1.bn2.running_mean", "resnet18_8s.layer1.1.bn2.running_var", "resnet18_8s.layer1.1.bn2.num_batches_tracked", "resnet18_8s.layer2.0.conv1.weight", "resnet18_8s.layer2.0.bn1.weight", "resnet18_8s.layer2.0.bn1.bias", "resnet18_8s.layer2.0.bn1.running_mean", "resnet18_8s.layer2.0.bn1.running_var", "resnet18_8s.layer2.0.bn1.num_batches_tracked", "resnet18_8s.layer2.0.conv2.weight", "resnet18_8s.layer2.0.bn2.weight", "resnet18_8s.layer2.0.bn2.bias", "resnet1...
Seems like you are trying to initialize the 34 model and load the weights of the 18 model.
Try loading the weights of the 34 model instead -- I took this link from the table: https://www.dropbox.com/s/91wcu6bpqezu4br/resnet_34_8s_68.pth?dl=0
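(A minimal loading sketch for that checkpoint, assuming it is the 21-class PASCAL VOC model and that the import path matches the repo's demo notebooks; both are assumptions, not confirmed in this thread.)

```python
import torch
from pytorch_segmentation_detection.models.resnet_dilated import Resnet34_8s  # assumed import path

# Architecture and checkpoint have to match: Resnet34_8s with resnet_34_8s_68.pth.
model = Resnet34_8s(num_classes=21)  # assumed 21 PASCAL VOC classes
model.load_state_dict(torch.load('resnet_34_8s_68.pth', map_location='cpu'))
```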
@warmspringwinds Using the 34 model worked, but the results are very poor. Any specific reason?

Make sure you switched your model to eval mode.
Also, this is not the best model -- try the PSP one.
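(A minimal sketch of the eval-mode point: without model.eval(), batch norm keeps using per-batch statistics at inference time, which can make single-image predictions look very poor. The input tensor below is just a placeholder, and `model` continues the loading sketch above.)

```python
import torch

model.eval()  # use batch-norm running statistics and disable dropout at inference

with torch.no_grad():
    dummy_input = torch.randn(1, 3, 380, 380)  # placeholder for a preprocessed image tensor
    logits = model(dummy_input)                # [1, num_classes, H, W]
    prediction = logits.argmax(dim=1)          # per-pixel class indices
```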
I can't find a demo file or model link for the PSP model.
I will upload them a bit later.
For now, you can try reducing the size of the input image. This will help, since the network was trained on images with an average size of about 380 px.
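(A preprocessing sketch along those lines: resize the shorter side toward ~380 px and apply the standard ImageNet normalization. The exact transform used for training is not given in this thread, so treat the values as an assumption; the input filename is hypothetical.)

```python
from PIL import Image
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize(380),  # shorter side ~380 px, close to the reported training scale
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],   # standard ImageNet statistics
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open('example.jpg').convert('RGB')  # hypothetical input file
image_tensor = preprocess(image).unsqueeze(0)     # [1, 3, H, W] batch for the network
```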