mobilenets-ssd-pytorch

Test code only works for the original label map of 21 elements

Open UcefMountacer opened this issue 3 years ago • 1 comment

Hi,

create_mobilenetv2_ssd_lite has a problem when using a label map of 11 elements.

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
[<ipython-input-18-eff74db7755d>](https://localhost:8080/#) in <module>()
     14 net = create_mobilenetv2_ssd_lite(11, is_test=1)
     15 
---> 16 net.load(model_path)
     17 
     18 predictor = create_mobilenetv2_ssd_lite_predictor(net, candidate_size=200, nms_method="soft")

1 frames
[/usr/local/lib/python3.7/dist-packages/torch/nn/modules/module.py](https://localhost:8080/#) in load_state_dict(self, state_dict, strict)
   1481         if len(error_msgs) > 0:
   1482             raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
-> 1483                                self.__class__.__name__, "\n\t".join(error_msgs)))
   1484         return _IncompatibleKeys(missing_keys, unexpected_keys)
   1485 

RuntimeError: Error(s) in loading state_dict for SSD:
	size mismatch for classification_headers.0.3.weight: copying a param with shape torch.Size([126, 576, 1, 1]) from checkpoint, the shape in current model is torch.Size([66, 576, 1, 1]).
	size mismatch for classification_headers.0.3.bias: copying a param with shape torch.Size([126]) from checkpoint, the shape in current model is torch.Size([66]).
	size mismatch for classification_headers.1.3.weight: copying a param with shape torch.Size([126, 1280, 1, 1]) from checkpoint, the shape in current model is torch.Size([66, 1280, 1, 1]).
	size mismatch for classification_headers.1.3.bias: copying a param with shape torch.Size([126]) from checkpoint, the shape in current model is torch.Size([66]).
	size mismatch for classification_headers.2.3.weight: copying a param with shape torch.Size([126, 512, 1, 1]) from checkpoint, the shape in current model is torch.Size([66, 512, 1, 1]).
	size mismatch for classification_headers.2.3.bias: copying a param with shape torch.Size([126]) from checkpoint, the shape in current model is torch.Size([66]).
	size mismatch for classification_headers.3.3.weight: copying a param with shape torch.Size([126, 256, 1, 1]) from checkpoint, the shape in current model is torch.Size([66, 256, 1, 1]).
	size mismatch for classification_headers.3.3.bias: copying a param with shape torch.Size([126]) from checkpoint, the shape in current model is torch.Size([66]).
	size mismatch for classification_headers.4.3.weight: copying a param with shape torch.Size([126, 256, 1, 1]) from checkpoint, the shape in current model is torch.Size([66, 256, 1, 1]).
	size mismatch for classification_headers.4.3.bias: copying a param with shape torch.Size([126]) from checkpoint, the shape in current model is torch.Size([66]).
	size mismatch for classification_headers.5.weight: copying a param with shape torch.Size([126, 64, 1, 1]) from checkpoint, the shape in current model is torch.Size([66, 64, 1, 1]).
	size mismatch for classification_headers.5.bias: copying a param with shape torch.Size([126]) from checkpoint, the shape in current model is torch.Size([66]).

Do you have an idea how to correct this? Thanks.

UcefMountacer avatar Mar 26 '22 23:03 UcefMountacer
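
For reference, the mismatch is only in the classification heads: the checkpoint was trained for 21 classes (126 = 21 × 6 prior boxes per location), while the model here is built for 11 (66 = 11 × 6). A minimal workaround, sketched with core PyTorch only and not necessarily the repo's intended workflow, is to drop the mismatched head weights, load everything else, and then fine-tune the heads on the 11-class data:

```python
import torch

# Assumes create_mobilenetv2_ssd_lite and model_path are already defined as in
# the snippet above. The 21-class checkpoint cannot be loaded directly into an
# 11-class model, so keep only the tensors whose shapes still match (everything
# except the classification heads) and load those.
net = create_mobilenetv2_ssd_lite(11, is_test=True)

checkpoint = torch.load(model_path, map_location="cpu")
model_state = net.state_dict()

# Keep parameters whose name and shape match the 11-class model.
filtered = {
    k: v for k, v in checkpoint.items()
    if k in model_state and v.shape == model_state[k].shape
}
skipped = sorted(set(model_state) - set(filtered))
print(f"Skipping {len(skipped)} mismatched/missing tensors, e.g. {skipped[:2]}")

net.load_state_dict(filtered, strict=False)
```

Since the 11-class classification heads then start from random weights, the network has to be fine-tuned before inference gives meaningful results.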

[image] If you want to show only two classes, you can try something like this:
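
The attached image did not come through; below is a minimal sketch of what such a filter could look like, assuming the intent is to keep the original 21-class model (so the checkpoint loads as-is) and drop every detection outside the classes of interest, and assuming the predictor.predict call from the upstream pytorch-ssd demo. WANTED_CLASS_IDS and image are placeholders; check the repo's VOC label file for the actual indices.

```python
# Keep the original 21-class model so the checkpoint loads cleanly, then
# filter the predictor's output down to the classes of interest.
WANTED_CLASS_IDS = {7, 15}  # placeholder indices, e.g. "car" and "person" in the VOC label map

net = create_mobilenetv2_ssd_lite(21, is_test=True)
net.load(model_path)
predictor = create_mobilenetv2_ssd_lite_predictor(net, candidate_size=200, nms_method="soft")

boxes, labels, probs = predictor.predict(image, 10, 0.4)  # image: RGB array, as in the repo's demo
keep = [i for i in range(labels.size(0)) if labels[i].item() in WANTED_CLASS_IDS]
boxes, labels, probs = boxes[keep], labels[keep], probs[keep]
```

This avoids retraining entirely, at the cost of keeping the full 21-class heads in the model.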

malatang20001210 avatar Nov 01 '23 03:11 malatang20001210