scnn_pytorch
Could you provide your pretrained model?
Hello, I recently wanted to run some experiments on this repo. Could you provide your pretrained model VGG16.pth? Thanks!
I downloaded the pretrained model from https://s3-us-west-2.amazonaws.com/jcjohns-models/vgg16-00b39a1b.pth. You can also download the version provided by PyTorch from https://download.pytorch.org/models/vgg16-397923af.pth. After that, check the keys and values in the .pth file and convert the keys into the format defined in our model file. This is the sample code I've used.
import torch
import collections

# Load the torchvision checkpoint (a plain OrderedDict of tensors) and build a new
# dict whose keys match the layer names defined in the model file. Only the
# convolution weights are mapped here.
model1 = torch.load('vgg16-397923af.pth')
model2 = collections.OrderedDict()

model2['conv1_1.weight'] = model1['features.0.weight']
model2['conv1_2.weight'] = model1['features.2.weight']
model2['conv2_1.weight'] = model1['features.5.weight']
model2['conv2_2.weight'] = model1['features.7.weight']
model2['conv3_1.weight'] = model1['features.10.weight']
model2['conv3_2.weight'] = model1['features.12.weight']
model2['conv3_3.weight'] = model1['features.14.weight']
model2['conv4_1.weight'] = model1['features.17.weight']
model2['conv4_2.weight'] = model1['features.19.weight']
model2['conv4_3.weight'] = model1['features.21.weight']
model2['conv5_1.weight'] = model1['features.24.weight']
model2['conv5_2.weight'] = model1['features.26.weight']
model2['conv5_3.weight'] = model1['features.28.weight']

torch.save(model2, 'VGG16.pth')
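If you want to double-check the mapping, both files are plain OrderedDicts of tensors, so you can simply print the keys and shapes and compare them by eye (just a quick sanity check):

import torch

src = torch.load('vgg16-397923af.pth')
dst = torch.load('VGG16.pth')

# List every key with its tensor shape so the renamed entries can be
# compared against the originals.
for key, value in src.items():
    print('source   ', key, tuple(value.shape))
for key, value in dst.items():
    print('converted', key, tuple(value.shape))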
@jcdubron Thanks for your reply! I also found that some images in the dataset don't contain any lanes, so is it necessary to clean the dataset before training? If you could provide your train.txt, that would be even better! Thanks very much!
@WBinke If I recall correctly, there is no need to modify train.txt because the existence of each lane has been labelled in the dataset.
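If you still want to check how many images contain no lane at all, a minimal sketch along these lines works, assuming your list file follows the CULane convention where each line ends with four 0/1 existence flags (the file name list/train_gt.txt is only a guess, substitute your own list):

# Assumption: each line looks like "<image> <segmentation label> 1 1 1 1",
# with the last four fields being the per-lane existence flags.
with open('list/train_gt.txt') as f:
    lines = f.readlines()

no_lane = [line for line in lines if line.split()[-4:] == ['0', '0', '0', '0']]
print(len(no_lane), 'images contain no lane')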
Hi @jcdubron, do you only convert the conv2d weight keys? Don't the BN layers' running_mean keys need to be converted as well?
You can give it a try.
@Cverlpeng Could you please provide the code for converting the pretrained weights including the BN layers?
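Not the original code, but here is a rough sketch of how such a conversion could look if you start from torchvision's VGG16-with-batch-norm checkpoint (https://download.pytorch.org/models/vgg16_bn-6c64b313.pth). The target key names (bn1_1 and so on) are only a guess at what the model file might expect, so adjust them to your own layer names:

import torch
import collections

conv_names = ['conv1_1', 'conv1_2',
              'conv2_1', 'conv2_2',
              'conv3_1', 'conv3_2', 'conv3_3',
              'conv4_1', 'conv4_2', 'conv4_3',
              'conv5_1', 'conv5_2', 'conv5_3']

src = torch.load('vgg16_bn-6c64b313.pth')
dst = collections.OrderedDict()

# In vgg16_bn the 'features' block alternates Conv2d -> BatchNorm2d -> ReLU
# (plus MaxPool2d), so each BatchNorm2d sits at the index right after its conv.
# Convolution weights are the only 4-D tensors, which is how they are found here.
conv_indices = sorted(int(k.split('.')[1]) for k in src
                      if k.startswith('features') and k.endswith('weight') and src[k].dim() == 4)

for name, idx in zip(conv_names, conv_indices):
    bn = idx + 1
    bn_name = name.replace('conv', 'bn')
    dst[name + '.weight'] = src['features.%d.weight' % idx]
    dst[name + '.bias'] = src['features.%d.bias' % idx]
    dst[bn_name + '.weight'] = src['features.%d.weight' % bn]
    dst[bn_name + '.bias'] = src['features.%d.bias' % bn]
    dst[bn_name + '.running_mean'] = src['features.%d.running_mean' % bn]
    dst[bn_name + '.running_var'] = src['features.%d.running_var' % bn]

torch.save(dst, 'VGG16_bn.pth')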