TernausNet
Are the pretrained weights frozen while training?
Hi! Great work! I just wanted to ask in more detail whether you freeze the pretrained weights during training. Thanks!
Of course not.
I think the pretrained weights should be frozen (at least at first), or some of the advantage they give you will be erased. For instance, the images below show training/validation loss curves (on a small segmentation dataset) for three scenarios; freezing the pretrained weights gave the best performance (a freezing sketch follows the list).
- Frozen weights on pretrained vgg11 encoder: *(loss curves image)*
- Unfrozen weights on pretrained encoder: *(loss curves image)*
- Randomly initialized (pretrained=False): *(loss curves image)*
Has the author already provided pretrained weights?