
Are the pretrained weights frozen while training?

Open prateekmalhotra-hover opened this issue 6 years ago • 3 comments

Hi! Great work! I just wanted to ask in more detail: while training, are you freezing the pretrained weights? Thanks!
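For context, "frozen" here just means the encoder parameters have `requires_grad = False` and therefore receive no gradient updates. A quick way to inspect this on a loaded encoder (a sketch, not the repo's own code; `pretrained=True` reflects the torchvision API of that era):

```python
from torchvision.models import vgg11

# Pretrained VGG11 feature layers, as TernausNet's UNet11 uses for its encoder.
encoder = vgg11(pretrained=True).features

# A parameter is "frozen" when requires_grad is False: it receives no gradient
# and the optimizer never updates it.
total = sum(1 for _ in encoder.parameters())
frozen = sum(1 for p in encoder.parameters() if not p.requires_grad)
print(f"{frozen}/{total} encoder parameter tensors are frozen")
```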

prateekmalhotra-hover · Jul 23 '19 20:07

Of course not.

iperov · Aug 31 '19 13:08

I think the pretrained weights should be frozen (at least at first), or some of the advantage they give you will be erased. For example, the images below show training/validation loss curves (on a small segmentation dataset) for three different scenarios; the best performance came from keeping the pretrained encoder frozen (a minimal freezing sketch follows the list).

  1. Frozen weights on a pretrained VGG11 encoder (loss curves: ternaus_frozen)

  2. Unfrozen weights on a pretrained encoder (loss curves: DRIVE_ternausnet_loss_curves)

  3. Randomly initialized encoder, pretrained=False (loss curves: DRIVE_ternausnet_loss_curves_pt_false)
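Scenario 1 can be reproduced in a few lines. The sketch below assumes the TernausNet repo's UNet11 class, whose pretrained VGG11 layers are exposed as model.encoder; treat the import path and attribute names as assumptions if your copy of the code differs.

```python
import torch
from ternausnet.models import UNet11  # assumed import path for the repo's models.py

# Load the model with an ImageNet-pretrained VGG11 encoder.
model = UNet11(pretrained=True)

# Freeze the encoder: its weights keep their pretrained values during training.
for param in model.encoder.parameters():
    param.requires_grad = False

# Hand the optimizer only the trainable (decoder) parameters.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad),
    lr=1e-4,
)
```

Unfreezing later for fine-tuning (closer to scenario 2) is just setting `requires_grad` back to True on the encoder and re-creating the optimizer with the full parameter list.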

godsmokescrack · Feb 23 '21 13:02

Has the author already provided pretrained weights?

rrryan2016 · Sep 17 '21 12:09