CRAFT-Reimplementation
when and when not to freeze
You freeze vgg16_bn weights in trainic15data.py but not in trainSyndata.py.
- Why do you freeze in `trainic15data.py` and not in `trainSyndata.py`?
- What is the benefit of freezing? I assumed we should ALWAYS let gradients flow through both `vgg16` and `CRAFT`.
- Have you trained `vgg16` with `CRAFT` starting from PyTorch's pretrained weights? If so, how was the performance?
- Saving the model like this:
```python
import os
import torch

net = CRAFT()
net.load_state_dict(copyStateDict(torch.load('CRAFT-Reimplemetation/pretrain/SynthText.pth')))
torch.save(net.state_dict(), os.path.join("pretrain", "test.pth"))
```
only saves weights of CRAFT and not vgg16_bn. If I were to train both CRAFT and vgg16, both unfrozen, how would I save both CRAFT and vgg16's network together in a single model?
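On the saving question: in PyTorch, a module's `state_dict()` already contains the parameters of every registered submodule, so if `vgg16_bn` is an attribute of `CRAFT` (an assumption about this repo, though most implementations register the backbone this way), a single `torch.save(net.state_dict(), ...)` persists both networks. A minimal sketch with toy stand-in modules:

```python
import torch
import torch.nn as nn

class Backbone(nn.Module):          # stand-in for vgg16_bn
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3)

class Model(nn.Module):             # stand-in for CRAFT
    def __init__(self):
        super().__init__()
        self.basenet = Backbone()   # registered submodule: its weights are included
        self.head = nn.Conv2d(8, 2, 1)

net = Model()
# The backbone's weights appear in the parent's state_dict under "basenet.*",
# so saving net.state_dict() saves both the head and the backbone together.
print(sorted(net.state_dict().keys()))
# keys include 'basenet.conv.weight', 'basenet.conv.bias', 'head.weight', 'head.bias'
```

So there is normally no need to save the backbone separately; loading works the same way, via `net.load_state_dict(...)` on the parent module.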
@ThisIsIsaac The author freezes the model to train the LinkRefiner (see pages 11 and 12 of the paper). When you train my reimplementation, you do not need to freeze anything.
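For reference, freezing a backbone in PyTorch just means turning off gradients for its parameters and leaving them out of the optimizer. A minimal sketch with toy modules (the module names here are illustrative, not what this repo uses):

```python
import torch
import torch.nn as nn

# Toy stand-ins: "backbone" plays the role of vgg16_bn, "head" the CRAFT head.
backbone = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU())
head = nn.Conv2d(8, 2, 1)

# Freeze the backbone: no gradients are computed for its parameters.
for p in backbone.parameters():
    p.requires_grad = False

# Give the optimizer only the parameters that still require gradients.
trainable = [p for p in list(backbone.parameters()) + list(head.parameters())
             if p.requires_grad]
optimizer = torch.optim.Adam(trainable, lr=1e-3)

x = torch.randn(1, 3, 16, 16)
loss = head(backbone(x)).sum()
loss.backward()

print(all(p.grad is None for p in backbone.parameters()))      # frozen: no grads
print(all(p.grad is not None for p in head.parameters()))      # head: has grads
```

Unfreezing is the reverse: set `requires_grad = True` again and hand those parameters back to the optimizer.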
@ThisIsIsaac I've fixed the mistake. Em... I think you can train on MLT. The IC15 training code may have some mistakes; I will refine it when I am not busy.
> I've fixed the mistake.

What mistake are you referring to?

> The IC15 training code may have some mistakes; I will refine it when I am not busy.
Maybe I can help out. I've also fixed some critical errors and done some cleanup. I would like to send a PR soon.
@ThisIsIsaac In `trainic15data.py`, line 114 should be `freeze=False`. Sorry, I do not know what PR means.
> I do not know what PR means.

PR stands for "pull request".

> In `trainic15data.py`, line 114 should be `freeze=False`.

Will apply that to my copy of the code. Thanks!