deep-text-recognition-benchmark
Can't fine-tune on my custom dataset
I am trying to fine-tune the pretrained EasyOCR model english_g2.pth (which I found on my local system under ./EasyOCR/model). I only need to fine-tune it on digits (0...9). I get the error below when I run the following command with the --saved_model option:

```
python3 train.py --train_data lmdb_output_train --valid_data lmdb_output_val --select_data "/" --batch_ratio 1.0 --Transformation TPS --FeatureExtraction ResNet --SequenceModeling BiLSTM --Prediction Attn --batch_size 2 --data_filtering_off --workers 0 --batch_max_length 3 --num_iter 50 --valInterval 5 --saved_model english_g2.pth
```
But if I run the same command without --saved_model (i.e. training from scratch), everything works fine.
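For reference, this is how I inspected which parameter names and shapes the checkpoint actually contains (a minimal sketch; it assumes only that PyTorch is installed and that english_g2.pth is in the working directory, and that the file is a plain state_dict, which the error below suggests):

```python
import torch

# Load the EasyOCR checkpoint on CPU and list every parameter
# name and shape it contains.
state = torch.load("english_g2.pth", map_location="cpu")
for name, tensor in state.items():
    print(name, tuple(tensor.shape))
```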
```
RuntimeError: Error(s) in loading state_dict for DataParallel:
    Missing key(s) in state_dict: "module.Transformation.LocalizationNetwork.conv.0.weight" .....
```

and the second part of the error:
```
Unexpected key(s) in state_dict: "module.FeatureExtraction.ConvNet.0.weight", "module.FeatureExtraction.ConvNet.0.bias", "module.FeatureExtraction.ConvNet.3.weight", "module.FeatureExtraction.ConvNet.3.bias", "module.FeatureExtraction.ConvNet.6.weight", "module.FeatureExtraction.ConvNet.6.bias", "module.FeatureExtraction.ConvNet.8.weight", "module.FeatureExtraction.ConvNet.8.bias", "module.FeatureExtraction.ConvNet.11.weight", "module.FeatureExtraction.ConvNet.12.weight", "module.FeatureExtraction.ConvNet.12.bias", "module.FeatureExtraction.ConvNet.12.running_mean", "module.FeatureExtraction.ConvNet.12.running_var", "module.FeatureExtraction.ConvNet.12.num_batches_tracked", "module.FeatureExtraction.ConvNet.14.weight", "module.FeatureExtraction.ConvNet.15.weight", "module.FeatureExtraction.ConvNet.15.bias", "module.FeatureExtraction.ConvNet.15.running_mean", "module.FeatureExtraction.ConvNet.15.running_var", "module.FeatureExtraction.ConvNet.15.num_batches_tracked", "module.FeatureExtraction.ConvNet.18.weight", "module.FeatureExtraction.ConvNet.18.bias", "module.Prediction.weight", "module.Prediction.bias".
size mismatch for module.SequenceModeling.0.rnn.weight_ih_l0: copying a param with shape torch.Size([1024, 256]) from checkpoint, the shape in current model is torch.Size([1024, 512]).
size mismatch for module.SequenceModeling.0.rnn.weight_ih_l0_reverse: copying a param with shape torch.Size([1024, 256]) from checkpoint, the shape in current model is torch.Size([1024, 512]).
```
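In case it helps with diagnosing, I also grouped the checkpoint keys by their top-level stage, since the key names in the error (e.g. "module.FeatureExtraction.ConvNet.0.weight") suggest the second dotted component names the stage. This is a sketch under the same assumptions as above:

```python
import torch
from collections import Counter

state = torch.load("english_g2.pth", map_location="cpu")
# Count parameters per stage (Transformation, FeatureExtraction,
# SequenceModeling, Prediction) to see which stages the checkpoint has.
stages = Counter(key.split(".")[1] for key in state)
print(stages)
```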
Can anyone help me figure this out?