FOTS_TF
Hyperparameters when training on SynthText800?
Hello, when you train on SynthText800 starting from the pretrained ResNet-50 model, what are your hyperparameter settings?
You may refer to the TextSpotter paper; I'm sorry, I've forgotten the details.
Thank you very much, I'll refer to this paper.
Hello, in your code for training the recognition branch, it seems the training is still end-to-end. Shouldn't the first `compute_gradients` line be uncommented, and the second one commented out?

```python
if FLAGS.train_stage == 1:
    print("Train recognition branch only!")
    recog_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope='recog')
    #grads = opt.compute_gradients(total_loss, recog_vars)
    grads = opt.compute_gradients(total_loss)
```
When I use `grads = opt.compute_gradients(total_loss, recog_vars)`, training takes about 26 s per step, while training the network end-to-end takes only about 14 s per step. Do you know the reason? Thank you.
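For reference, restricting updates to one branch via the `var_list` argument of `Optimizer.compute_gradients` can be sketched with a minimal toy graph (this is not the repository's actual model; the two scalar "branches" below are stand-ins for the detection and recognition sub-networks, and the code is written against `tf.compat.v1` so it runs under TensorFlow 2.x):

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

# Two toy "branches": one detection variable and one recognition variable.
with tf.variable_scope('detect'):
    w_det = tf.get_variable('w', initializer=1.0)
with tf.variable_scope('recog'):
    w_rec = tf.get_variable('w', initializer=1.0)

# A shared loss that depends on both branches.
total_loss = tf.square(w_det) + tf.square(w_rec)

opt = tf.train.GradientDescentOptimizer(learning_rate=0.1)

# Stage 1: collect only the recognition-scope variables and pass them as
# var_list, so gradients are applied to that branch alone.
recog_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope='recog')
grads = opt.compute_gradients(total_loss, var_list=recog_vars)
train_op = opt.apply_gradients(grads)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(train_op)
    det_val, rec_val = sess.run([w_det, w_rec])
    # The detection variable stays frozen at 1.0; only the
    # recognition variable is updated (1.0 - 0.1 * 2 * 1.0 = 0.8).
    print(det_val, rec_val)
```

Note that even with `var_list` restricted, backpropagation still has to traverse the shared graph to reach the listed variables, so a smaller `var_list` does not by itself guarantee a faster step.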