FOTS_TF

Hyperparameters when training on SynthText800?

Open LizzieOneDay opened this issue 5 years ago • 3 comments

Hello, when you trained on SynthText800 from the pretrained ResNet-50 model, what were your hyperparameter settings?

LizzieOneDay avatar Dec 18 '19 05:12 LizzieOneDay

I may have referred to the TextSpotter paper. I'm sorry, I've forgotten the details.

Pay20Y avatar Jan 11 '20 03:01 Pay20Y

> I may have referred to the TextSpotter paper. I'm sorry, I've forgotten the details.

Thank you very much, I'll refer to this paper.

LizzieOneDay avatar Jan 15 '20 06:01 LizzieOneDay

> I may have referred to the TextSpotter paper. I'm sorry, I've forgotten the details.

Hello, about training the recognition branch in your code: it seems the training is still end-to-end. Shouldn't the first `compute_gradients` line be uncommented and the second one commented out?

```python
if FLAGS.train_stage == 1:
    print("Train recognition branch only!")
    recog_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope='recog')
    #grads = opt.compute_gradients(total_loss, recog_vars)
    grads = opt.compute_gradients(total_loss)
```
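For reference, the difference between the two `compute_gradients` calls can be sketched in a minimal TF1-style example (the variable scopes and names below are illustrative, not taken from the repo): passing `var_list` restricts the returned gradient pairs to the recognition-scope variables, while omitting it computes gradients for every trainable variable.

```python
import tensorflow.compat.v1 as tf  # TF1-style graph mode

tf.disable_eager_execution()

# Two variables in different scopes, standing in for the detection
# and recognition branches (illustrative names, not from FOTS_TF).
with tf.variable_scope('detect'):
    w_det = tf.get_variable('w', initializer=1.0)
with tf.variable_scope('recog'):
    w_rec = tf.get_variable('w', initializer=2.0)

loss = w_det * w_det + w_rec * w_rec
opt = tf.train.GradientDescentOptimizer(0.1)

# Collect only variables under the 'recog' scope.
recog_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope='recog')

grads_recog = opt.compute_gradients(loss, recog_vars)  # recognition branch only
grads_all = opt.compute_gradients(loss)                # end-to-end: all trainables

print(len(grads_recog), len(grads_all))  # 1 2
```

So with `train_stage == 1`, using `recog_vars` as the `var_list` is what actually limits the update to the recognition branch; the uncommented line trains everything.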

Also, when I use `grads = opt.compute_gradients(total_loss, recog_vars)`, training takes about 26 s per step, while training the network end-to-end takes only about 14 s per step. Do you know the reason? Thank you.

LizzieOneDay avatar Jan 15 '20 08:01 LizzieOneDay