heyelin
Two colors in a char
WordArt like this: (image missing) or some special fonts like this: (image missing)
As shown in the image, a large amount of blank space is left around the text.
Instructions for updating: Please switch to tf.train.get_or_create_global_step
INFO:tensorflow:Restoring parameters from /home/ucmed/opt/python/models-master/research/attention_ocr/python/logs/model.ckpt-0
INFO 2019-01-03 02:14:41.000888: tf_logging.py: 82 Restoring parameters from /home/ucmed/opt/python/models-master/research/attention_ocr/python/logs/model.ckpt-0
INFO:tensorflow:Starting Session.
INFO 2019-01-03 02:14:55.000713: tf_logging.py: 82 Starting Session.
INFO:tensorflow:Saving...
An error is reported after making the change in 37.
How can data parallelism be done? Training only uses one GPU.
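A minimal sketch of one way to get data-parallel multi-GPU training, assuming TensorFlow 2-style APIs (tf.distribute.MirroredStrategy) and a hypothetical toy model standing in for the OCR network; it does not follow the repo's own train.py or its flags:

# A minimal data-parallel training sketch, assuming TensorFlow 2-style APIs.
# MirroredStrategy replicates the model on every visible GPU and splits each
# global batch across the replicas.
import numpy as np
import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()  # picks up all visible GPUs
print("Replicas in sync:", strategy.num_replicas_in_sync)

with strategy.scope():
    # Hypothetical toy network; replace with the actual OCR model.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(32,)),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )

# Dummy data; the global batch of 128 is divided among the replicas.
x = np.random.rand(1024, 32).astype("float32")
y = np.random.randint(0, 10, size=(1024,)).astype("int64")
model.fit(x, y, batch_size=128, epochs=1)

With N GPUs, each replica processes batch_size / N examples per step and gradients are averaged across replicas, so the per-GPU memory footprint stays roughly the same as single-GPU training at the smaller per-replica batch.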
Training consumes a large amount of memory, nearly 20 GB. Changing the number of LSTM hidden layers, reducing batch_size, and so on have had no obvious effect. How should the program be modified to substantially reduce memory consumption?
Hello, could you provide a pretrained model?