Aggregation-Cross-Entropy
Aggregation Cross-Entropy for Sequence Recognition. CVPR 2019.
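Several of the issues below concern how the ACE loss behaves during training. As a point of reference, here is a minimal NumPy sketch of the loss under my reading of the paper: per-class probabilities are aggregated over all timesteps, normalized by sequence length, and compared via cross-entropy against the normalized character counts of the label (with the blank class absorbing the remaining mass). The function name and details are my own; verify against the repo's `seq_module.py`.

```python
import numpy as np

def ace_loss(probs, counts):
    """Sketch of Aggregation Cross-Entropy (assumed formulation; verify
    against the official implementation in seq_module.py).

    probs:  (T, C) per-timestep softmax outputs; class 0 is the blank.
    counts: (C,) character counts of the label; counts[0] is overwritten
            with T minus the total character count, so counts sums to T.
    """
    T = probs.shape[0]
    counts = counts.astype(float)           # copy so the caller's array is untouched
    counts[0] = T - counts[1:].sum()        # blank absorbs the remaining timesteps
    y = probs.sum(axis=0) / T               # aggregated, normalized class probabilities
    n = counts / T                          # normalized label-count distribution
    return -(n * np.log(y + 1e-10)).sum()   # cross-entropy between the two
```

Note that the minimum of this loss is the entropy of the count distribution, not zero, so a "declining but nonzero" loss curve is expected even when training goes well.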
I trained a CRNN model on the Synth90k dataset. Although the loss declines step by step, the accuracy stays near 0 the whole time. What causes this problem?
I reproduced CRNN+CTC and tested it on the IIIT5K, SVT, IC03, and IC13 test sets, getting a WER of 0.153, which matches the results reported in the paper. I also reproduced CRNN+ACE loss, but only got...
I pretrained a model using CTC loss and it works well. Then I loaded the weights and continued training with the ACE loss. The loss seemed to be coming down, but...
Hi, have you experimented on HWDB 2.0-2.2? Could you share your results for ACE? Thanks
I use the ACE loss function for English handwriting recognition. The model does not converge during training, but with CTC it converges well. How...
Very nice work. Can this method be combined with CTC in the 1D case to improve performance further? Does it conflict with CTC during training?
Why can the general loss function be estimated via Eq. (2) in Section 3? Isn't each probability term in the estimation formula much larger than the corresponding term in the general loss function?
In https://github.com/summerlvsong/Aggregation-Cross-Entropy/blob/master/source/models/seq_module.py#L65, should it be pred_string_set = [pred_string[i:i+self.w] for i in xrange(0, len(pred_string), self.w)] instead of self.w*2? Please verify. Thanks
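To make the question above concrete, here is a standalone version of the row-splitting expression being discussed (a sketch, not the repo's code): for a 2D prediction flattened row-major with row width w, stepping by w recovers every row, while stepping by w*2 would skip every other row.

```python
def split_rows(pred_string, w):
    """Split a row-major flattened 2D prediction into rows of width w.

    Stepping the start index by w visits every row; a step of w*2
    (as currently in seq_module.py#L65) would drop alternate rows.
    """
    return [pred_string[i:i + w] for i in range(0, len(pred_string), w)]
```

For example, split_rows("abcdef", 3) yields ["abc", "def"], whereas a step of 6 would yield only ["abc"].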
Since the model can only recognize the characters and their counts, what is the accuracy criterion for the 2D prediction in the paper?