
Aggregation Cross-Entropy for Sequence Recognition. CVPR 2019.

19 Aggregation-Cross-Entropy issues

I recreated your project and found that the input GT is converted into a word list, which loses its order, and your prediction only provides the number of characters....
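The order-insensitivity noted above follows directly from how ACE aggregates predictions: the loss compares per-class character counts, not sequences. Below is a minimal NumPy sketch of that aggregation, assuming class 0 is the blank class; the function name `ace_loss` and the epsilon are illustrative choices, not the authors' implementation.

```python
import numpy as np

def ace_loss(probs, label, num_classes):
    """Sketch of Aggregation Cross-Entropy (CVPR 2019 formulation).

    probs: (T, C) per-timestep softmax outputs, class 0 assumed to be blank.
    label: list of class indices; only their counts matter, not their order.
    """
    T = probs.shape[0]
    # Aggregate predictions over time: expected fraction of each class.
    y = probs.sum(axis=0) / T                                  # shape (C,)
    # Ground-truth character counts; leftover timesteps count as blank.
    counts = np.bincount(np.asarray(label), minlength=num_classes).astype(float)
    counts[0] = T - len(label)
    n_bar = counts / T
    # Cross-entropy between the two normalized count distributions.
    return -np.sum(n_bar * np.log(y + 1e-12))
```

Because the label enters only through `np.bincount`, permuting it leaves the loss unchanged, which is exactly the behavior the issue describes.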

Training with a fixed input size (32 × 280) and a fixed number of characters (10) gives good results only for short text.

In the paper (https://arxiv.org/pdf/1904.08364.pdf), Sec. 3.2 states: "We borrow the concept of cross-entropy from information theory, which is designed to measure the 'distance' between two probability distributions."...

3C-FCRN+B_SLD+SLD (the proposed residual LSTM) gets ICDAR CR 97.15 / AR 96.50, but ACE gets only CR 96.70 / AR 96.22. So does the gain in HCTR come from ACE, or from the residual LSTM?

The result is n × 7 × 7 × 26; how do I match it to a word?
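Since ACE aggregates a 2D prediction grid into per-character counts, the spatial order is not recoverable; one hypothetical way to "match a word" is to compare the aggregated counts against the character histograms of a lexicon. The sketch below assumes a single (7, 7, 26) prediction over lowercase letters; `match_word` and the nearest-histogram rule are illustrative assumptions, not the paper's decoder.

```python
import numpy as np

def match_word(pred, lexicon):
    """Pick the lexicon word whose character histogram is closest to the
    expected character counts aggregated from a (7, 7, 26) prediction grid.
    """
    # Flatten the 7x7 spatial grid and sum: expected count per letter.
    counts = pred.reshape(-1, pred.shape[-1]).sum(axis=0)    # shape (26,)

    def hist(word):
        h = np.zeros(pred.shape[-1])
        for ch in word:
            h[ord(ch) - ord('a')] += 1.0
        return h

    # Nearest histogram in Euclidean distance (illustrative choice).
    return min(lexicon, key=lambda w: np.linalg.norm(counts - hist(w)))
```

Note that anagrams ("listen" vs. "silent") have identical histograms, so a count-based match cannot distinguish them; that is the price of the order-free aggregation.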

Hello, I am opening this topic to ask whether ACE would be useful in speech recognition tasks. I am going to test your ACE loss on my acoustic model. Hope...