research-seq2seq-HTR
Attention weights remain constant during training
Thank you very much for sharing your code.
I am trying to reproduce your results on the IAM database. Training runs fine and the loss is decreasing; however, the attention weights remain at 0 throughout training.
Do you have an idea of what may be causing this issue?
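For reference, here is a minimal sketch of how I am sanity-checking the attention distribution. This is my own test code, not taken from your repo, and names like `AdditiveAttention` and `hidden_dim` are my assumptions. Since the weights come out of a softmax, each row should sum to 1, so strictly zero weights would suggest the visualization is reading the wrong tensor (near-uniform weights of ~1/src_len can also look "flat" in a plot):

```python
# Minimal sketch (not the repo's exact code): a Bahdanau-style additive
# attention module plus a check that the weights form a valid distribution.
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    def __init__(self, hidden_dim):
        super().__init__()
        self.W_h = nn.Linear(hidden_dim, hidden_dim, bias=False)  # encoder projection
        self.W_s = nn.Linear(hidden_dim, hidden_dim, bias=False)  # decoder projection
        self.v = nn.Linear(hidden_dim, 1, bias=False)             # scoring vector

    def forward(self, dec_state, enc_outputs):
        # dec_state: (batch, hidden), enc_outputs: (batch, src_len, hidden)
        scores = self.v(torch.tanh(
            self.W_h(enc_outputs) + self.W_s(dec_state).unsqueeze(1)
        )).squeeze(-1)                                # (batch, src_len)
        weights = torch.softmax(scores, dim=-1)       # rows must sum to 1
        context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)
        return context, weights

attn = AdditiveAttention(hidden_dim=256)
dec_state = torch.randn(2, 256)
enc_outputs = torch.randn(2, 100, 256)
_, weights = attn(dec_state, enc_outputs)
# A softmax output can never be all zeros: each row sums to 1 by construction.
print(weights.sum(dim=-1))   # expect tensor([1., 1.])
print(weights.max().item())  # ~1/src_len here would mean near-uniform, not zero
```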
Here is an example of a test image after ~20 epochs.