
CTC MWER loss may sometimes be negative?

Open sheirving opened this issue 4 years ago • 3 comments

Thank you very much for sharing the code, but when I train a model using the CTC MWER loss, it sometimes becomes negative. What could be the reason?

sheirving avatar Jul 14 '21 08:07 sheirving

> Thank you very much for sharing the code, but when I train a model using the CTC MWER loss, it sometimes becomes negative. What could be the reason?

https://github.com/TeaPoly/CTC-OptimizedLoss/blob/c5b0c17ce5134a6e45fed1cad1a02906932d944d/mwer_loss.py#L75

It may become negative when the word-error counts are re-normalized (the average word error over the N-best list is subtracted).
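
For illustration, here is a minimal sketch (plain NumPy, not the repository code, with made-up numbers) of how that re-normalization can push the expected value below zero:

```python
import numpy as np

# Minimal sketch (not the repository code) of the re-normalization step
# that can make the MWER loss negative.

# Hypothetical N-best list: word-error counts per hypothesis and the
# probabilities re-normalized over the N-best list (they sum to 1).
word_errors = np.array([1.0, 2.0, 3.0, 6.0])   # W(y_i, y*)
nbest_probs = np.array([0.6, 0.2, 0.1, 0.1])   # P_hat(y_i | x)

# Subtract the average word errors over the N-best list (variance reduction).
avg_errors = word_errors.mean()                # 3.0
relative_errors = word_errors - avg_errors     # [-2., -1., 0., 3.]

# Expected *relative* word errors: negative whenever the probability mass
# sits on hypotheses with fewer errors than the N-best average.
mwer_loss = float(np.sum(nbest_probs * relative_errors))
print(mwer_loss)                               # -1.1
```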

TeaPoly avatar Jul 19 '21 07:07 TeaPoly

Hi, thanks for sharing. Is it normal for the loss to be negative?

zhhuang93 avatar Dec 08 '22 20:12 zhhuang93

> Hi, thanks for sharing. Is it normal for the loss to be negative?

Yes, it is normal to observe the loss value getting smaller and smaller, and even becoming negative, because the average word error over the N-best list is subtracted during normalization. For details, please refer to the following formula from the paper https://arxiv.org/abs/1712.01818. In addition, do not use the MWER loss at the initial stage of training; use the CTC loss for initial training instead.

(Screenshot, 2022-12-09: the MWER formula from the paper.)
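
For reference, the formula in the screenshot is, as best I can reconstruct it from the paper (the notation here may differ slightly from the original):

```latex
% Reconstruction (not a verbatim transcription of the screenshot) of the
% N-best MWER loss from arXiv:1712.01818, with the average word errors
% \hat{W} over the N-best list subtracted for variance reduction.
\mathcal{L}_{\mathrm{MWER}}(x, y^{*})
  \approx \sum_{y_i \in \mathrm{NBest}(x, N)}
      \hat{P}(y_i \mid x)\,\bigl[\, W(y_i, y^{*}) - \hat{W} \,\bigr],
\quad\text{where}\quad
\hat{P}(y_i \mid x) = \frac{P(y_i \mid x)}{\sum_{y_j \in \mathrm{NBest}(x, N)} P(y_j \mid x)}.
```

Since the term W(y_i, y*) - Ŵ is negative for hypotheses with fewer errors than the N-best average, the weighted sum (and hence the loss) becomes negative once most of the probability mass sits on better-than-average hypotheses.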

TeaPoly avatar Dec 09 '22 01:12 TeaPoly