End-to-end-ASR-Pytorch

Very different loss on validation

Open kamilkk852 opened this issue 5 years ago • 1 comment

Just for testing, I'm trying to overfit a very small dataset, and I've set the validation set to be the same as the training set, yet I get very different loss progressions for the two stages. On the training set the loss decreases steadily, but on validation it starts to increase after a few epochs. I do not use dropout. Shouldn't the two losses be roughly the same?
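For reference, this is roughly what such a sanity check looks like: the same tiny dataset is reused for both the training and "validation" loaders and both losses are logged each epoch. The toy linear model and random features below are placeholders standing in for the real data and model, not this repo's actual pipeline.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(0)

# Stand-in for a tiny ASR subset: 32 utterances, 50 frames, 40-dim features,
# with one (fake) frame-level label per time step out of 10 classes.
feats = torch.randn(32, 50, 40)
labels = torch.randint(0, 10, (32, 50))
tiny_set = TensorDataset(feats, labels)

train_loader = DataLoader(tiny_set, batch_size=8, shuffle=True)
val_loader = DataLoader(tiny_set, batch_size=8)  # identical data on purpose

model = nn.Sequential(nn.Linear(40, 64), nn.ReLU(), nn.Linear(64, 10))
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for epoch in range(20):
    model.train()
    train_loss = 0.0
    for x, y in train_loader:
        optim.zero_grad()
        logits = model(x)                                # (B, T, C)
        loss = criterion(logits.reshape(-1, 10), y.reshape(-1))
        loss.backward()
        optim.step()
        train_loss += loss.item()

    model.eval()
    val_loss = 0.0
    with torch.no_grad():
        for x, y in val_loader:
            logits = model(x)
            val_loss += criterion(logits.reshape(-1, 10), y.reshape(-1)).item()

    print(f"epoch {epoch}: train {train_loss / len(train_loader):.4f} "
          f"val {val_loss / len(val_loader):.4f}")
```

With identical batches and an eval-mode forward pass that matches training (no dropout here), the two curves should track each other closely.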

kamilkk852 avatar Jun 26 '19 09:06 kamilkk852

Hi @kamilkk852

In my experience, the validation loss typically starts growing after a short period. You should evaluate your model based on the validation error rate instead. Also, needing A LOT of training data is in the nature of end-to-end ASR models; they usually perform poorly on small corpora.
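For what it's worth, a minimal sketch of tracking a validation error rate instead of the loss could look like the following. The loader interface and the decoding function are assumptions, not this repo's actual API; only the error-rate bookkeeping is the point.

```python
import torch

def levenshtein(ref, hyp):
    """Edit distance between two token sequences (plain Python lists)."""
    dp = list(range(len(hyp) + 1))           # DP row for an empty reference
    for i, r in enumerate(ref, 1):
        prev, dp[0] = dp[0], i
        for j, h in enumerate(hyp, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,         # deletion
                                     dp[j - 1] + 1,     # insertion
                                     prev + (r != h))   # substitution
    return dp[-1]

def error_rate(model, loader, decode_fn):
    """Corpus-level token error rate: total edits / total reference tokens."""
    total_edits, total_tokens = 0, 0
    model.eval()
    with torch.no_grad():
        for feats, refs in loader:            # refs: list of token-id lists
            hyps = decode_fn(model, feats)    # hyps: list of token-id lists
            for ref, hyp in zip(refs, hyps):
                total_edits += levenshtein(ref, hyp)
                total_tokens += len(ref)
    return total_edits / max(total_tokens, 1)

# Usage would look something like:
#     cer = error_rate(model, val_loader, greedy_decode)
# where greedy_decode is whatever hypothesis-producing function your setup
# exposes (a hypothetical name here, not this repo's API).
```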

Alexander-H-Liu avatar Oct 29 '19 09:10 Alexander-H-Liu