Tacotron-2
Loss Exploded?
When I train WaveNet with LJSpeech, I always get a "Loss Exploded" error, even if I resume training from the checkpoint. I'm using the master branch unmodified and don't know why this happens. Does anyone know the cause and can help me?
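For context, the "Loss Exploded" message usually comes from a sanity check in the training loop that aborts when the loss turns NaN or crosses a threshold. A minimal sketch of that kind of guard (the exact threshold and message in this repo may differ):

```python
import numpy as np

def check_loss_exploded(loss, step):
    # Abort training when the loss is NaN or blows past a sanity bound;
    # this is the condition that produces the "Loss Exploded" message.
    if np.isnan(loss) or loss > 100.0:
        raise Exception('Loss exploded to {:.5f} at step {}'.format(loss, step))
```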
I have the same problem, and the loss is very unsteady. Hopefully someone can figure it out.
There is a similar issue when training on LibriTTS
I'm training on the real (ground-truth) mel-spectrograms, without global conditioning and without GTA.
I changed input_type to "mulaw-quantize", set quantize_channels and out_channels to 256, and redid the preprocessing and GTA synthesis. The loss explosion is now gone, but the eval results are still very bad after 115k steps.
(I'm working on Chinese.)
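For anyone wanting to try the same change, a minimal sketch of the hyperparameter override, assuming the parameter names match this repo's hparams.py:

```python
# Sketch of the hparams change described above; the parameter names are
# assumed to match this repo's hparams.py.
from hparams import hparams

# Switch WaveNet to 8-bit mu-law categorical output: the network then
# predicts a 256-way softmax, so quantize_channels and out_channels
# must both be 256.
hparams.parse('input_type=mulaw-quantize,quantize_channels=256,out_channels=256')
```

The same string can typically be passed via the --hparams command-line flag instead of editing hparams.py; remember to rerun preprocessing and GTA synthesis afterwards, as noted above.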
I'm experiencing the same thing, any news on this?
I've actually found that if you revert to the checkpoint just before the one where training got stuck and the loss exploded, the older checkpoint will continue training without a problem. This may need to be repeated each time you get stuck.
In my case, I just edited the step numbers in the checkpoint file and deleted the most recent checkpoint data and index files that caused the hang.
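To make the rollback concrete, here is a minimal sketch of that procedure, assuming a standard TensorFlow 1.x checkpoint layout; the log directory, checkpoint name, and step numbers below are placeholders:

```python
import os
import tensorflow as tf

# Placeholders -- point these at your own run.
ckpt_dir = 'logs-Tacotron-2/wave_pretrained'   # assumed WaveNet log dir
good_step, bad_step = 110000, 115000           # hypothetical step numbers

# Rewrite the 'checkpoint' state file so it points at the last good
# checkpoint instead of the one whose loss keeps exploding.
good_prefix = os.path.join(ckpt_dir, 'wavenet_model.ckpt-{}'.format(good_step))
tf.train.update_checkpoint_state(ckpt_dir, good_prefix)

# Delete the .data and .index files of the bad checkpoint so training
# cannot silently resume from it again.
for fname in os.listdir(ckpt_dir):
    if 'ckpt-{}.'.format(bad_step) in fname:
        os.remove(os.path.join(ckpt_dir, fname))
```

Editing the `checkpoint` text file by hand, as described above, achieves the same thing; the script just makes the rollback repeatable.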