pflowtts_pytorch
Jump in sub_loss/train_dur_loss_step
Hi @p0p4k,

I hope this message finds you well. I am currently training the pflowtts model on my own dataset and have run into unexpected behavior that I'm hoping to get some help with.
During training, I've observed significant jumps in the `sub_loss/train_dur_loss_step` metric, as illustrated in the screenshot below:
I have followed the recommended setup and training guidelines, but I am unsure what might be causing these fluctuations. Here are some details about my training configuration and dataset:
```yaml
batch_size: 64
n_spks: 1
...
data_statistics:
  mel_mean: -6.489412784576416
  mel_std: 2.281172275543213
```
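For context, I derived the `data_statistics` values from my own dataset with a script along these lines (a minimal sketch; `mel_statistics` and the synthetic-data example are my own, not a utility from the repo):

```python
import numpy as np

def mel_statistics(mel_batches):
    """Compute a global mean/std over all mel-spectrogram frames.

    mel_batches: iterable of 2-D arrays shaped (n_mels, n_frames).
    Accumulates running sums so the whole dataset never needs to
    fit in memory at once.
    """
    total, total_sq, count = 0.0, 0.0, 0
    for mel in mel_batches:
        mel = np.asarray(mel, dtype=np.float64)
        total += mel.sum()
        total_sq += np.square(mel).sum()
        count += mel.size
    mean = total / count
    # std via E[x^2] - E[x]^2 on the accumulated sums
    std = np.sqrt(total_sq / count - mean ** 2)
    return mean, std

# Quick check on synthetic mels drawn near my dataset's statistics:
rng = np.random.default_rng(0)
mels = [rng.normal(-6.5, 2.3, size=(80, n)) for n in (120, 200, 95)]
mean, std = mel_statistics(mels)
```

The resulting `mean`/`std` are what I plugged into the `data_statistics` section of the config.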
I would greatly appreciate any insights or suggestions that might help resolve this issue. Are there known factors that could lead to this behavior, or additional steps I could take to stabilize the training loss?