
Jumps in sub_loss/train_dur_loss_step

vn09 opened this issue 1 year ago • 6 comments

Hi @p0p4k ,

I hope this message finds you well. I am currently working on training the pflowtts model with my own dataset and have encountered an unexpected behavior that I'm hoping to get some assistance with.

During training, I've observed significant jumps in the sub_loss/train_dur_loss_step metric, as illustrated in the screenshot below:

[Screenshot 2023-12-04 at 17:54:38: loss curve with large spikes in sub_loss/train_dur_loss_step]

I have followed the recommended setup and training guidelines, but I am unsure what is causing these fluctuations. Here are the relevant parts of my training configuration:

```yaml
batch_size: 64
n_spks: 1
...
data_statistics:
  mel_mean: -6.489412784576416
  mel_std: 2.281172275543213
```
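As a sanity check on the `data_statistics` values above, it may be worth recomputing the global mel mean/std over the dataset and confirming they match the config; mismatched statistics can distort the normalized targets the duration predictor trains against. A minimal, dependency-light sketch (the function name and data layout are illustrative, not part of the repo):

```python
import math

def compute_data_statistics(mels):
    """Global mean and std over every value of every mel-spectrogram.

    mels: iterable of 2-D mel spectrograms, each a list of frames,
    each frame a list of floats. Uses running sums so the whole
    dataset never has to sit in memory at once.
    """
    total = 0.0
    total_sq = 0.0
    count = 0
    for mel in mels:
        for frame in mel:
            for v in frame:
                total += v
                total_sq += v * v
                count += 1
    mean = total / count
    # var = E[x^2] - (E[x])^2
    std = math.sqrt(total_sq / count - mean ** 2)
    return mean, std
```

If the recomputed values differ noticeably from the ones in the config (for example after changing the dataset or the mel extraction settings), the statistics should be regenerated before training.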

I would greatly appreciate any insights or suggestions. Are there known factors that could lead to such behavior, or additional steps I could take to stabilize the training loss?
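One common, general mitigation for occasional loss spikes (regardless of what turns out to be the root cause here) is gradient-norm clipping between `backward()` and the optimizer step; in PyTorch that is typically `torch.nn.utils.clip_grad_norm_`. A dependency-free sketch of what that clipping does, purely to illustrate the mechanism:

```python
import math

def clip_grad_norm(grads, max_norm):
    """Scale a flat list of gradient values so their global L2 norm
    does not exceed max_norm; returns the pre-clip norm (mirroring
    the behavior of torch.nn.utils.clip_grad_norm_)."""
    total_norm = math.sqrt(sum(g * g for g in grads))
    if total_norm > max_norm:
        scale = max_norm / (total_norm + 1e-6)
        grads[:] = [g * scale for g in grads]
    return total_norm
```

Clipping caps the size of any single update, so an outlier batch (e.g. one utterance with an extreme duration target) cannot throw the parameters far off even if its raw loss spikes.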

vn09 • Dec 04 '23