Rafael Valle
Add these two lines before the `import matplotlib.pylab as plt` line:
```python
import matplotlib
matplotlib.use("Agg")
```
https://github.com/openai/improved-gan/blob/master/mnist_svhn_cifar10/nn.py
Your issue may be related to the PyTorch version. Make sure you're running the latest version, and try inference in FP32.
Train a model with 1 step of flow first. Then use this model to warm-start a model with 2 steps of flow.
The validation loss of your 1-step-of-flow model is starting to plateau. Use it to warm-start a 2-steps-of-flow model. I assume the validation loss will go...
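Warm-starting here just means initializing the 2-steps-of-flow model from the 1-step checkpoint: copy every parameter whose name and shape match, and leave the new flow step at its random initialization. A minimal sketch of that matching logic (the `warm_start` helper and the toy state dicts are illustrative, not Flowtron's actual API):

```python
import numpy as np


def warm_start(target_state, source_state):
    """Copy into target_state every source entry whose key exists in
    target_state with a matching shape; return the updated dict plus the
    source keys that were skipped."""
    skipped = []
    for key, value in source_state.items():
        if key in target_state and value.shape == target_state[key].shape:
            target_state[key] = value
        else:
            skipped.append(key)
    return target_state, skipped


# toy state dicts: the 2-flow model has one extra flow step
one_flow = {
    "flows.0.weight": np.array([1.0, 2.0]),
    "embedding.weight": np.array([0.5]),
}
two_flow = {
    "flows.0.weight": np.zeros(2),
    "flows.1.weight": np.zeros(2),  # new step, stays at init
    "embedding.weight": np.zeros(1),
}
merged, skipped = warm_start(two_flow, one_flow)
```

In practice you would do the same over `torch.load(checkpoint)["state_dict"]`, or pass `strict=False` to `load_state_dict` so missing keys are tolerated.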
@adrianastan if you trained a model with speaker embeddings, what happens if you do this: ```flowtron.infer(flowtron.forward(audio, speaker), other_speaker)```
@astricks thank you for letting us know cleaning the data has helped the model learn attention. Can you please share how the attention looks at 200k iterations?
Great. I suggest resuming from the model at 200k iterations, given that it has better generalization loss and less bias in the attention map.
Training on what language? Did you try warm-starting from the pre-trained model? Trimming silences from the beginning and end of audio files helps with learning attention.
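Trimming can be done with, e.g., `librosa.effects.trim`. A minimal NumPy sketch of the idea, using a simple amplitude threshold (the threshold value is illustrative; in practice a dB-scale threshold on frame energy is more robust):

```python
import numpy as np


def trim_silence(audio, threshold=0.01):
    """Remove leading and trailing samples whose magnitude is below threshold."""
    loud = np.flatnonzero(np.abs(audio) > threshold)
    if loud.size == 0:
        return audio[:0]  # entirely silence
    return audio[loud[0]:loud[-1] + 1]


# toy waveform: silence, a short burst, silence
wave = np.concatenate([np.zeros(100), 0.5 * np.ones(50), np.zeros(100)])
trimmed = trim_silence(wave)
# trimmed.size == 50: only the burst survives
```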
Were you able to train the same data on Tacotron before?