
A PyTorch implementation of "Neural Speech Synthesis with Transformer Network"

37 Transformer-TTS issues

I just want to know why we are using ARPABET letters, and whether the given pre-trained model was trained with ARPABET characters. Because when I tried to pass the...

help wanted
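To illustrate what ARPABET input looks like, here is a minimal sketch of grapheme-to-phoneme substitution. The `CMUDICT_SAMPLE` dictionary and the `to_arpabet` helper are purely illustrative; a real pipeline would look words up in the full CMU Pronouncing Dictionary rather than a hand-written sample, and this repo's actual preprocessing may differ.

```python
# Toy CMUdict-style lexicon; the real CMU Pronouncing Dictionary has
# ~130k entries. Entries here are hand-picked examples only.
CMUDICT_SAMPLE = {
    "speech": ["S", "P", "IY1", "CH"],
    "model": ["M", "AA1", "D", "AH0", "L"],
}

def to_arpabet(text, lexicon=CMUDICT_SAMPLE):
    """Replace known words with {ARPABET} token groups, keeping
    out-of-vocabulary words as plain graphemes (a common convention)."""
    out = []
    for word in text.lower().split():
        if word in lexicon:
            out.append("{" + " ".join(lexicon[word]) + "}")
        else:
            out.append(word)  # fall back to character input
    return " ".join(out)
```

The usual motivation is that phoneme input removes spelling ambiguity, so the model does not have to learn English pronunciation rules from spectrograms alone.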

Hi, can you clarify what exactly you mean when you say **"It was very important to concatenate the input and context vectors in the Attention mechanism."** Also, could you...
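For readers puzzling over the same sentence, here is a framework-agnostic sketch of what "concatenate the input and context vectors" could mean in a dot-product attention step: the attention output is concatenated with the layer's input instead of being used on its own. All names here (`attend_and_concat`, the shapes) are illustrative, not the repo's API.

```python
import numpy as np

def attend_and_concat(query, keys, values, x):
    """Scaled dot-product attention over (keys, values) for a single
    query, returning [x ; context] instead of the context alone.

    query: (d,)   keys: (T, d)   values: (T, d)   x: (d_x,)
    """
    scores = keys @ query / np.sqrt(query.shape[-1])   # (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                           # softmax over T
    context = weights @ values                         # (d,)
    return np.concatenate([x, context])                # (d_x + d,)
```

A plausible reading of the quoted remark is that the following projection sees both what the decoder was fed and what it attended to, rather than the attention summary alone.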

Thanks for your great work! I ran your code and it works well. By the way, I use a dynamic "max_len" instead of 400 when I synthesize the speech, but it has...
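A dynamic length cap usually means stopping autoregressive decoding when a stop signal fires instead of always running a fixed 400 steps. The sketch below shows that control flow; `decode_step` is a hypothetical callable standing in for one decoder iteration, not a function from this repo.

```python
def synthesize(decode_step, max_len=400, stop_threshold=0.5):
    """Generate frames until the stop probability crosses a threshold
    or `max_len` is reached, whichever comes first.

    decode_step: hypothetical callable taking the frames generated so
    far and returning (next_frame, stop_probability).
    """
    frames = []
    for _ in range(max_len):
        frame, stop_prob = decode_step(frames)
        frames.append(frame)
        if stop_prob > stop_threshold:   # dynamic stopping point
            break
    return frames
```

With a fixed `max_len` the model keeps emitting frames past the end of speech, which is one common source of trailing noise in synthesized audio.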

Why is the speed of synthesis so slow? How can I solve it? Need some help! Thanks!

May I know how to plot the attention figures shown in the ./png folder? I can only get black-and-white attention figures in TensorBoard.

Hello. Firstly, thanks for the implementation. I used the pre-trained model and ran the synthesis.py file. I saw that the input text is "Transformer model is so fast!" while the...

When I tried to train my own Transformer, I found the decoder is so powerful that it can generate the spectrogram using almost no context information. Do you...

In your code, you don't use the stop_token to compute the loss. Why?

I noticed that you added a stop token to the model but didn't include the stop_token in the loss. I tried to add a stop_token loss myself, but then the model no longer...

help wanted
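For context on what a stop-token loss term looks like, here is a framework-agnostic sketch of binary cross-entropy on stop-token logits. The function name and the `pos_weight` knob are illustrative; the idea of upweighting the rare positive (end-of-utterance) frames comes from Tacotron-style systems, where the stop token otherwise tends never to fire.

```python
import numpy as np

def stop_token_bce(logits, targets, pos_weight=1.0):
    """Binary cross-entropy with logits for the stop token.

    logits:  raw stop-token scores, one per decoder frame.
    targets: 0.0 for "keep going" frames, 1.0 for final frame(s).
    pos_weight: upweights the rare positive class (final frames are a
    tiny fraction of each utterance), analogous to the pos_weight
    argument of PyTorch's BCEWithLogitsLoss.
    """
    p = 1.0 / (1.0 + np.exp(-logits))            # sigmoid
    eps = 1e-12                                  # numerical safety
    loss = -(pos_weight * targets * np.log(p + eps)
             + (1.0 - targets) * np.log(1.0 - p + eps))
    return loss.mean()
```

This term is added to the spectrogram reconstruction loss; without it the model has no supervised signal telling it when to stop during inference, which is presumably what the two issues above are circling around.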

Is there any plan to support Chinese? How is the performance compared to Tacotron?