nurtas-m
@michem32, OK, I fixed some bugs. Can you please clone the latest version of g2p-seq2seq (ver 6.1.1a0) and try to retrain a new model?
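(A rough sketch of the retraining steps, assuming an install from a source checkout and a dictionary file named cmudict.dict; adjust paths to your setup:)

```
# Get the latest source and install it for the current user
git clone https://github.com/cmusphinx/g2p-seq2seq.git
cd g2p-seq2seq
python setup.py install --user

# Retrain a model from scratch into a fresh model directory
g2p-seq2seq --train cmudict.dict --model_dir new_model_folder
```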
Hello, @michem32 Did you train the new model before launching "--interactive" mode? The model available at [this](https://sourceforge.net/projects/cmusphinx/files/G2P%20Models/g2p-seq2seq-cmudict.tar.gz/download) link is now outdated. We are training new models now, and will...
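(For reference, once a new model has been trained, interactive mode can be launched along these lines; model_folder_path is just a placeholder:)

```
# Start interactive grapheme-to-phoneme conversion with a trained model
g2p-seq2seq --interactive --model_dir model_folder_path
```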
> If I want to continue training a saved model I have to write: g2p-seq2seq --train cmudict.dict --model_dir model_folder_path And, if I want to start training from scratch, I'll...
Hello, @ellurunaresh Please clone the latest version of g2p-seq2seq (6.2.0a0). Also, tensorflow>=1.5.0 is required.
In that case, can you please install tensorflow==1.5.0 only for your user (with the "--user" flag: pip install tensorflow==1.5.0 --user)?
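(In case it helps, a minimal way to confirm that the user-level install picked up the right version:)

```
# Install TensorFlow 1.5.0 into the user site-packages only
pip install --user tensorflow==1.5.0

# Verify which version is actually imported
python -c "import tensorflow as tf; print(tf.__version__)"   # should print 1.5.0
```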
Hello, @ellurunaresh
> 1. How to handle "_UNK" during decoding. Is there any option to set the parameter so that it could take any nearest string?

1. If you work...
Hello, @loretoparisi Which files can you not find in your dict folder? Do you mean that some of your dictionary files were there, and after the training ended some of...
When you launch training mode, the program creates a new model directory at the path you set with the "--model_dir" flag (in your case it will be the ~/docker/g2p-seq2seq/data/models/cmudict-ipa directory). If...
If you have just one file and don't want to split it into train, development, and test sets yourself, you need to launch the program without any "--valid" and...
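(A sketch of the two options; the "--test" flag name here is an assumption based on the test split mentioned below, and the file names are placeholders taken from the commands above:)

```
# Option 1: pass only the training dictionary and let the program split it itself
g2p-seq2seq --train cmudict.dict --model_dir model_folder_path

# Option 2: supply pre-split files explicitly (assumed flag names --valid / --test)
g2p-seq2seq --train cmudict.dict.part.train --valid cmudict.dict.part.dev \
  --test cmudict.dict.part.test --model_dir model_folder_path
```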
Also, if you set the "--cleanup" flag during training, the program will create cleaned-up files without stress marks and comments: cmudict.dict.part.train.cleanup, cmudict.dict.part.dev.cleanup, cmudict.dict.part.test.cleanup. So, if you have an initial dictionary...
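(A minimal sketch of the invocation with cleanup enabled, reusing the placeholder names from above:)

```
# With --cleanup, stress marks and comments are stripped into *.cleanup copies
g2p-seq2seq --train cmudict.dict --cleanup --model_dir model_folder_path
# Produces: cmudict.dict.part.train.cleanup, cmudict.dict.part.dev.cleanup,
#           cmudict.dict.part.test.cleanup
```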