about rnn_train.py
Hi, I have some questions about rnn_train.py. When I follow the README and try to run rnn_train.py, it says that four arguments, --train_filelist_path, --dev_filelist_path, --out_dir, and --config, are required. How should I set these four arguments? For example, is train_filelist_path the path to a txt file that lists the paths of the raw audio files I want to train on? Is dev_filelist_path the path to a txt file that lists the paths of the dev raw audio files? Is out_dir just an empty directory? And how should I set config?
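For example, is the filelist format something like this? (These paths and the file extension are only placeholders I made up.)

```
# train_filelist.txt (my guess: one raw audio file path per line)
/data/dns/training_set/file_0001.pcm
/data/dns/training_set/file_0002.pcm
/data/dns/training_set/file_0003.pcm
```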
Another problem: when I follow the README and execute mkdir bin && cd bin and then cmake .., a warning message is shown: [WARNING]nnet_data.cpp is not exist. Do not generate inference executable... Does this need to be resolved?
Thank you for your help
There is a Kaldi-style utils/run.sh file that can pass these arguments to rnn_train.py; the values depend on the parameters you set in run.sh.
I would recommend doing it this way instead of passing the parameters manually.
You should generate the datasets with the DNS-Challenge synthesizer before you run this script.
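As a rough sketch (the variable names and file names below are only illustrative, not the exact contents of run.sh), the script ends up calling rnn_train.py roughly like this:

```bash
#!/bin/bash
# Illustrative only -- check run.sh in the repo for the real variable names.
# The filelists point at the data generated by the DNS-Challenge synthesizer.
train_filelist=data/train_filelist.txt   # placeholder path
dev_filelist=data/dev_filelist.txt       # placeholder path
out_dir=exp/percepnet                    # any writable output directory
config=config.yaml                       # placeholder config name

python rnn_train.py \
  --train_filelist_path "$train_filelist" \
  --dev_filelist_path "$dev_filelist" \
  --out_dir "$out_dir" \
  --config "$config"
```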
the "nnet_data.cpp" file is used for inference project, once you got a pretrained model, the nnet_data.cpp can be generated by python dump_percepnet.py model.pt, and then, you can CMake again, to make the inference project.