Shinji Watanabe
LGTM. If @siddhu001 is OK, I'll merge this.
Thanks a lot, @akreal! This is really great!
Yeah, that sounds like a very useful option. One of the most frequent questions is actually about it. However, this is also useful in general since it lets users know that the...
> @sw005320 I think dividing to each direction is quite inefficient in this case due to the storage issue (this data consumes a few TB). I see, but is there...
Thanks for the report. @simpleoier, could you check it?
Thanks for your report! > 1. I thought of checking out the streaming ASR for that instance, but unfortunately I couldn't find any pre-trained model of the [Streaming Transformer](https://arxiv.org/pdf/2006.14941.pdf) ASR...
[st.sh] Removing utterance IDs leaves trailing whitespace, which further affects detokenizer.pl
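A minimal sketch of one possible workaround, assuming a Kaldi-style `text` file (utterance ID followed by the transcript); the file paths and the `cut`/`sed` pipeline are illustrative, not the actual st.sh fix:

```sh
# Illustrative only: drop the utterance ID (first field) and trim the
# trailing whitespace that removal can leave before detokenization.
cut -d' ' -f2- data/test/text \
  | sed -e 's/[[:space:]]*$//' \
  | perl detokenizer.pl -l en \
  > data/test/text.detok
```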
Thanks for your report. @ftshijt, can you check this?
@siddhu001, can you review this PR?
@SujaySKumar, a reminder: please follow @siddhu001's suggestions.
There are a bit too many config files. If all of them have some results, that is fine (but please add the results). Otherwise, please keep just a few important ones.