Bo Shao

Results 10 comments of Bo Shao

The trained weights are more sensitive to the question length than to any words in the question. For example, if you had a training record like: Q: Something A: My answer....

You can find a very well-organized and cleaned conversational dataset (about 160K pairs) for training a chatbot here: https://github.com/bshao001/ChatLearner. That repository also contains scripts and instructions to preprocess Reddit data in...

No. I am no longer working on this project, nor do I plan to support it. If you seriously plan to develop a chatbot, I suggest you use a Transformer model instead of...

It is very likely that your data has some kind of problem, but I cannot tell without looking at the details. You can google around, and I believe I...

Roughly yes, although GRU is used in the implementation.

Yes. An attention mechanism is involved in this NMT model.

@hamedkhanpour Try:

from tensorflow.python.ops import rnn_cell_impl
linear = rnn_cell_impl._linear

I am trying to incorporate my_seq2seq.py into my chatbot in TensorFlow 1.2, and it does not work. The changes required were substantial, but still no luck.

I was thinking of performing left padding at training time as well. But in most cases, we may not need batch inference, which requires no padding at all, which...
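The left-padding idea mentioned above can be sketched as follows. This is a minimal illustration, not code from the repository; the function name and the pad id of 0 are assumptions for the example.

```python
PAD_ID = 0  # assumed pad token id; the actual vocabulary may differ

def pad_batch(sequences, pad_id=PAD_ID, left=True):
    """Pad a batch of token-id sequences to the longest length in the batch.

    Left padding keeps the final real tokens aligned at the right edge,
    which can matter for decoders that condition on the last time step.
    A single sequence (batch of one) needs no padding at all.
    """
    max_len = max(len(s) for s in sequences)
    if left:
        return [[pad_id] * (max_len - len(s)) + s for s in sequences]
    return [s + [pad_id] * (max_len - len(s)) for s in sequences]
```

For example, `pad_batch([[1, 2, 3], [4]])` yields `[[1, 2, 3], [0, 0, 4]]`, while right padding would yield `[[1, 2, 3], [4, 0, 0]]`.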

Thanks for your quick response. I will give it a try when the model trained with a much larger dataset is ready.