Seq2seq-Chatbot-for-Keras

This repository contains a new generative chatbot model based on seq2seq modeling.

11 Seq2seq-Chatbot-for-Keras issues

Can you please explain why this error is occurring?

```
Traceback (most recent call last):
  File "conversation.py", line 215, in <module>
    Q = tokenize(query)
  File "conversation.py", line 117, in tokenize
    X =...
```

Could you please explain how I can provide my own custom data with a different conversation flow, and how each conversation can be given in a single text file.
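For what it's worth, here is a generic sketch of one way to lay out conversations in a single text file and turn them into training pairs, purely as an illustration; the separator convention and the file name are assumptions, not necessarily the format this repository expects.

```python
# Hypothetical format (an assumption, not necessarily what this repository
# expects): one utterance per line, conversations separated by a blank line;
# consecutive lines then form (context, reply) training pairs.
def load_conversation_pairs(path="my_dialogs.txt"):  # example file name
    pairs = []
    conversation = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                conversation.append(line)
            else:
                # Blank line ends the current conversation.
                pairs.extend(zip(conversation[:-1], conversation[1:]))
                conversation = []
    pairs.extend(zip(conversation[:-1], conversation[1:]))
    return pairs
```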

```
Starting the model...
Traceback (most recent call last):
  File "c:\...\conversation.py", line 156, in <module>
    out = Dense(dictionary_size/2, activation="relu", name='relu activation')(merge_layer)
  File "c:\...\venv\lib\site-packages\keras\engine\base_layer.py", line 463, in __call__
    self.build(unpack_singleton(input_shapes))
  File "c:\...\venv\lib\site-packages\keras\layers\core.py",...
```

Traceback (most recent call last): File "D:\Seq2seq-Chatbot-for-Keras-master\conversation.py", line 151, in out = Dense(dictionary_size/2, activation="relu", name='relu-activation')(merge_layer) File "C:\Users\sheen\AppData\Local\Programs\Python\Python36\lib\site-packages\keras\engine\base_layer.py", line 431, in __call__ self.build(unpack_singleton(input_shapes)) File "C:\Users\sheen\AppData\Local\Programs\Python\Python36\lib\site-packages\keras\layers\core.py", line 866, in build constraint=self.kernel_constraint) File...

Not a big deal, but on Ubuntu 16.04 with Python 2.7, Keras does not like the spaces in the layer names. Just replace the spaces with a - and all works well. Example:...
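A minimal sketch of the two changes that usually clear the Dense tracebacks quoted above, assuming the dictionary_size and merge_layer built earlier in conversation.py; the floor-division fix is an assumption for Python 3 and is not confirmed by the reports themselves.

```python
from keras.layers import Input, Dense

# Hypothetical stand-ins for objects built earlier in conversation.py.
dictionary_size = 7000
merge_layer = Input(shape=(256,))

# Fix 1 (from the report above): no spaces in layer names,
# i.e. 'relu activation' -> 'relu-activation'.
# Fix 2 (an assumption for Python 3): Dense expects an integer number of
# units, so use floor division; dictionary_size / 2 is a float there and
# fails inside build().
out = Dense(dictionary_size // 2, activation="relu",
            name='relu-activation')(merge_layer)
```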

In conversation.py:

```
Shared_Embedding = Embedding(output_dim=word_embedding_size, input_dim=dictionary_size, weights=[embedding_matrix], input_length=maxlen_input, name='Shared')
```

However, it produces the error above. Should we re-import the embeddings?
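As background, a generic sketch of how such an embedding matrix is usually assembled from pretrained vectors before being handed to Embedding(weights=[...]); the sizes, toy vocabulary, and random vectors below are assumptions for illustration, not the repository's actual setup.

```python
import numpy as np
from keras.layers import Embedding

# Assumed sizes and a toy vocabulary; in a real script these would come from
# the tokenizer and a pretrained vector file (e.g. GloVe).
word_embedding_size = 100
dictionary_size = 7000
maxlen_input = 50
word_to_index = {"hello": 1, "world": 2}
pretrained = {"hello": np.random.rand(word_embedding_size),
              "world": np.random.rand(word_embedding_size)}

# Row i of embedding_matrix holds the vector for the word with index i;
# words without a pretrained vector stay as zeros.
embedding_matrix = np.zeros((dictionary_size, word_embedding_size))
for word, idx in word_to_index.items():
    if word in pretrained:
        embedding_matrix[idx] = pretrained[word]

Shared_Embedding = Embedding(output_dim=word_embedding_size,
                             input_dim=dictionary_size,
                             weights=[embedding_matrix],
                             input_length=maxlen_input,
                             name='Shared')
```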

What can be the root cause for this?

```
for m in range(Epochs):
    # Loop over training batches due to memory constraints:
    for n in range(0, round_exem, step):
        q2 = q[n:n+step]
        print(q2)...
```
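For comparison, a minimal, self-contained sketch of a memory-constrained batch loop of this shape using train_on_batch; the toy model, the label array, and the shapes are assumptions, not the repository's training code.

```python
import numpy as np
from keras.models import Model
from keras.layers import Input, Dense

# Toy stand-ins for the real seq2seq model and data arrays.
inp = Input(shape=(10,))
model = Model(inp, Dense(1, activation="sigmoid")(inp))
model.compile(optimizer="adam", loss="binary_crossentropy")

q = np.random.rand(1000, 10)            # assumed "questions" array
y = np.random.randint(0, 2, (1000, 1))  # assumed targets
Epochs, step = 5, 128
round_exem = (len(q) // step) * step    # drop the last partial batch

for m in range(Epochs):
    # Loop over training batches due to memory constraints:
    for n in range(0, round_exem, step):
        q2 = q[n:n + step]
        y2 = y[n:n + step]
        loss = model.train_on_batch(q2, y2)
    print("epoch %d, last batch loss %.4f" % (m, loss))
```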

Hi, I just had a question regarding the parameters of the model: did you set them empirically, or did you use an optimization technique to choose the parameters of...

Thanks for your repo, which gives me a lot of inspiration. To the best of my knowledge, attention and pointer mechanisms are popular in sequence-to-sequence tasks such as chatbots....