
End-To-End Memory Network using TensorFlow

11 memn2n issues

Given the intended doctest

```
>>> tokenize('Bob dropped the apple. Where is the apple?')
['Bob', 'dropped', 'the', 'apple', '.', 'Where', 'is', 'the', 'apple', '?']
```

we should write like...
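One way to satisfy that doctest is a regex-based tokenizer. This is a sketch, not necessarily the implementation the issue author had in mind:

```python
import re

def tokenize(sent):
    """Split a sentence into word and punctuation tokens.

    >>> tokenize('Bob dropped the apple. Where is the apple?')
    ['Bob', 'dropped', 'the', 'apple', '.', 'Where', 'is', 'the', 'apple', '?']
    """
    # \w+ matches runs of word characters; [^\w\s] matches single
    # punctuation characters, so '.' and '?' become their own tokens.
    return re.findall(r"\w+|[^\w\s]", sent)

print(tokenize('Bob dropped the apple. Where is the apple?'))
```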

`n_train/20`, `n_val/20`, and `n_test/20` cause errors in Python 3. I changed `n_train/20` to `n_train//20`, `n_val/20` to `n_val//20`, and `n_test/20` to `n_test//20`, and it works.
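The underlying change is that in Python 3, `/` on integers returns a float, while `//` keeps floor division. A quick illustration (the variable name mirrors the issue; the value is made up):

```python
n_train = 1000  # hypothetical count; the real value comes from the dataset split

# Python 2: 1000 / 20 -> 50 (int). Python 3: 1000 / 20 -> 50.0 (float),
# which fails anywhere an integer index or size is required.
true_div = n_train / 20    # 50.0 (float)
floor_div = n_train // 20  # 50 (int)
print(true_div, floor_div)
```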

On this [line](https://github.com/domluna/memn2n/blob/8a4915ac18e95db3828f3ad862609718c28a17e8/memn2n/memn2n.py#L76), a comment mentions that there is no support for jagged arrays, but TensorFlow v2.1.0 has introduced [RaggedTensor](https://www.tensorflow.org/api_docs/python/tf/RaggedTensor). It would be nice if support for this feature...
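Without ragged support, jagged data has to be padded to a rectangle before it becomes a dense tensor. A plain-Python sketch of that padding step (the helper name and pad value are illustrative, not from the repository):

```python
def pad_jagged(rows, pad_value=0):
    """Right-pad a jagged list of lists to a rectangular 2-D list.

    Dense tensors cannot hold rows of different lengths, so each row is
    padded to the longest row's length; tf.RaggedTensor would make this
    padding unnecessary.
    """
    width = max(len(row) for row in rows)
    return [row + [pad_value] * (width - len(row)) for row in rows]

print(pad_jagged([[1, 2, 3], [4], [5, 6]]))
# [[1, 2, 3], [4, 0, 0], [5, 6, 0]]
```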

`m_C = tf.reduce_sum(m_emb_C * self._encoding, 2)` `c_temp = tf.transpose(m_C, [0, 2, 1])` In this part, I think the first line's `reduce_sum` should turn the matrix into two dimensions, so I think...
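A quick shape check helps here. Assuming `m_emb_C` is 4-D, `(batch, memory_size, sentence_size, embedding_size)` (these shapes are my assumption, not stated in the issue), summing over axis 2 leaves a 3-D tensor, so the three-element permutation in `tf.transpose` is still valid:

```python
def reduced_shape(shape, axis):
    """Shape after summing over `axis`, like tf.reduce_sum without keepdims."""
    return shape[:axis] + shape[axis + 1:]

def transposed_shape(shape, perm):
    """Shape after permuting axes, like tf.transpose with `perm`."""
    return tuple(shape[i] for i in perm)

# Hypothetical shapes: (batch, memory_size, sentence_size, embedding_size)
m_emb_C_shape = (32, 10, 6, 20)
m_C_shape = reduced_shape(m_emb_C_shape, 2)            # (32, 10, 20): 3-D, not 2-D
c_temp_shape = transposed_shape(m_C_shape, (0, 2, 1))  # (32, 20, 10)
print(m_C_shape, c_temp_shape)
```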

`# nonlinearity` `if self._nonlin:` `u_k = nonlin(u_k)` `u.append(u_k)` Unresolved reference `nonlin` — how do I fix it?
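One plausible fix, since the surrounding condition already reads `self._nonlin`, is to call the attribute instead of the bare name. A minimal runnable sketch of that hop update (the class is a stand-in, not the repository's model class):

```python
class HopSketch:
    """Minimal stand-in for the model; `_nonlin` is an optional callable."""

    def __init__(self, nonlin=None):
        self._nonlin = nonlin

    def update(self, u_k, u):
        # nonlinearity: call self._nonlin, not the unresolved bare name `nonlin`
        if self._nonlin:
            u_k = self._nonlin(u_k)
        u.append(u_k)
        return u

print(HopSketch(nonlin=abs).update(-3, []))  # [3]
```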

https://nlp.stanford.edu/blog/a-new-multi-turn-multi-domain-task-oriented-dialogue-dataset/

Hi! I have a question: can this model be used for dialog tasks? My main concern is that dialog tasks assume working in seq2seq mode, and I'm not sure...

I am trying to see the memory-slot probabilities (the probabilities associated with the different sentences) for a particular query. Is there a way to visualize them? Please help. Thanks, Joe
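The attention over memory slots is a softmax over query–memory match scores, so one way to inspect it is to fetch those scores from the graph and normalize them yourself. A sketch with made-up scores (in the real model the scores come from the graph, not a hand-written list):

```python
import math

def softmax(scores):
    """Turn raw match scores into memory-slot probabilities."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical match scores between one query and three memory sentences.
probs = softmax([2.0, 1.0, 0.1])
print([round(p, 3) for p in probs])  # highest score -> highest probability
```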

Labels: enhancement, help wanted

Assuming I could define a custom gradient for the nil embedding, the memory, i.e. the variables A, B, TA, and TB, could live in a separate memory component. The main benefit...
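For context, keeping the nil (padding) embedding fixed is typically done by zeroing row 0 of the embedding gradient before the optimizer update. A plain-Python sketch of that masking (the name echoes the repository's `zero_nil_slot` helper, but this list-based version is illustrative only):

```python
def zero_nil_slot(grad_rows):
    """Zero the gradient row for the nil (index-0) embedding so it stays fixed."""
    width = len(grad_rows[0])
    return [[0.0] * width] + [list(row) for row in grad_rows[1:]]

print(zero_nil_slot([[0.5, -0.5], [1.0, 2.0], [3.0, 4.0]]))
# [[0.0, 0.0], [1.0, 2.0], [3.0, 4.0]]
```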

Labels: enhancement