
wordVectors in seq2seq

Open guoxuxu opened this issue 6 years ago • 0 comments

Hi, I have a question about the use of wordVectors in seq2seq.py, since I need to adapt the code to a dataset of very long sentences (I saw that sentence length is capped at 15 in the createTrainingMatrices function in seq2seq.py, but my samples are closer to short paragraphs ^_^). The createTrainingMatrices function in seq2seq.py builds new vectors for every sentence from word indices, so why not use the pre-trained embeddingMatrix.npy produced by word2vec.py? The line "wordVectors = np.load('models/embeddingMatrix.npy')" appears in seq2seq.py, but the loaded matrix doesn't actually seem to be used anywhere?
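For what it's worth, here is a minimal NumPy sketch of what I mean by using the pre-trained matrix: rows of embeddingMatrix.npy are looked up by the word indices that createTrainingMatrices produces. The shapes, the `indices_to_vectors` helper, and the random stand-in matrix are my own illustration, not code from the repo:

```python
import numpy as np

# Hypothetical stand-in for models/embeddingMatrix.npy produced by word2vec.py:
# one row per vocabulary word, one column per embedding dimension.
vocab_size, embed_dim = 10, 4
rng = np.random.default_rng(0)
wordVectors = rng.standard_normal((vocab_size, embed_dim)).astype(np.float32)

def indices_to_vectors(sentence_indices, embedding_matrix):
    """Map a sequence of word indices (like those built by
    createTrainingMatrices) to their pre-trained embedding rows."""
    return embedding_matrix[np.asarray(sentence_indices)]

sentence = [2, 5, 7]  # toy index sequence for one sentence
vecs = indices_to_vectors(sentence, wordVectors)
print(vecs.shape)  # one embedding row per word in the sentence
```

This is how I would expect the loaded wordVectors to be consumed; instead, as far as I can tell, the matrix is loaded and then never referenced.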

guoxuxu — Nov 27 '18 14:11