quora_duplicate
config doesn't have any attribute called WORD_EMBEDDING_DIM or MAX_NB_WORDS
When using decomposable_attention to train the model, it throws an exception because there is no attribute in config called WORD_EMBEDDING_DIM. I think you should change this to the following:
# Build the embedding matrix, reading the vocabulary cap and the embedding
# dimension from TrainConfig instead of config.
nb_words = min(TrainConfig.MAX_NB_WORDS, len(word_index)) + 1
embedding_matrix = np.zeros((nb_words, TrainConfig.WORD_EMBEDDING_DIM))
for word, i in word_index.items():
    if word in word2vec.vocab:
        embedding_matrix[i] = word2vec.word_vec(word)
print('Null word embeddings: %d' % np.sum(np.sum(embedding_matrix, axis=1) == 0))
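For reference, this is a minimal sketch of the two attributes the snippet expects on TrainConfig; the values below are placeholders I chose for illustration, not the project's actual settings:

class TrainConfig:
    # Placeholder values for illustration; adjust to the project's real settings.
    MAX_NB_WORDS = 200000       # maximum vocabulary size kept from word_index
    WORD_EMBEDDING_DIM = 300    # dimensionality of the pre-trained word2vec vectors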