
how to use pre-trained word embedding

Open xiaotongshi opened this issue 5 years ago • 0 comments

from keras.models import Sequential
from keras.layers import Embedding, Bidirectional, LSTM
# AttentionDecoder is this repository's custom layer

model = Sequential()
model.add(Embedding(vocab_size, VOCAB_REP_DIM, input_length=WINDOW_SIZE, weights=[embedding_matrix]))
model.add(Bidirectional(LSTM(HIDDEN_DIM, return_sequences=True)))
model.add(AttentionDecoder(HIDDEN_DIM, vocab_size))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['acc'])
model.summary()

I would like to use a pre-trained word2vec embedding, with vocab_size = 149, VOCAB_REP_DIM = 100, and WINDOW_SIZE = 10.
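For context, the `embedding_matrix` passed to `weights=[...]` is usually built by copying each token's pre-trained vector into the row given by its integer id. A minimal sketch, assuming hypothetical stand-ins `word_index` (token to id) and `word_vectors` (token to 100-d vector, e.g. from a word2vec model) in place of the real vocabulary:

```python
import numpy as np

vocab_size, VOCAB_REP_DIM = 149, 100

# Hypothetical vocabulary and pre-trained vectors (stand-ins for word2vec output)
word_index = {f"tok{i}": i for i in range(vocab_size)}
word_vectors = {w: np.random.rand(VOCAB_REP_DIM) for w in word_index}

# Row i of embedding_matrix holds the vector for the token with id i,
# which is how the Embedding layer looks vectors up
embedding_matrix = np.zeros((vocab_size, VOCAB_REP_DIM))
for word, i in word_index.items():
    vec = word_vectors.get(word)
    if vec is not None:  # tokens missing from the pre-trained model stay zero
        embedding_matrix[i] = vec

print(embedding_matrix.shape)  # (149, 100)
```

Passing `trainable=False` to the Embedding layer additionally freezes these weights during training, if that is desired.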

But I get this error:

ValueError: Error when checking input: expected embedding_1_input to have 2 dimensions, but got array with shape (152548, 10, 149)
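The shape (152548, 10, 149) suggests the inputs were one-hot encoded over the 149-word vocabulary, while an Embedding layer expects 2-D integer index arrays of shape (samples, WINDOW_SIZE) and does the lookup itself. A minimal sketch of the conversion, using a small random stand-in for the real data:

```python
import numpy as np

WINDOW_SIZE, vocab_size = 10, 149

# Hypothetical one-hot input of shape (num_samples, WINDOW_SIZE, vocab_size),
# mirroring the (152548, 10, 149) array from the error message
X_onehot = np.eye(vocab_size)[np.random.randint(0, vocab_size, size=(4, WINDOW_SIZE))]

# Collapse the one-hot axis back to integer token ids, the 2-D input
# shape the Embedding layer expects
X_indices = np.argmax(X_onehot, axis=-1)

print(X_indices.shape)  # (4, 10)
```

The one-hot form is still appropriate for the targets, since the model is trained with categorical_crossentropy.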

Does anyone know how to use a pre-trained word embedding here? Thanks in advance.

xiaotongshi, Sep 02 '19 20:09