ELMoForManyLangs

How to use ELMo embeddings instead of nn.Embedding?

pikaliov opened this issue 5 years ago • 0 comments

Hi, I can't figure out how to use ELMo embeddings instead of word2vec in the code. Could you guide me or link to example code? Thanks.

# load the pretrained word2vec vectors as a float tensor
embedding = torch.from_numpy(embedding).float()
# create an embedding layer matching the vocabulary size and vector dimension
self.embedding = nn.Embedding(embedding.shape[0], embedding.shape[1])
# initialize the layer's weights from the pretrained vectors
self.embedding.weight = nn.Parameter(embedding)
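A minimal sketch of the usual approach: unlike static word2vec vectors, ELMo embeddings are contextual, so they cannot be loaded into an `nn.Embedding` lookup table at all. Instead, the model takes precomputed ELMo vectors (e.g. from ELMoForManyLangs' `Embedder.sents2elmo`, which returns one array of shape `(seq_len, 1024)` per sentence) directly as its input, and the embedding layer is dropped. The `Encoder` class, hidden size, and dummy input below are illustrative assumptions, not part of the repo's code.

```python
import torch
import torch.nn as nn


class Encoder(nn.Module):
    """Toy encoder that consumes precomputed contextual embeddings.

    There is no nn.Embedding here: the ELMo vectors (assumed 1024-dim,
    as produced by ELMoForManyLangs) are fed straight into the LSTM.
    """

    def __init__(self, elmo_dim=1024, hidden=256):
        super().__init__()
        self.rnn = nn.LSTM(elmo_dim, hidden, batch_first=True)

    def forward(self, elmo_vectors):
        # elmo_vectors: (batch, seq_len, elmo_dim), e.g. padded/stacked
        # outputs of Embedder.sents2elmo(sentences)
        out, _ = self.rnn(elmo_vectors)
        return out


model = Encoder()
# dummy tensor standing in for a batch of ELMo outputs:
# 2 sentences, 7 tokens each, 1024-dim vectors
dummy = torch.randn(2, 7, 1024)
out = model(dummy)
```

In practice you would call `embedder.sents2elmo(list_of_tokenized_sentences)` once (or on the fly per batch), convert the returned NumPy arrays with `torch.from_numpy(...).float()`, and pad them to a common length before stacking into the batch tensor.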

pikaliov · May 23 '19 15:05