ELMoForManyLangs
How to use elmo embedding instead of nn.Embedding?
Hi, I can't figure out how to use ELMo embeddings instead of word2vec in my code. Can you guide me or link to example code? Thanks. Here is how I currently load the word2vec vectors:
import torch
import torch.nn as nn

# convert the pretrained numpy matrix to a float tensor
embedding = torch.from_numpy(embedding).float()
# shape: (vocab_size, embedding_dim)
self.embedding = nn.Embedding(embedding.shape[0], embedding.shape[1])
# initialize the layer's weights with the pretrained vectors
self.embedding.weight = nn.Parameter(embedding)
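The key difference is that ELMo embeddings are contextual: you don't look up a fixed vector per word id with `nn.Embedding`; instead you run whole sentences through the ELMo model (in this repo, `Embedder('model_dir').sents2elmo(sents)`, which returns one `(sentence_length, 1024)` numpy array per sentence) and feed those arrays straight into the rest of your network. Below is a minimal sketch of that pattern; `fake_sents2elmo` is a hypothetical stand-in for the real `sents2elmo` call so the snippet runs without a downloaded model, and `pad_batch` shows how to pad the variable-length outputs into one batch array:

```python
import numpy as np

# Hypothetical stand-in for elmoformanylangs.Embedder.sents2elmo:
# returns one (sentence_length, dim) float32 array per input sentence.
def fake_sents2elmo(sents, dim=1024):
    return [np.random.rand(len(s), dim).astype(np.float32) for s in sents]

def pad_batch(elmo_vecs):
    """Pad per-sentence ELMo arrays into one (batch, max_len, dim) array."""
    max_len = max(v.shape[0] for v in elmo_vecs)
    dim = elmo_vecs[0].shape[1]
    batch = np.zeros((len(elmo_vecs), max_len, dim), dtype=np.float32)
    for i, v in enumerate(elmo_vecs):
        batch[i, : v.shape[0]] = v  # leave the tail zero-padded
    return batch

sents = [["hello", "world"], ["a", "longer", "sentence", "here"]]
vecs = fake_sents2elmo(sents, dim=8)      # with the real library: e.sents2elmo(sents)
batch = pad_batch(vecs)                    # shape (2, 4, 8)
```

You would then convert `batch` with `torch.from_numpy(batch)` and pass it to the layers that previously consumed the `nn.Embedding` output, dropping the `nn.Embedding` layer entirely.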