OpenNRE-PyTorch
about word_embedding
The source code uses nn.Embedding to load the pretrained word embeddings:
```python
self.word_embedding = nn.Embedding(self.config.data_word_vec.shape[0], self.config.data_word_vec.shape[1])
self.word_embedding.weight.data.copy_(torch.from_numpy(self.config.data_word_vec))
```
However, the optimizer will still update the embedding layer's weights during training. Is that a bug, or is it on purpose?
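For reference, if the intent were to keep the pretrained vectors fixed, a minimal sketch of how that is usually done in PyTorch (not OpenNRE's actual code; `data_word_vec` here is a random stand-in for `config.data_word_vec`):

```python
import numpy as np
import torch
import torch.nn as nn

# Hypothetical pretrained matrix standing in for config.data_word_vec.
data_word_vec = np.random.rand(10000, 50).astype(np.float32)

# Option 1: load the weights, then freeze them so gradients are not computed.
word_embedding = nn.Embedding(data_word_vec.shape[0], data_word_vec.shape[1])
word_embedding.weight.data.copy_(torch.from_numpy(data_word_vec))
word_embedding.weight.requires_grad = False  # excluded from gradient updates

# Option 2: equivalent one-liner; freeze=True is the default.
word_embedding = nn.Embedding.from_pretrained(
    torch.from_numpy(data_word_vec), freeze=True
)
```

With Option 1, the frozen weight can also be filtered out when building the optimizer, e.g. `optim.SGD(filter(lambda p: p.requires_grad, model.parameters()), lr=...)`. Since the current code does neither, the embeddings stay trainable, which may be deliberate fine-tuning rather than a bug.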