pytorch-word2vec

Problem with initializing the embedding matrix

yl-jiang opened this issue on Jul 12, 2018 · 0 comments

In your code, you build the embedding matrix with the parameter padding_idx=0, but in the next step you initialize the embedding matrix's weights. That means padding_idx=0 does nothing, right?

```python
emb = torch.nn.Embedding(10, 3, 0)
input = torch.LongTensor([0, 2, 0, 5])
emb(input)
Out[6]:
tensor([[ 0.0000,  0.0000,  0.0000],
        [ 2.4883, -0.7663, -0.5142],
        [ 0.0000,  0.0000,  0.0000],
        [-0.3542,  1.0734,  0.2838]])

emb.weight.data.uniform_(-10, 10)
emb(input)
Out[7]:
tensor([[ 2.2589,  3.7151,  1.5739],
        [-1.8669, -5.0047,  5.4283],
        [ 2.2589,  3.7151,  1.5739],
        [-4.6681, -2.5092,  7.6047]])
```
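For context, padding_idx=0 is not a complete no-op here: PyTorch still zeroes the gradient for that row, so the padding vector is never updated during training. What the blanket uniform_ call does destroy is the zero initialization of row 0, which then stays frozen at random values. A minimal sketch of a workaround (my own, not from this repository) is to re-zero the padding row right after the custom initialization:

```python
import torch

# Build the embedding as in the issue: padding_idx=0 initializes row 0 to zeros
# and blocks gradient updates to that row during training.
emb = torch.nn.Embedding(10, 3, padding_idx=0)

# A blanket re-initialization overwrites every row, including the padding row.
emb.weight.data.uniform_(-10, 10)

# Restore the zero vector at the padding index after the custom init.
with torch.no_grad():
    emb.weight[emb.padding_idx].fill_(0)

idx = torch.LongTensor([0, 2, 0, 5])
print(emb(idx))  # positions that look up index 0 are zero vectors again
```

With this, the padding row is zero again and remains zero, since gradient updates to the padding_idx row are suppressed by the Embedding layer itself.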
