vrd-dsr
Question about params_emb
Hi, I found that the code to load params_emb is:

```python
# Initialize object embeddings with word2vec
with open('../data/vrd/params_emb.pkl') as f:
    emb_init = cPickle.load(f)
net.state_dict()['emb.weight'][1::].copy_(torch.from_numpy(emb_init))
```
But why use `[1::]`?
Index 0 represents the background class.
You can also just use `net.state_dict()['emb.weight'].copy_(torch.from_numpy(emb_init))`.
Thank you! Actually, I get an error when running that code: `RuntimeError: The expanded size of the tensor (99) must match the existing size (100) at non-singleton dimension 0`.
Also, what does "0 represents background" mean? Is there a paper or blog post that explains it?
Background is not included in this project. You can just ignore it.
OK, thank you!
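For anyone hitting the same size mismatch: the point of `[1:]` is that the embedding table has one extra row at index 0 reserved for background, while the pretrained word2vec matrix only covers the real object classes. Below is a minimal sketch reproducing the shapes from the error message (99 vs. 100); the class count, embedding dimension, and the random stand-in for `emb_init` are assumptions for illustration, not values from the repo.

```python
import numpy as np
import torch
import torch.nn as nn

NUM_CLASSES = 99  # object classes covered by word2vec (matches the 99 in the error)
EMB_DIM = 300     # assumed embedding dimension for this sketch

# The embedding table has one extra row at index 0, reserved for background.
emb = nn.Embedding(NUM_CLASSES + 1, EMB_DIM)

# Stand-in for the pretrained word2vec matrix: one row per real class only.
emb_init = np.random.randn(NUM_CLASSES, EMB_DIM).astype(np.float32)

# Copying into the full table fails: 99 rows cannot fill 100 slots.
try:
    emb.weight.data.copy_(torch.from_numpy(emb_init))
except RuntimeError as e:
    print('full copy fails:', e)

# Skipping row 0 aligns the shapes: rows 1..100 receive the 99 class vectors,
# and the background row at index 0 keeps its default initialization.
emb.weight.data[1:].copy_(torch.from_numpy(emb_init))
```

So the full-table `copy_` only works if `emb_init` has exactly as many rows as `emb.weight`; with a background row in the table, the `[1:]` slice is what makes the shapes line up.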