
Question about params_emb

Open fangfchen opened this issue 6 years ago • 4 comments

Hi, I found that the code to load `params_emb` is:

```python
# Initial object embedding with word2vec
with open('../data/vrd/params_emb.pkl') as f:
    emb_init = cPickle.load(f)
net.state_dict()['emb.weight'][1::].copy_(torch.from_numpy(emb_init))
```

But why use `[1::]`?

fangfchen avatar Dec 10 '18 13:12 fangfchen

Index 0 represents the background class. You can just use `net.state_dict()['emb.weight'].copy_(torch.from_numpy(emb_init))` instead.

GriffinLiang avatar Dec 10 '18 13:12 GriffinLiang
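A minimal sketch of what the slice does, using NumPy arrays as stand-ins for the PyTorch tensors and toy sizes (the real class count and embedding dimension in the repo may differ):

```python
import numpy as np

# Suppose the embedding table has num_classes + 1 rows: row 0 is reserved
# for the background class, rows 1..num_classes hold the real objects.
num_classes, dim = 4, 3
emb_weight = np.zeros((num_classes + 1, dim))  # stand-in for nn.Embedding weight

# Pretrained word2vec vectors, one row per real object class (toy values):
emb_init = np.arange(num_classes * dim, dtype=float).reshape(num_classes, dim)

# Copy the pretrained vectors into rows 1..num_classes,
# leaving row 0 (background) untouched:
emb_weight[1:] = emb_init

# If the table instead has exactly num_classes rows (no background slot),
# the [1:] slice would cause a shape mismatch, and a plain copy is correct:
emb_weight_no_bg = np.zeros((num_classes, dim))
emb_weight_no_bg[:] = emb_init
```

This also explains the size error: copying a `(num_classes, dim)` array into the `[1:]` slice of a table that only has `num_classes` rows leaves a `(num_classes - 1)`-row target, so the shapes cannot match.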

Thank you! Actually, I got this error when running the code: `RuntimeError: The expanded size of the tensor (99) must match the existing size (100) at non-singleton dimension 0`

Also, what does "0 represents background" mean? Is there a paper or blog post where I can read about it?

fangfchen avatar Dec 10 '18 13:12 fangfchen

Background is not included in this project. You can just ignore it.

GriffinLiang avatar Dec 10 '18 13:12 GriffinLiang

OK, thank you!

fangfchen avatar Dec 10 '18 13:12 fangfchen