DeepNLP-models-Pytorch

about the negative example loss in the Skip-gram-Negative-Sampling algorithm

xiaopengguo opened this issue 4 years ago · 0 comments

I have learned a lot from this elegant project. Thanks a lot! Based on the equation in the Skip-gram-Negative-Sampling objective below,

log σ(u_oᵀ v_c) + Σ_{k=1}^{K} log σ(−u_kᵀ v_c)

I think the negative example loss calculated by

```
negative_score = torch.sum(neg_embeds.bmm(center_embeds.transpose(1, 2)).squeeze(2), 1).view(negs.size(0), -1)  # BxK -> Bx1
loss = self.logsigmoid(positive_score) + self.logsigmoid(negative_score)
```

could perhaps be changed to

```
negative_score = neg_embeds.bmm(center_embeds.transpose(1, 2))
loss = self.logsigmoid(positive_score) + torch.sum(self.logsigmoid(negative_score), 1)
```

since, according to the equation, each negative score first goes through a logsigmoid operation, and the results are then summed.
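A minimal, self-contained sketch of the difference between the two computations, using hypothetical shapes (batch `B`, `K` negatives per example, embedding dimension `D`; the tensor names mirror those in the snippets above, but the values here are random placeholders, not the project's actual model):

```python
import torch
import torch.nn as nn

B, K, D = 4, 5, 8  # hypothetical batch size, negatives per example, embedding dim
logsigmoid = nn.LogSigmoid()

torch.manual_seed(0)
center_embeds = torch.randn(B, 1, D)  # center-word embeddings
pos_embeds = torch.randn(B, 1, D)     # positive context embeddings
neg_embeds = torch.randn(B, K, D)     # K sampled negative embeddings per example

# Positive term: log sigma(u_o^T v_c), shape B x 1.
positive_score = pos_embeds.bmm(center_embeds.transpose(1, 2)).squeeze(2)

# Current code: sum the K dot products first, then apply logsigmoid once.
negative_score_sum = torch.sum(
    neg_embeds.bmm(center_embeds.transpose(1, 2)).squeeze(2), 1
).view(B, -1)  # B x 1
loss_current = logsigmoid(positive_score) + logsigmoid(negative_score_sum)

# Proposed fix: apply logsigmoid to each of the K scores, then sum,
# matching the per-negative sum of log-sigmoids in the objective.
negative_score = neg_embeds.bmm(center_embeds.transpose(1, 2)).squeeze(2)  # B x K
loss_proposed = logsigmoid(positive_score) + torch.sum(
    logsigmoid(negative_score), 1, keepdim=True
)

print(loss_current.shape, loss_proposed.shape)  # both B x 1
```

Both versions produce a `B x 1` tensor, so the shapes hide the discrepancy; the values differ because `logsigmoid(sum of scores)` is generally not equal to `sum of logsigmoid(score)`.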

xiaopengguo, Apr 25 '21 14:04