HCCF
The code seems inconsistent with the description in the paper.
1. In the HGNN layer, LeakyReLU does not appear to be applied, even though the paper says an activation function is used here.
```python
class HGNNLayer(nn.Module):
    def __init__(self):
        super(HGNNLayer, self).__init__()
        self.act = nn.LeakyReLU(negative_slope=args.leaky)

    def forward(self, adj, embeds):
        # lat = self.act(adj.T @ embeds)
        # ret = self.act(adj @ lat)
        lat = adj.T @ embeds
        ret = adj @ lat
        return ret
```
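For what it's worth, with the activations commented out the two propagation steps collapse into a single linear map, `adj @ adj.T @ embeds`, whereas applying LeakyReLU between them does not. A minimal sketch in plain Python with hypothetical 2x2 matrices (unrelated to the repo's actual tensors; slope 0.5 chosen arbitrarily):

```python
def matmul(A, B):
    # naive dense matrix multiply for small lists-of-lists
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def leaky_relu(M, slope=0.5):
    # elementwise LeakyReLU
    return [[x if x > 0 else slope * x for x in row] for row in M]

def transpose(M):
    return [list(row) for row in zip(*M)]

adj = [[1.0, -1.0], [0.0, 1.0]]  # toy adjacency (hypothetical values)
embeds = [[2.0], [1.0]]          # toy embeddings

# current code path: purely linear, equals (adj @ adj.T) @ embeds
lat = matmul(transpose(adj), embeds)
linear = matmul(adj, lat)

# commented-out (paper) path: activation after each propagation
nonlinear = leaky_relu(matmul(adj, leaky_relu(lat)))

print(linear, nonlinear)  # → [[3.0], [-1.0]] [[2.5], [-0.25]]
```

The two paths only coincide when the intermediate values are all nonnegative (LeakyReLU is then the identity), so the missing activation is a genuine behavioral difference, not just a cosmetic one.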
2. For the BPR loss, the formula in the paper seems to correspond to the commented-out line, while the uncommented line computes a different form of loss that I could not find in the paper:
```python
scoreDiff = pairPredict(ancEmbeds, posEmbeds, negEmbeds)
bprLoss = - (scoreDiff).sigmoid().log().mean()
# bprLoss = t.maximum(t.zeros_like(scoreDiff), 1 - scoreDiff).mean() * 40
```
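To make the difference between the two lines concrete: the active line is the standard log-sigmoid BPR loss, while the commented-out line looks like a hinge (margin) loss with margin 1, scaled by 40. A sketch on a scalar score difference in plain Python (hypothetical inputs, not the repo's tensors):

```python
import math

def bpr_loss(score_diff):
    # active line: -log(sigmoid(s_pos - s_neg))
    return -math.log(1.0 / (1.0 + math.exp(-score_diff)))

def hinge_loss(score_diff, margin=1.0, scale=40.0):
    # commented-out line: max(0, margin - (s_pos - s_neg)) * scale
    return max(0.0, margin - score_diff) * scale

for d in (-1.0, 0.0, 2.0):
    print(d, bpr_loss(d), hinge_loss(d))
```

Both losses shrink as the positive item outranks the negative one, but the hinge variant becomes exactly zero once the margin is met, whereas the log-sigmoid loss keeps a small gradient for all score differences.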
3. What is the purpose of using trnMask in the final prediction? The paper does not seem to mention it. Could the embeddings simply be multiplied directly?
```python
for usr, trnMask in tstLoader:
    i += 1
    usr = usr.long().cuda()
    trnMask = trnMask.cuda()
    usrEmbeds, itmEmbeds = self.model.predict(self.handler.torchBiAdj)
    allPreds = t.mm(usrEmbeds[usr], t.transpose(itmEmbeds, 1, 0)) * (1 - trnMask) - trnMask * 1e8
    _, topLocs = t.topk(allPreds, args.topk)
    recall, ndcg = self.calcRes(topLocs.cpu().numpy(), self.handler.tstLoader.dataset.tstLocs, usr)
    epRecall += recall
    epNdcg += ndcg
    log('Steps %d/%d: recall = %.2f, ndcg = %.2f' % (i, steps, recall, ndcg), save=False, oneline=True)
```
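On question 3: multiplying the embeddings alone would let items the user already interacted with in the training set dominate the top-k list, so the `trnMask` term pushes their scores down to -1e8 and evaluation only ranks unseen items. This is a common convention in recommender evaluation rather than something specific to this paper. A minimal plain-Python sketch with hypothetical scores for a single user:

```python
preds = [0.9, 0.2, 0.8, 0.1]  # hypothetical user-item scores (embedding dot products)
trn_mask = [1, 0, 0, 0]       # item 0 was already seen during training

# same arithmetic as allPreds in the evaluation loop, applied per item
masked = [p * (1 - m) - m * 1e8 for p, m in zip(preds, trn_mask)]

# top-2 item indices by masked score
topk = sorted(range(len(masked)), key=lambda i: masked[i], reverse=True)[:2]
print(topk)  # → [2, 1]: item 0 is excluded despite the highest raw score
```

Without the mask, item 0 would occupy a top-k slot that the test-set metrics (recall/NDCG over held-out items) can never credit, which would systematically understate both numbers.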