
Result is worse than the pre-embedding result

Open 55TFSI opened this issue 4 years ago • 3 comments

Sorry to bother you. I ran the RKGE code, but I found that because the ratio of positive to negative samples is unbalanced (1 : 0.05) in your code, the model's loss is very low. If I change the ratio to 1 : 1, the loss rises above 0.4 (per pair).
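A minimal sketch of the rebalancing being described: downsampling negatives so positives and negatives appear at a 1 : 1 ratio. The function and variable names here are illustrative assumptions, not identifiers from the RKGE code.

```python
import random

def rebalance(positives, negatives, ratio=1.0, seed=0):
    """Sample roughly len(positives) * ratio negatives so classes are balanced."""
    rng = random.Random(seed)
    k = min(len(negatives), int(len(positives) * ratio))
    return positives, rng.sample(negatives, k)

# Toy (user, item, label) triples, purely for illustration.
pos = [(u, i, 1) for u, i in [(0, 1), (0, 2), (1, 3), (2, 4)]]
neg = [(u, i, 0) for u, i in [(0, 5), (1, 6), (2, 7), (0, 8), (1, 9), (2, 10)]]
p, n = rebalance(pos, neg)
print(len(p), len(n))  # 4 4
```

With a 1 : 0.05 ratio the loss is dominated by easy positives, which is consistent with the very low loss reported above.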

And the model's Precision@1, Precision@5, Precision@10, and MRR are 0.151, 0.108, 0.095, and 0.334 respectively, while the pre-embedding achieves 0.185, 0.122, 0.128, and 0.4211.
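For reference, this is how Precision@K and MRR are conventionally computed from a ranked recommendation list; the variable names are illustrative, not taken from the RKGE evaluation code.

```python
def precision_at_k(ranked, relevant, k):
    """Fraction of the top-k ranked items that are relevant."""
    hits = sum(1 for item in ranked[:k] if item in relevant)
    return hits / k

def mrr(ranked, relevant):
    """Reciprocal rank of the first relevant item (0 if none appear)."""
    for rank, item in enumerate(ranked, start=1):
        if item in relevant:
            return 1.0 / rank
    return 0.0

ranked = [3, 7, 1, 9, 4]   # items ordered by model score for one user
relevant = {7, 4}          # ground-truth items for that user
print(precision_at_k(ranked, relevant, 5))  # 0.4
print(mrr(ranked, relevant))                # 0.5
```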

After training RKGE, the results are even worse than the pre-embedding's. Could you kindly share the complete version of your code?

55TFSI avatar Nov 18 '20 16:11 55TFSI

I met the same problem. The paper does not explain how the pre_embedding is obtained. After loading the pre_embedding, the performance gets worse and worse as training runs for more epochs.

love-searcher avatar Sep 30 '22 02:09 love-searcher

Have you solved it yet?

love-searcher avatar Sep 30 '22 02:09 love-searcher

It's normal to run into problems; I think this code itself is flawed. Training clearly learns a score for each path, so at evaluation time it should likewise score each path and rank recommendations by those scores. But in this code, evaluation instead scores recommendations directly by the inner product of the user embedding and the item embedding. I don't understand why.
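For clarity, this is a sketch of the inner-product scoring the commenter is describing: every item is ranked for a user by the dot product of the user and item embedding vectors, with no path scores involved. Shapes and names are assumptions for illustration, not the repo's actual tensors.

```python
import numpy as np

rng = np.random.default_rng(0)
user_emb = rng.normal(size=(3, 8))   # 3 users, embedding dim 8
item_emb = rng.normal(size=(5, 8))   # 5 items, embedding dim 8

# (3, 5) matrix: scores[u, i] = <user_u, item_i>
scores = user_emb @ item_emb.T

# Items ranked per user by descending score.
top_items = np.argsort(-scores, axis=1)
print(top_items.shape)  # (3, 5)
```

If only the embeddings (and not the learned path scorer) enter this ranking, the RNN's path training can fail to translate into better recommendations, which may explain the degradation reported above.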

Suasy avatar Mar 17 '23 06:03 Suasy