Results 54 comments of Liang Wang

Not sure what you are asking... Re-ranking has nothing to do with the contrastive loss.

Yes, the re-ranking in the paper is only a post-processing step; it has no effect on the model itself and no gradient is back-propagated through it.

Have you made any code changes? And what are your versions of `transformers` and `pytorch`? I cannot reproduce your issue.

`--pre-batch 0` means it does not use pre-batch negatives. Remove `--use-self-negative` if you do not want to use self-negatives. In-batch negatives are always used since they are almost free...
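
For intuition, here is a minimal sketch of why in-batch negatives are essentially free (the function name and the temperature value are illustrative, not the repo's exact code): every other tail embedding in the batch doubles as a negative, so the negatives come out of a single matrix multiply.

```python
import torch
import torch.nn.functional as F

def info_nce_in_batch(hr_emb: torch.Tensor, tail_emb: torch.Tensor,
                      temperature: float = 0.05) -> torch.Tensor:
    """InfoNCE with in-batch negatives: row i's positive is tail i,
    every other tail in the same batch acts as a negative."""
    hr_emb = F.normalize(hr_emb, dim=-1)
    tail_emb = F.normalize(tail_emb, dim=-1)
    logits = hr_emb @ tail_emb.t() / temperature      # (batch, batch) scaled cosine similarities
    labels = torch.arange(logits.size(0), device=logits.device)  # diagonal entries are the positives
    return F.cross_entropy(logits, labels)
```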

Batch size is the hyperparameter with the biggest impact on the results; with a small batch size, the effectiveness of contrastive learning drops.

I am afraid so; you will need to find a way to get some GPUs for model training.

The logits are cosine similarities scaled by temperature, so you'll see values larger than 1.
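
To make the scaling concrete (the temperature of 0.05 below is an assumed typical value, not necessarily the one you are running with):

```python
cos_sim = 0.8          # cosine similarity is bounded in [-1, 1]
temperature = 0.05     # assumed typical value
logit = cos_sim / temperature
print(logit)           # 16.0 -- well above 1 after scaling
```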

1. We map each relation id to its string form; please check out `preprocess.py` for details. 2. Yes, the inverse relations are simply handled by adding an "inverse" prefix, BERT will...
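
A rough sketch of what the verbalization could look like (the function name and the exact string handling are illustrative; check `preprocess.py` for the real mapping):

```python
def relation_to_text(relation: str, inverse: bool = False) -> str:
    """Turn a relation id into plain text; an inverse triple just gets
    an 'inverse' prefix and BERT is left to learn the rest."""
    text = relation.replace('_', ' ').replace('/', ' ').strip()
    return f"inverse {text}" if inverse else text

# relation_to_text("/people/person/place_of_birth", inverse=True)
# -> "inverse people person place of birth"
```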

As said in the paper:

```
During training, there may exist some false negatives. For example, the correct entity happens to appear in another triple within the same batch. We...
```
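
A minimal sketch of how such false negatives can be masked out of the in-batch logits (this is an assumed implementation for illustration, with a hypothetical `mask_false_negatives` helper, not the repo's exact code):

```python
import torch

def mask_false_negatives(logits: torch.Tensor, tail_ids: torch.Tensor) -> torch.Tensor:
    """logits: (batch, batch) scores; tail_ids: (batch,) gold tail entity ids.
    If example j's tail equals example i's gold tail (i != j), cell (i, j)
    is a false negative, so push its logit to -inf before the softmax."""
    same_tail = tail_ids.unsqueeze(0) == tail_ids.unsqueeze(1)          # (batch, batch)
    diag = torch.eye(len(tail_ids), dtype=torch.bool, device=logits.device)
    return logits.masked_fill(same_tail & ~diag, float('-inf'))
```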