SimKGC
Negative sampling
I want to ask a question that is simple but important to me: as a novice, I cannot accurately locate the three negative sampling parts in the code. Could you help me point them out? I really need this answer and would greatly appreciate a reply.
The in-batch negatives simply use samples from the same batch: https://github.com/intfloat/SimKGC/blob/97cc43e488f19ca5b0f6fbf60ffefd2ee56c0693/models.py#L91
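Roughly, the idea looks like this (a simplified sketch, not the exact code in models.py):

```python
import torch
import torch.nn.functional as F

# Minimal sketch of in-batch negatives: every other tail embedding in the
# same batch serves as a negative for a given (head, relation) pair.
batch_size, dim = 4, 8
hr_vector = F.normalize(torch.randn(batch_size, dim), dim=1)
tail_vector = F.normalize(torch.randn(batch_size, dim), dim=1)

# (batch_size, batch_size) score matrix: the diagonal holds the positives,
# every off-diagonal entry is an in-batch negative.
logits = hr_vector @ tail_vector.t()
labels = torch.arange(batch_size)  # the i-th row's positive is the i-th tail
loss = F.cross_entropy(logits, labels)
```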
The pre-batch negatives use cached negatives from previous batches: https://github.com/intfloat/SimKGC/blob/97cc43e488f19ca5b0f6fbf60ffefd2ee56c0693/models.py#L117-L133
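Conceptually it works like the following sketch; the names (`pre_batch_queue`, `add_pre_batch_logits`) are illustrative only, not the actual identifiers in the repo:

```python
import torch

# Hypothetical sketch of pre-batch negatives: tail embeddings from the last
# few batches are cached without gradients and appended as extra negative
# candidates for the current batch.
pre_batch_queue = []   # holds (batch_size, dim) tensors from earlier steps
num_pre_batch = 2      # how many previous batches to keep

def add_pre_batch_logits(hr_vector, tail_vector, logits):
    # extra columns scored against cached tails from previous batches
    if pre_batch_queue:
        cached = torch.cat(pre_batch_queue, dim=0)   # (n_cached, dim)
        logits = torch.cat([logits, hr_vector @ cached.t()], dim=1)
    # cache the current tails (detached, so no gradient flows into old batches)
    pre_batch_queue.append(tail_vector.detach())
    if len(pre_batch_queue) > num_pre_batch:
        pre_batch_queue.pop(0)
    return logits
```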
The self-negatives use the head entities of each example: https://github.com/intfloat/SimKGC/blob/97cc43e488f19ca5b0f6fbf60ffefd2ee56c0693/models.py#L104-L109
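In simplified form (a paraphrase of the idea, not the exact repo code):

```python
import torch
import torch.nn.functional as F

# Simplified sketch of self-negatives: each (head, relation) pair is also
# scored against its own head embedding, and that score is appended as one
# extra negative column.
batch_size, dim = 4, 8
hr_vector = F.normalize(torch.randn(batch_size, dim), dim=1)
tail_vector = F.normalize(torch.randn(batch_size, dim), dim=1)
head_vector = F.normalize(torch.randn(batch_size, dim), dim=1)

logits = hr_vector @ tail_vector.t()  # in-batch scores, (batch, batch)
# one extra logit per example: similarity between the (h, r) encoding and the
# head entity itself; penalizing it discourages predicting tail == head
self_neg_logits = torch.sum(hr_vector * head_vector, dim=1, keepdim=True)
logits = torch.cat([logits, self_neg_logits], dim=1)  # (batch, batch + 1)
```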
Thank you for your detailed answer. Are all the parameters used in the self-negative sampling method fixed? If I want to use this method in other frameworks, do the parameters used here have to stay the same?
Not sure what you are referring to... Self-negative sampling does not add any new parameters.
Sorry, I didn't make it clear, and I couldn't reply in a timely manner due to some issues.
I mean the parameters in the self-negative sampling at lines 104-109 of models.py, such as head_vector, self_neg_logits, ..., and logits. Are these all necessary?
In addition, I found that the compute_logits function, which contains the negative sampling, is only called twice in trainer.py (lines 114/154), by the eval_epoch and train_epoch functions respectively. I don't quite understand this part. Could you explain it? Thank you, and I look forward to your reply.
May I know why you set -1e4 for masked_fill?
Any sufficiently negative number will push the masked probability to essentially 0 after the softmax.
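For example (a toy illustration, not from the repo):

```python
import torch

# A logit of -1e4 is so negative that its softmax probability underflows to
# (effectively) zero, so the masked position cannot compete with real candidates.
logits = torch.tensor([2.0, 1.0, -1e4])
print(torch.softmax(logits, dim=0))  # tensor([0.7311, 0.2689, 0.0000])
```

One practical reason to prefer -1e4 over -inf or -1e9 is presumably that -1e4 still fits comfortably in float16, so the same mask works under mixed-precision training.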