generative-recommenders
confused about "SampledSoftmaxLoss" func
Hey, congratulations on your excellent and creative work! While reading the implementation code, I got very confused about SampledSoftmaxLoss. I have a few questions:
- why do we use "supervision_ids" to calculate "positive_logits"?
- why do we use "InBatchNegativesSampler" to randomly sample negatives and calculate "negative_logits"?
- what does the "self._model.interaction" do?
- for the jagged loss, why do we first concatenate along dim 1, then compute log_softmax along dim 1, and finally take the entry at index 0 of that dim?
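To make the last question concrete, here is a framework-agnostic NumPy sketch of how I currently understand the computation (the function name and shapes are my own guesses, not taken from the repo):

```python
import numpy as np

def sampled_softmax_loss(positive_logits, negative_logits):
    """My understanding (hypothetical sketch, not the repo's code):
    concatenate the positive logit (B, 1) with the sampled negative
    logits (B, K) along dim 1, take log_softmax along dim 1, then keep
    column 0, which is the positive item's log-probability."""
    # (B, 1) + (B, K) -> (B, 1 + K)
    logits = np.concatenate([positive_logits, negative_logits], axis=1)
    # numerically stable log_softmax along dim 1
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # column 0 is the positive's log-probability; negate for the loss
    return -log_probs[:, 0]  # per-example loss, shape (B,)

# toy example: 2 examples, 3 in-batch negatives each
pos = np.array([[2.0], [0.5]])
neg = np.array([[0.1, -0.3, 0.2], [1.0, 0.0, -1.0]])
loss = sampled_softmax_loss(pos, neg)
```

Is this roughly what the concat / log_softmax / index-0 sequence is doing, i.e. index 0 along dim 1 is always the positive because of the concatenation order?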
Please give me some advice when you have time. Thanks!