
Problem with the loss function computation

Open speeding-motor opened this issue 6 years ago • 1 comments

Hi, when I was reading your code, this part of the ListNet loss function confused me:

def get_loss(self, x_t, y_t):
        # ---- start loss calculation ----
        ...
        p_true = F.softmax(F.reshape(y_t, (y_t.shape[0], y_t.shape[1])))
        xm = F.max(pred, axis=1, keepdims=True)
        logsumexp = F.logsumexp(pred, axis=1)

        logsumexp = F.broadcast_to(logsumexp, (xm.shape[0], pred.shape[1]))
        loss = -1 * F.sum(p_true * (pred - logsumexp))
        ...

Here you use p_true * (pred - logsumexp). Why do it like this? You don't exactly follow the paper's loss function, right?

speeding-motor — Aug 07 '19 13:08


But why do you calculate the loss function this way? Does it work well?

speeding-motor — Aug 07 '19 13:08
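For context on the question: the quoted code is a standard numerically stable way to compute the ListNet top-1 cross-entropy, using the identity log softmax(x) = x - logsumexp(x), so it agrees with the paper's loss even though it never calls softmax on the predictions. A minimal NumPy sketch (function names and shapes are my own illustration, not from the repo) comparing the stable form against the literal formula from the paper:

```python
import numpy as np

def listnet_loss_stable(pred, y_true):
    """ListNet top-1 cross-entropy via log softmax(x) = x - logsumexp(x)."""
    # target top-1 probabilities: softmax over the relevance scores
    y = y_true - y_true.max(axis=1, keepdims=True)  # shift for stability
    p_true = np.exp(y) / np.exp(y).sum(axis=1, keepdims=True)
    # stable logsumexp of the predictions (subtract the row max first)
    xm = pred.max(axis=1, keepdims=True)
    logsumexp = np.log(np.exp(pred - xm).sum(axis=1, keepdims=True)) + xm
    # pred - logsumexp is exactly log softmax(pred)
    return -(p_true * (pred - logsumexp)).sum()

def listnet_loss_naive(pred, y_true):
    """Literal form from the paper: -sum p_true * log softmax(pred)."""
    p_true = np.exp(y_true) / np.exp(y_true).sum(axis=1, keepdims=True)
    p_pred = np.exp(pred) / np.exp(pred).sum(axis=1, keepdims=True)
    return -(p_true * np.log(p_pred)).sum()

rng = np.random.default_rng(0)
pred = rng.normal(size=(2, 5))    # predicted scores per query-document list
y = rng.normal(size=(2, 5))       # ground-truth relevance scores
print(np.isclose(listnet_loss_stable(pred, y), listnet_loss_naive(pred, y)))
```

The two functions agree to floating-point precision on moderate scores; the stable form additionally avoids overflow when the scores are large, which is presumably why the repo computes it that way.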