
Why is the SAF loss negative? (FreeMatch)


    # calculate entropy loss
    if mask.sum() > 0:
        ent_loss, _ = entropy_loss(mask, logits_x_ulb_s, self.p_model, self.label_hist)
    else:
        ent_loss = 0.0
    # ent_loss = 0.0
    total_loss = sup_loss + self.lambda_u * unsup_loss + self.lambda_e * ent_loss

    out_dict = self.process_out_dict(loss=total_loss, feat=feat_dict)
    log_dict = self.process_log_dict(sup_loss=sup_loss.item(),
                                     unsup_loss=unsup_loss.item(),
                                     ent_loss=ent_loss.item(),
                                     total_loss=total_loss.item(),
                                     util_ratio=mask.float().mean().item())
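
For context, below is a minimal, self-contained sketch of how a FreeMatch-style self-adaptive fairness (SAF) term is typically formed; it is an illustration under assumptions, not the repository's actual entropy_loss implementation. The function name saf_loss_sketch and the normalization details are hypothetical, while mask, logits_x_ulb_s, p_model, and label_hist mirror the arguments in the snippet above. The term has the shape of a negative cross-entropy between two normalized distributions, so negative printed values would be expected by construction.

    # Illustrative sketch only (assumption): a SAF-style fairness term of the form
    # sum_k p_k * log(q_k), i.e. the negative of a cross-entropy, hence <= 0.
    import torch

    def saf_loss_sketch(mask, logits_x_ulb_s, p_model, label_hist, eps=1e-12):
        """Hypothetical fairness term; returns a scalar tensor that is <= 0."""
        mask = mask.bool()
        if mask.sum() == 0:
            return logits_x_ulb_s.new_zeros(())

        probs_s = torch.softmax(logits_x_ulb_s[mask], dim=-1)  # strong-aug predictions
        pred_s = probs_s.argmax(dim=-1)

        # histogram of pseudo-labels in the current batch, normalized to sum to 1
        hist_s = torch.bincount(pred_s, minlength=probs_s.shape[1]).to(probs_s.dtype)
        hist_s = hist_s / hist_s.sum().clamp_min(eps)

        # modulate the running class distribution by the inverse running histogram
        mod_p_model = p_model / label_hist.clamp_min(eps)
        mod_p_model = mod_p_model / mod_p_model.sum().clamp_min(eps)

        # modulate the batch-mean prediction by the inverse batch histogram
        mod_mean_s = probs_s.mean(dim=0) / hist_s.clamp_min(eps)
        mod_mean_s = mod_mean_s / mod_mean_s.sum().clamp_min(eps)

        # negative cross-entropy: probabilities times log-probabilities sums to <= 0
        return (mod_p_model * torch.log(mod_mean_s + eps)).sum()

    if __name__ == "__main__":
        torch.manual_seed(0)
        logits = torch.randn(8, 10)
        print(saf_loss_sketch(torch.ones(8), logits,
                              torch.full((10,), 0.1), torch.full((10,), 0.1)))  # <= 0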

ent_loss: I printed this loss value and it is always negative.
[screenshot: printed ent_loss values, all negative]

yumudry · Sep 25 '24