
Loss is negative when applying MCR2 to my own scenario

Open adf1178 opened this issue 3 years ago • 4 comments

I tried to use this loss in my own CIFAR-100 long-tail scenario, but the loss is negative, meaning that the empirical compress loss is larger than the empirical discriminative loss. What is wrong?

adf1178 avatar May 28 '21 07:05 adf1178

Hi, I would make sure that Z is normalized before you put it through the loss function. Another possibility is that the number of classes is not set correctly. I would try those easy fixes first. In general, the R loss should be bigger than the R_c loss, and both losses should be positive, so the total loss should be positive as well. Good luck!

ryanchankh avatar May 30 '21 22:05 ryanchankh
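As a quick check of the first suggestion above, the normalization can be sketched as follows (the batch and feature shapes are made up for illustration, loosely matching the `--fd 128` setting discussed later in this thread):

```python
import numpy as np

# Hypothetical batch of backbone features: 1000 samples, 128 dims.
rng = np.random.default_rng(0)
Z = rng.standard_normal((1000, 128))

# MCR2 assumes features lie on the unit sphere, so normalize each
# row (one feature vector per sample) before passing Z to the loss.
Z = Z / np.linalg.norm(Z, axis=1, keepdims=True)

# Sanity check: every feature vector now has norm 1.
assert np.allclose(np.linalg.norm(Z, axis=1), 1.0)
```

In the PyTorch training code this would correspond to something like `F.normalize(features, dim=1)` applied to the network output before the loss.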

Could you please check whether `loss.item()` is negative when running `python3 train_sup.py --arch resnet18 --data cifar10 --fd 128 --epo 500 --bs 1000 --eps 0.5 --gam1 1 --gam2 1 --lr 0.01 --lcr 0.0`?

PyTorch version: 1.9.0+cu111

wetliu avatar Jul 18 '21 07:07 wetliu

I will try it when I have time. I am busy with my own experiments at the moment, and I will contact you once I have results. Thanks again for your reply.

adf1178 avatar Jul 18 '21 07:07 adf1178

> Could you please check whether `loss.item()` is negative when running `python3 train_sup.py --arch resnet18 --data cifar10 --fd 128 --epo 500 --bs 1000 --eps 0.5 --gam1 1 --gam2 1 --lr 0.01 --lcr 0.0`?
>
> PyTorch version: 1.9.0+cu111

The loss function is defined as `loss = compress_loss - discrimn_loss`, so a negative value is correct.

zhrli avatar Aug 05 '21 02:08 zhrli
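To see why the sign convention above yields negative values: the objective in the MCR2 paper is the rate reduction ΔR = R − R_c, which is nonnegative for any labeling (by concavity of the log-det rate function), so minimizing its negation, compress − discrimn, drives `loss.item()` at or below zero. A rough NumPy sketch of the two rate terms (eps and shapes are illustrative, not the repo's exact code):

```python
import numpy as np

def coding_rate(Z, eps=0.5):
    """R(Z): rate to encode the columns of Z up to distortion eps."""
    d, n = Z.shape
    return 0.5 * np.linalg.slogdet(
        np.eye(d) + (d / (n * eps**2)) * Z @ Z.T)[1]

def coding_rate_classwise(Z, labels, eps=0.5):
    """R_c(Z): class-conditional rate, weighted by class frequency."""
    d, n = Z.shape
    rc = 0.0
    for c in np.unique(labels):
        Zc = Z[:, labels == c]
        nc = Zc.shape[1]
        rc += (nc / (2 * n)) * np.linalg.slogdet(
            np.eye(d) + (d / (nc * eps**2)) * Zc @ Zc.T)[1]
    return rc

# Random unit-norm features (columns are samples) with 10 classes.
rng = np.random.default_rng(0)
Z = rng.standard_normal((16, 200))
Z = Z / np.linalg.norm(Z, axis=0, keepdims=True)
labels = rng.integers(0, 10, size=200)

R = coding_rate(Z)
Rc = coding_rate_classwise(Z, labels)
# Rate reduction R - R_c >= 0, so the training loss R_c - R is <= 0.
assert R - Rc >= 0
```

So a negative `loss.item()` is expected under this convention; the earlier advice about normalizing Z and setting the number of classes still matters for the individual R and R_c terms being meaningful.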