mcr2
Loss is negative when using MCR2 in my own scenario
I tried to use this loss in my own CIFAR-100 long-tail scenario, but the loss is negative, meaning that the empirical compression loss is larger than the empirical discrimination loss. What is wrong?
Hi, I would make sure that Z is normalized before you put it through the loss function. Another possibility is that the number of classes is not set correctly. I would try those easy fixes first. In general, the R loss should be bigger than the R_c loss, and both losses should be positive, so the total loss should be positive as well. Good luck!
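Roughly, what I mean by normalizing Z is something like the sketch below (`model` and `criterion` are placeholders for your backbone and the MCR2 loss object; the exact call signature may differ from this repo's train_sup.py):

```python
import torch.nn.functional as F

# minimal sketch: normalize features before computing the MCR2 loss
# `model` and `criterion` are placeholders, not the exact names in this repo
Z = model(x)                     # (batch, fd) feature vectors
Z = F.normalize(Z, p=2, dim=1)   # L2-normalize each feature onto the unit sphere
loss = criterion(Z, y)           # make sure the number of classes matches your dataset (100 for CIFAR-100)
```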
Could you please check whether loss.item() is negative when running
python3 train_sup.py --arch resnet18 --data cifar10 --fd 128 --epo 500 --bs 1000 --eps 0.5 --gam1 1 --gam2 1 --lr 0.01 --lcr 0.0
PyTorch version: 1.9.0+cu111
I will try it when I have time. I am busy with my own experiments, and I will contact you if I get results. Thanks again for your reply.
The loss function is defined as loss = compress_loss - discrimn_loss, so a negative value is correct: minimizing this loss maximizes the rate reduction ΔR = R - R_c, which should be positive for a well-trained model.
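To make the sign convention concrete, here is a rough self-contained sketch of the objective written from the MCR2 paper's definitions, not copied from this repo; variable names and the exact scaling of eps may differ from the actual implementation:

```python
import torch

def mcr2_loss(Z, labels, eps=0.5):
    """Sketch of the MCR^2 objective: loss = R_c(Z, Pi) - R(Z).

    Z      : (n, d) features, assumed L2-normalized row-wise.
    labels : (n,) integer class labels.
    eps    : distortion parameter (the --eps flag); exact scaling
             may differ from the repo's code.
    """
    n, d = Z.shape
    I = torch.eye(d, device=Z.device)

    # Expansion term R(Z): coding rate of all features pooled together.
    discrimn_loss = 0.5 * torch.logdet(I + (d / (n * eps)) * Z.T @ Z)

    # Compression term R_c(Z, Pi): weighted sum of per-class coding rates.
    compress_loss = Z.new_zeros(())
    for c in labels.unique():
        Zc = Z[labels == c]
        nc = Zc.shape[0]
        compress_loss = compress_loss + 0.5 * (nc / n) * torch.logdet(
            I + (d / (nc * eps)) * Zc.T @ Zc)

    # Minimizing (compress - discrimn) maximizes Delta R = R - R_c,
    # so the value printed during training is usually negative.
    return compress_loss - discrimn_loss
```

As training progresses and ΔR grows, the returned value becomes more negative, which is the expected behavior rather than a bug.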