pytorch-CycleGAN-and-pix2pix

Loss D goes to zero when adding a new loss to generator

Open John1231983 opened this issue 5 years ago • 20 comments

I have successfully run CycleGAN on my dataset. The D, G, cycle, and idt losses are all normal. However, when I add a new loss to CycleGAN, the discriminator loss quickly drops to 0 and the generator's results look terrible. It seems D easily distinguishes the generated outputs once its loss reaches 0. What is my problem, and how should I fix it? I do not think a D loss of zero is normal.

John1231983 avatar Apr 23 '19 04:04 John1231983

Maybe your newly added loss is too strong compared to existing CycleGAN losses.

junyanz avatar Apr 28 '19 21:04 junyanz

So, could you suggest any solution to this issue?

John1231983 avatar May 10 '19 01:05 John1231983

Maybe try reducing the weight for your new loss.

junyanz avatar May 10 '19 22:05 junyanz
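
A minimal sketch of what down-weighting an extra term could look like. The function, the weights, and the loss names (loss_new, lambda_new, etc.) are illustrative placeholders, not values or code from the repository:

```python
import torch

def generator_loss(loss_gan, loss_cycle, loss_idt, loss_new,
                   lambda_cycle=10.0, lambda_idt=5.0, lambda_new=0.1):
    """Combine CycleGAN-style generator losses with a down-weighted extra term.

    lambda_new is the knob discussed here: if the discriminator loss collapses
    to zero after adding the new term, try shrinking it (e.g. 0.01) first.
    """
    return (loss_gan
            + lambda_cycle * loss_cycle
            + lambda_idt * loss_idt
            + lambda_new * loss_new)

# Toy usage with dummy scalar tensors:
print(generator_loss(torch.tensor(0.7), torch.tensor(1.2),
                     torch.tensor(0.4), torch.tensor(3.0)))
```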

@junyanz: Thanks. I have removed the new loss, but the issue remains. I guess CycleGAN itself has this problem when I use it on the new dataset. For traditional GANs, people say that a D loss of zero means the model has failed (tip 10). Do you know how to handle this issue?

John1231983 avatar May 19 '19 16:05 John1231983

Maybe try making the discriminator weaker or the generator stronger.

junyanz avatar May 20 '19 05:05 junyanz

I'm also facing the same issue. I thought about increasing the number of times the generator is updated relative to the discriminator, or decreasing the discriminator's learning rate. In what other ways can the generator be made stronger / the discriminator be made weaker? @junyanz

vrao9 avatar May 22 '19 13:05 vrao9
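
One illustrative way to implement the "more G steps per D step" idea. Everything below (the one-layer "networks", random data, and the d_every value) is a toy stand-in; only the scheduling logic reflects the suggestion in this thread:

```python
import torch

# Toy sketch of an update schedule in which G gets more steps than D.
netG, netD = torch.nn.Linear(8, 8), torch.nn.Linear(8, 1)
opt_G = torch.optim.Adam(netG.parameters(), lr=2e-4)
opt_D = torch.optim.Adam(netD.parameters(), lr=2e-4)
bce = torch.nn.BCEWithLogitsLoss()
d_every = 3  # update D once for every 3 G updates

for i in range(30):
    real = torch.randn(4, 8)
    fake = netG(torch.randn(4, 8))

    # G step: runs on every iteration.
    opt_G.zero_grad()
    bce(netD(fake), torch.ones(4, 1)).backward()
    opt_G.step()

    # D step: runs only every d_every iterations, so D learns more slowly.
    if i % d_every == 0:
        opt_D.zero_grad()
        loss_D = 0.5 * (bce(netD(real), torch.ones(4, 1)) +
                        bce(netD(fake.detach()), torch.zeros(4, 1)))
        loss_D.backward()
        opt_D.step()
```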

You can set different learning rates for G and D. See this paper for more details.

junyanz avatar May 22 '19 17:05 junyanz
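
A minimal sketch of giving G and D separate learning rates. The particular values (a slower D, in line with the goal in this thread of weakening the discriminator) and the toy modules are assumptions, not the repository's defaults:

```python
import torch

netG, netD = torch.nn.Linear(8, 8), torch.nn.Linear(8, 1)  # toy stand-ins

# Separate optimizers make the two learning rates independent knobs.
optimizer_G = torch.optim.Adam(netG.parameters(), lr=2e-4, betas=(0.5, 0.999))
optimizer_D = torch.optim.Adam(netD.parameters(), lr=5e-5, betas=(0.5, 0.999))  # slower D
```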

Thank you

vrao9 avatar May 23 '19 09:05 vrao9

@vrao9: Did you find a good solution to this issue? I found that reducing the number of layers in D solves it. I did not test your suggested solutions: reducing the learning rate or training G more often.

John1231983 avatar May 26 '19 05:05 John1231983
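
For reference, the repository exposes the discriminator depth as training options (something like --netD n_layers --n_layers_D 2, if I recall the flag names correctly). Below is a stand-alone PatchGAN-style sketch with a configurable number of layers, just to show what shrinking D means; it is not the repository's implementation:

```python
import torch
import torch.nn as nn

def patch_discriminator(input_nc=3, ndf=64, n_layers=2):
    """PatchGAN-style discriminator sketch; fewer n_layers means a weaker D."""
    layers = [nn.Conv2d(input_nc, ndf, 4, stride=2, padding=1),
              nn.LeakyReLU(0.2, True)]
    ch = ndf
    for _ in range(n_layers - 1):
        out_ch = min(ch * 2, ndf * 8)
        layers += [nn.Conv2d(ch, out_ch, 4, stride=2, padding=1),
                   nn.InstanceNorm2d(out_ch),
                   nn.LeakyReLU(0.2, True)]
        ch = out_ch
    layers += [nn.Conv2d(ch, 1, 4, stride=1, padding=1)]  # 1-channel patch output
    return nn.Sequential(*layers)

netD = patch_discriminator(n_layers=2)  # shallower than the usual 3 layers
print(netD(torch.randn(1, 3, 256, 256)).shape)
```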

@John1231983 Good to know! I tried two things:

  1. Reduce the learning rate: this did help solve the problem.
  2. Update the discriminator only when its prediction accuracy on fake images is below 50%: this rule results in fewer discriminator weight updates, and the discriminator loss no longer went to zero.

vrao9 avatar May 26 '19 21:05 vrao9
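
Not the repository's code, but a rough PyTorch sketch of rule 2 above, assuming a BCE-with-logits discriminator. The toy tensors and names (netD, optimizer_D) are placeholders; only the gating logic matters:

```python
import torch

netD = torch.nn.Linear(8, 1)  # toy stand-in for the discriminator
optimizer_D = torch.optim.Adam(netD.parameters(), lr=2e-4)
bce = torch.nn.BCEWithLogitsLoss()

real, fake = torch.randn(4, 8), torch.randn(4, 8)
pred_fake = netD(fake.detach())

# A fake image counts as correctly classified when D's probability is below 0.5.
fake_acc = (torch.sigmoid(pred_fake) < 0.5).float().mean().item()

if fake_acc < 0.5:  # only update D while it is still struggling on fakes
    optimizer_D.zero_grad()
    loss_D = 0.5 * (bce(netD(real), torch.ones(4, 1)) +
                    bce(pred_fake, torch.zeros(4, 1)))
    loss_D.backward()
    optimizer_D.step()
```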

"accuracy of prediction by the discriminator for fake image is less than 50%: " how do you do it in pytorch? Thanks

John1231983 avatar May 27 '19 01:05 John1231983

I used the Keras implementation: https://github.com/eriklindernoren/Keras-GAN/tree/master/cyclegan. I don't know how it can be done in PyTorch.

vrao9 avatar May 28 '19 13:05 vrao9

@vrao9: Thanks. Which line implements "accuracy of prediction by the discriminator for fake images is less than 50%"?

John1231983 avatar May 28 '19 13:05 John1231983

Here, dB_loss_fake[1] contains the 'accuracy of prediction' for fake images.

vrao9 avatar May 29 '19 13:05 vrao9
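
For readers wondering where that number comes from: as far as I can tell, the linked Keras-GAN implementation compiles its discriminators with metrics=['accuracy'], so train_on_batch returns a [loss, accuracy] pair. A toy paraphrase (not the repository's actual model), which also shows where metrics=['accuracy'] goes:

```python
import numpy as np
from tensorflow import keras

# Toy stand-in discriminator; the real one in that repo is a convolutional PatchGAN.
d_B = keras.Sequential([keras.Input(shape=(64, 64, 3)),
                        keras.layers.Flatten(),
                        keras.layers.Dense(1, activation='sigmoid')])
# metrics=['accuracy'] is what makes train_on_batch return a second value.
d_B.compile(loss='mse', optimizer='adam', metrics=['accuracy'])

fake_B = np.random.rand(4, 64, 64, 3).astype('float32')
fake_labels = np.zeros((4, 1), dtype='float32')

dB_loss_fake = d_B.train_on_batch(fake_B, fake_labels)  # [loss, accuracy]
print('loss:', dB_loss_fake[0], 'accuracy:', dB_loss_fake[1])
```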

How can I print the accuracy along with the other details that get printed, such as the epoch and the various losses? Where should I add metrics=['accuracy']?

nehaleosharma avatar Jul 30 '19 08:07 nehaleosharma

Wow, thanks, but what I really want is a direct PDF file.


Legendsevl avatar Jul 30 '19 09:07 Legendsevl

Here, dB_loss_fake[1] contains the 'accuracy of prediction' for fake images.

Did you change the code of the base Keras train_on_batch to get the prediction accuracy, or did you just pass it a parameter (sample_weight=0.5)? I don't really understand how it works.

JerryLeolfl avatar Sep 19 '19 15:09 JerryLeolfl

@junyanz Hi professor, could you suggest any way to weaken the discriminator other than changing the learning rate?

annihi1ation avatar Feb 09 '20 06:02 annihi1ation

@junyanz Or, more concretely: I changed the BCE loss to an MSE loss (following LSGAN), and the model just failed; the D loss is close to 0 while the G loss keeps increasing.

annihi1ation avatar Feb 09 '20 06:02 annihi1ation
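
For what it's worth, the CycleGAN paper itself uses a least-squares (LSGAN) objective, and the repository wraps this choice in a GANLoss-style class selected via a gan_mode option (if I recall the option name correctly). A condensed sketch of that switch, not the repository's exact code; in both modes it expects raw discriminator outputs with no final sigmoid:

```python
import torch
import torch.nn as nn

class GANLoss(nn.Module):
    """Switch between an LSGAN (MSE) loss and a vanilla BCE-with-logits loss."""
    def __init__(self, gan_mode='lsgan'):
        super().__init__()
        self.loss = nn.MSELoss() if gan_mode == 'lsgan' else nn.BCEWithLogitsLoss()

    def forward(self, prediction, target_is_real):
        target = torch.full_like(prediction, 1.0 if target_is_real else 0.0)
        return self.loss(prediction, target)

criterion = GANLoss('lsgan')           # MSE objective, as in the CycleGAN paper
pred_fake = torch.randn(4, 1, 30, 30)  # toy fake-patch predictions from D
print(criterion(pred_fake, target_is_real=False))
```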

2. Update the discriminator only when its prediction accuracy on fake images is below 50%: this rule results in fewer discriminator weight updates, and the discriminator loss no longer went to zero.

Hi, may I ask whether you used the BCE loss or the MSE loss? If you use the MSE loss, the loss won't be bounded between 0 and 1.

JunCS1 avatar Sep 20 '23 02:09 JunCS1