
D_loss gets negative values

Open apaaumao opened this issue 6 years ago • 3 comments

I used your code and got negative D_loss values. I expected the loss to converge to a positive value close to 0; are negative values a problem? Thanks.

apaaumao avatar Jan 18 '19 06:01 apaaumao

D_loss is composed of the Wasserstein loss (K.mean(y_true * y_pred)), so it can be negative.
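A minimal numpy sketch of how that label trick works (the scores below are made-up numbers, and wasserstein_loss is a numpy stand-in for the Keras version): calling the loss once per batch with y_true = -1 for real samples and y_true = +1 for fake samples, then summing, gives exactly mean(D(fake)) - mean(D(real)).

```python
import numpy as np

def wasserstein_loss(y_true, y_pred):
    # numpy stand-in for K.mean(y_true * y_pred)
    return np.mean(y_true * y_pred)

d_real = np.array([0.8, 1.2, 0.5])    # critic scores on a real batch (illustrative)
d_fake = np.array([-0.3, 0.1, -0.6])  # critic scores on a fake batch (illustrative)

# y_true = -1 for the real batch, +1 for the fake batch
d_loss = (wasserstein_loss(-np.ones(3), d_real)
          + wasserstein_loss(np.ones(3), d_fake))
print(d_loss)  # == mean(d_fake) - mean(d_real), here about -1.1
```

Whenever the critic scores real data higher than fake data (which is exactly what training pushes it toward), this quantity is negative.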

kongyanye avatar Jan 18 '19 07:01 kongyanye

I think the Wasserstein loss is mean(D(fake_data)) - mean(D(real_data)), but is that also the case with mean(y_true * y_pred)? I am a beginner in this field; please give me advice. Thanks.

apaaumao avatar Jan 19 '19 04:01 apaaumao

In the code given by the author of WGAN-GP (https://github.com/igul222/improved_wgan_training/blob/master/gan_64x64.py), the losses are defined as below:

    disc_cost = tf.reduce_mean(disc_fake) - tf.reduce_mean(disc_real) + LAMBDA*gradient_penalty
    gen_cost = -tf.reduce_mean(disc_fake)

The discriminator's output has no activation function restricting its range, so its scores, and therefore the loss, can be negative.
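The cost above can be sketched numerically. Here is a hedged toy example (all numbers are illustrative, not from the repo) using a linear critic D(x) = w @ x, chosen because its gradient with respect to x is just w, so the penalty (||grad|| - 1)^2 can be computed in closed form:

```python
import numpy as np

LAMBDA = 10.0
w = np.array([0.6, 0.8])              # linear critic weights, ||w|| = 1.0

real = np.array([[1.0, 2.0], [0.5, 1.5]])    # toy real batch
fake = np.array([[-1.0, 0.0], [0.2, -0.5]])  # toy fake batch

disc_real = real @ w                  # critic scores, unbounded
disc_fake = fake @ w

grad_norm = np.linalg.norm(w)         # gradient of a linear D is w everywhere
gradient_penalty = (grad_norm - 1.0) ** 2

disc_cost = disc_fake.mean() - disc_real.mean() + LAMBDA * gradient_penalty
gen_cost = -disc_fake.mean()
print(disc_cost, gen_cost)  # disc_cost is negative here
```

Because disc_real and disc_fake are unbounded, once the critic separates real from fake the first two terms sum to a negative number, and with the gradient norm near 1 the penalty does not offset it.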

kongyanye avatar Jun 21 '19 06:06 kongyanye