cwgan-gp
D_loss gets a negative value
I used your code and got a negative D_loss value. I thought it would converge to a positive value close to 0. Are negative values a problem? Thanks.
D_loss is computed with the Wasserstein loss (K.mean(y_true * y_pred)), so it can be negative.
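As a minimal sketch of why that expression can go negative, assuming the common Keras convention of labeling real samples -1 and generated samples +1 (the exact sign convention varies between implementations; this is an illustration, not necessarily this repository's code):

```python
import tensorflow.keras.backend as K

def wasserstein_loss(y_true, y_pred):
    # y_pred is the raw critic score D(x); y_true is -1 for real and
    # +1 for fake samples (assumed convention). Summed over the real
    # and fake batches this gives mean(D(fake)) - mean(D(real)),
    # which is unbounded below, so a negative D_loss is expected.
    return K.mean(y_true * y_pred)
```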
I thought the Wasserstein loss was mean(D(fake_data)) - mean(D(real_data)). How does mean(y_true * y_pred) compute that? I am a beginner in this field, so please give me advice. Thanks.
In the code given by the authors of WGAN-GP (https://github.com/igul222/improved_wgan_training/blob/master/gan_64x64.py), the losses are defined as follows:
```python
disc_cost = tf.reduce_mean(disc_fake) - tf.reduce_mean(disc_real) + LAMBDA * gradient_penalty
gen_cost = -tf.reduce_mean(disc_fake)
```
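For context, the gradient_penalty term in that file is computed roughly as follows (a condensed sketch of the referenced TF1-style code; BATCH_SIZE, real_data, fake_data, and Discriminator come from that script):

```python
# Penalize deviation of the critic's gradient norm from 1 at points
# interpolated between real and fake samples (WGAN-GP, Gulrajani et al.).
alpha = tf.random_uniform(shape=[BATCH_SIZE, 1], minval=0., maxval=1.)
interpolates = real_data + alpha * (fake_data - real_data)
gradients = tf.gradients(Discriminator(interpolates), [interpolates])[0]
slopes = tf.sqrt(tf.reduce_sum(tf.square(gradients), reduction_indices=[1]))
gradient_penalty = tf.reduce_mean((slopes - 1.) ** 2)
```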
The discriminator's output layer has no activation function to restrict its range, so its output (and hence the loss) can be negative.
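To illustrate, a critic whose final Dense layer is linear (a hypothetical Keras sketch, not this repository's exact architecture) produces unbounded real-valued scores:

```python
import tensorflow as tf

# Hypothetical critic head: the last Dense layer has no activation
# (no sigmoid/tanh), so the score D(x) can be any real number,
# including negative values.
critic = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(64, 64, 3)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(1),  # linear output: raw Wasserstein score
])
```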