SSGAN-Tensorflow
Why does the g_loss become negative?
While training, the g_loss sometimes becomes negative and then the accuracy suddenly drops. Does that mean my training has failed?
Ideally, the g_loss shouldn't be negative. Can you describe the issue more accurately?
I also get a negative g_loss when I train on the MNIST dataset (it is always negative). I have no idea why.
I also get a negative g_loss when I train.
I think the generator loss should be `g_loss = tf.reduce_mean(-tf.log(1 - d_fake[:, -1] + 1e-7))` rather than the one the author provides. Since the values of d_fake lie in [0, 1], ln(d_fake) is always negative, and the weight-annealed loss term reaches zero once the training step hits 1500, so the training g_loss turns negative after that point. During evaluation you still see a positive g_loss because ann_weight is always 1 there. Thanks!
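For concreteness, here is a minimal TF1-style sketch of that point; the placeholder shape (10 real classes plus one fake class) and the assumption that the last softmax column is the fake-class probability are illustrative, not taken from the repository's code:

```python
import tensorflow as tf  # TF1-style API, matching the snippet above

# Assumed setup: d_fake is the discriminator's softmax output on generated
# samples, shape [batch, 10 + 1], with the last column treated as P(fake).
d_fake = tf.placeholder(tf.float32, shape=[None, 11])
fake_prob = d_fake[:, -1]  # values in [0, 1], so tf.log(fake_prob) <= 0

# A log-of-probability adversarial term is never positive, so once the
# annealed auxiliary term decays to zero the total training g_loss can only
# be negative.
g_loss_log_style = tf.reduce_mean(tf.log(fake_prob + 1e-7))

# The alternative proposed above: penalize P(fake) directly, which keeps
# the adversarial term non-negative.
g_loss_proposed = tf.reduce_mean(-tf.log(1.0 - fake_prob + 1e-7))
```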
In fact, the g_loss will always be negative once train_step reaches 1500.
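As a toy illustration of that claim (the linear annealing schedule and the numbers are assumed, not taken from the code): the log-style adversarial term is negative, so the training g_loss flips sign once the annealed weight hits zero at step 1500, while evaluation keeps the weight at 1 and still reports a positive value.

```python
# Assumed linear annealing of the auxiliary-loss weight over 1500 steps.
def ann_weight(step, anneal_steps=1500):
    return max(0.0, 1.0 - step / anneal_steps)

adv_term, aux_term = -0.7, 2.0  # toy values: log-style term < 0, auxiliary term > 0
for step in (0, 750, 1500):
    print("train step", step, "g_loss =", adv_term + ann_weight(step) * aux_term)
print("eval g_loss =", adv_term + 1.0 * aux_term)  # ann_weight fixed at 1
```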