
Why does the g_loss become negative?

Open Riaux opened this issue 6 years ago • 6 comments

During training, the g_loss sometimes becomes negative, and then the accuracy suddenly drops. Does that mean my training has failed?

Riaux avatar Sep 24 '18 07:09 Riaux

Ideally, the g_loss shouldn't be negative. Can you describe the issue more accurately?

shaohua0116 avatar Nov 21 '18 13:11 shaohua0116

I also get a negative g_loss when I train on the MNIST dataset (g_loss is always negative). I have no idea why.

howardgriffin avatar Feb 20 '19 02:02 howardgriffin

I also get a negative g_loss when I train.

Jingwen7 avatar Apr 19 '19 20:04 Jingwen7

I also get a negative g_loss when I train.

Timaces avatar Oct 21 '19 06:10 Timaces

I also get a negative g_loss when I train.

I think the g_loss should be `g_loss = tf.reduce_mean(-tf.log(1 - d_fake[:, -1] + 1e-7))` rather than the author's version. Since d_fake's range is [0, 1], ln(d_fake) is always negative, and the annealing weight on the loss reaches zero at training step 1500. But when you run evaluation, you get a positive g_loss, because there ann_weight is always 1. Thanks!
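A minimal NumPy sketch of the sign argument above (the `d_fake_last` values are hypothetical discriminator probabilities, not taken from the repo): the log of a probability in (0, 1] is always non-positive, while the suggested `-log(1 - p + eps)` form is always non-negative.

```python
import numpy as np

# Hypothetical discriminator outputs for the "fake" class, each in (0, 1).
d_fake_last = np.array([0.1, 0.5, 0.9])
eps = 1e-7  # small constant to avoid log(0)

# Taking the log of a probability directly: log(p) <= 0, so the mean is negative.
g_loss_log = np.mean(np.log(d_fake_last + eps))

# Suggested form: -log(1 - p + eps) >= 0 for p in [0, 1), so the mean is non-negative.
g_loss_fixed = np.mean(-np.log(1.0 - d_fake_last + eps))

print(g_loss_log)    # negative
print(g_loss_fixed)  # non-negative
```

This only illustrates why one formulation can never go above zero; whether it matches the repo's intended objective depends on how `d_fake` is defined there.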

ahuizxc avatar Apr 01 '20 12:04 ahuizxc

> Ideally, the g_loss shouldn't be negative. Can you describe the issue more accurately?

In fact, the g_loss will always be negative once train_step reaches 1500.

ahuizxc avatar Apr 01 '20 12:04 ahuizxc