The loss of the encoder and decoder is very high

Open zhaotianzi opened this issue 4 years ago • 3 comments

I have used the model on my own dataset, but after a long time of training the loss is still very high. Can you tell me how to reduce my loss?

DEBUG:root:critic x loss -30.026 critic z loss 0.416 encoder loss 1265.846 decoder loss 1235.256

zhaotianzi avatar Sep 22 '21 02:09 zhaotianzi

Two pointers:

  • As training progresses, the encoder and decoder losses should start decreasing, because the generator learns a better mapping.
  • The critic loss should increase in magnitude. A critic loss that is large in absolute value means the critic (discriminator) can distinguish fake samples from real samples very well, so as the critic learns, the magnitude of its loss grows (see the sketch below).
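For intuition, here is a minimal sketch of a WGAN-style critic objective, which is the formulation TadGAN's critics follow (gradient penalty omitted for brevity). The score arrays are made-up numbers, not real model outputs; they only illustrate why the loss becomes more negative, i.e. grows in magnitude, as the critic improves:

```python
import numpy as np

def critic_loss(scores_real, scores_fake):
    # WGAN-style critic objective: maximize mean(score_real) - mean(score_fake).
    # Written as a loss to minimize, it is -(mean(real) - mean(fake)), so a
    # well-trained critic drives this value strongly negative, e.g. the
    # "critic x loss -30.026" in the DEBUG line above.
    return -(np.mean(scores_real) - np.mean(scores_fake))

# Hypothetical critic scores: early on, real and fake are barely separated;
# later, the critic assigns clearly different scores to real vs. fake samples.
early = critic_loss(np.array([0.1, 0.2]), np.array([0.0, 0.1]))      # ~ -0.1
late = critic_loss(np.array([15.0, 16.0]), np.array([-14.0, -15.0])) # ~ -30.0
print(early, late)
```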

arunppsg avatar Sep 24 '21 12:09 arunppsg

Hi Arun, on my own datasets the results are just the opposite: after training, the encoder and decoder losses increased and the critic loss decreased. At the same time, the model performed badly when I used it to perform anomaly detection on datasets containing anomalies.

jakcic avatar Dec 14 '21 07:12 jakcic

Encoder and decoder losses should decrease as they learn a better mapping - I observed this in my training log file. I am not sure what is wrong in your case; maybe you are reading the encoder loss as the critic loss and vice versa? Training of GANs is highly unstable and requires a good amount of computation power. Try retraining it.
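If it helps to double-check which curve is which, below is a rough sketch (not part of TadGAN itself) for plotting the loss series from a training log whose lines look like the DEBUG output quoted above. The log path and the exact line format are assumptions:

```python
import re
import matplotlib.pyplot as plt

# Matches lines shaped like:
# "critic x loss -30.026 critic z loss 0.416 encoder loss 1265.846 decoder loss 1235.256"
PATTERN = re.compile(
    r"critic x loss (?P<cx>-?\d+(?:\.\d+)?) critic z loss (?P<cz>-?\d+(?:\.\d+)?) "
    r"encoder loss (?P<enc>-?\d+(?:\.\d+)?) decoder loss (?P<dec>-?\d+(?:\.\d+)?)"
)

def load_losses(log_path):
    """Collect the four loss series from a training log file."""
    losses = {"critic_x": [], "critic_z": [], "encoder": [], "decoder": []}
    with open(log_path) as fh:
        for line in fh:
            match = PATTERN.search(line)
            if match:
                losses["critic_x"].append(float(match.group("cx")))
                losses["critic_z"].append(float(match.group("cz")))
                losses["encoder"].append(float(match.group("enc")))
                losses["decoder"].append(float(match.group("dec")))
    return losses

# "train.log" is a placeholder path for wherever your training output is saved.
losses = load_losses("train.log")
for name, values in losses.items():
    plt.plot(values, label=name)
plt.xlabel("logged step")
plt.ylabel("loss")
plt.legend()
plt.show()
```

Plotting the curves side by side makes it easy to confirm whether the encoder/decoder losses are the ones trending down and the critic losses are the ones growing in magnitude.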

arunppsg avatar Dec 14 '21 09:12 arunppsg