
error in the loss

Open devraj89 opened this issue 6 years ago • 2 comments

Hi

Thanks for publishing the code in PyTorch! I have a few questions, however. [1] For the loss associated with the auxiliary classifier fc you are using NLLLoss, but the last layer is a Softmax layer. Shouldn't it be LogSoftmax instead of Softmax?
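To make the concern concrete, here is a small standalone sketch (not the repo's code; sizes are made up) showing that `NLLLoss` expects log-probabilities, so pairing it with plain `Softmax` yields a differently scaled value and wrong gradients:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
logits = torch.randn(4, 10)               # raw classifier scores: 4 samples, 10 classes
targets = torch.randint(0, 10, (4,))

nll = nn.NLLLoss()

# NLLLoss expects log-probabilities, i.e. the output of LogSoftmax:
loss_logsoftmax = nll(torch.log_softmax(logits, dim=1), targets)

# Feeding it plain Softmax probabilities instead gives -mean(prob[target]),
# a negative number that is not a cross-entropy, so the gradients used for
# training the auxiliary classifier are wrong as well:
loss_softmax = nll(torch.softmax(logits, dim=1), targets)

print(loss_logsoftmax.item(), loss_softmax.item())
```

The first value matches `F.cross_entropy(logits, targets)` exactly; the second is always in (-1, 0).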

[2] I am wondering why the noise in line 201 is generated using the class_one_hot vector representation. Couldn't we simply use the noise as generated in line 196? Did you find any improvement with that specific noise generation?
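For context, the line-201-style construction in many ACGAN implementations looks roughly like the sketch below (a guess at the pattern being asked about, with hypothetical sizes, not the repo's actual code): the first `num_classes` entries of the Gaussian noise are overwritten with the class one-hot so the generator is explicitly conditioned on the label.

```python
import torch

# Hypothetical sizes; the actual values in the repo may differ.
batch_size, nz, num_classes = 8, 110, 10

labels = torch.randint(0, num_classes, (batch_size,))  # sampled class labels
noise = torch.randn(batch_size, nz)                    # plain Gaussian noise (line-196 style)

# Overwrite the first num_classes entries of each noise vector with the
# class one-hot, embedding the label directly into the generator input.
class_onehot = torch.zeros(batch_size, num_classes)
class_onehot[torch.arange(batch_size), labels] = 1.0
noise[:, :num_classes] = class_onehot
```

With plain line-196-style noise the generator receives no label information at all, which is presumably why the one-hot is mixed in.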

Also, instead of randomly generating labels as in line 197, can't we use the labels that have been sampled from the data loader, i.e., line 177?

[3] Also, based on the figure on the main page (the last figure on the right), the class information, i.e., C_class, is fed both into the latent variable z and into the discriminator D (alongside X_real and X_fake) during training. However, this seems to be missing in the code. Can you please clarify why that is?

Please refer to this https://github.com/znxlwm/pytorch-generative-model-collections/blob/master/ACGAN.py

Thank you in advance for the wonderful code.

devraj89 avatar Jun 12 '18 08:06 devraj89

Hi devraj89, were you able to resolve these problems? And on CIFAR-10, did you get the 8.6 score result?

Arrcil avatar Nov 10 '18 10:11 Arrcil

I think this is a bug. Changing NLLLoss to CrossEntropyLoss and removing the Softmax layer solves it.

> [1] For the loss associated with the auxiliary classifier fc you are using NLLLoss, but the last layer is a Softmax layer. Shouldn't it be LogSoftmax instead of Softmax?
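Both fixes are mathematically equivalent; a quick standalone sanity check (not the repo's code) shows that LogSoftmax + NLLLoss and raw logits + CrossEntropyLoss produce the same value:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
logits = torch.randn(4, 10)               # raw fc outputs, no Softmax applied
targets = torch.randint(0, 10, (4,))

# Fix A: keep NLLLoss but make the last layer LogSoftmax.
loss_a = nn.NLLLoss()(nn.LogSoftmax(dim=1)(logits), targets)

# Fix B: drop the Softmax layer entirely and use CrossEntropyLoss,
# which fuses LogSoftmax + NLLLoss internally.
loss_b = nn.CrossEntropyLoss()(logits, targets)

print(loss_a.item(), loss_b.item())
```

Either way, the Softmax must come off the end of the network when computing the loss (it can still be applied at inference time if probabilities are needed).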

zhangyixing0404 avatar Jul 11 '19 17:07 zhangyixing0404