lsgan
Why ReLU at line 120?
Hi, Prof. Qi,
I am wondering why there is a ReLU layer at the top of the discriminator (line 120 in lsgan.lua). With this ReLU layer in place, I found the model cannot be trained at all.
thanks,
I think it's there to keep the loss function non-negative.
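For what it's worth, the reported training failure is consistent with a simple gradient argument. Here is a tiny numeric sketch (pure Python, not the repo's Torch code, and `d_loss_grad` is just an illustrative helper): with a final ReLU, the discriminator's least-squares loss (output - 1)^2 has zero gradient whenever the pre-activation is negative, so the network gets no learning signal and cannot move its output toward the real-data target of 1.

```python
# Minimal sketch (not the actual lsgan.lua code): why a ReLU on the
# discriminator's output can stall LSGAN training.
# The LSGAN discriminator loss on real data is (D(x) - 1)^2.

def relu(z):
    return max(z, 0.0)

def d_loss_grad(z, use_relu):
    """Gradient of (out - 1)^2 with respect to the pre-activation z."""
    if use_relu:
        out = relu(z)
        dout_dz = 1.0 if z > 0 else 0.0   # ReLU zeroes the gradient for z <= 0
    else:
        out, dout_dz = z, 1.0             # linear output: gradient always flows
    return 2.0 * (out - 1.0) * dout_dz

# A discriminator whose pre-activation happens to be negative:
z = -0.5
print(d_loss_grad(z, use_relu=True))    # 0.0 -> no learning signal at all
print(d_loss_grad(z, use_relu=False))   # -3.0 -> pushes the output toward 1
```

So if many units land in the negative regime early in training, the discriminator is stuck, which would explain why removing the ReLU makes the model trainable.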
(Translated from Chinese:) I'm a beginner; what software should I use to run this?