
How does Trick 6 change the ideal output activation/loss?


Regarding Trick 6:

What activation do you recommend for the discriminator's real_or_generated output layer? I'm using a sigmoid activation function, but since a soft "real" label can be as large as 1.2, I'm wondering if something like leaky ReLU would be better, given that the sigmoid's range is [0, 1]. And would you still recommend binary_crossentropy as the loss for this output, or something else, now that we're using soft labels?

mjdietzx avatar Jan 22 '17 20:01 mjdietzx

Or you can keep the sigmoid and sample the real labels uniformly at random from the interval [0.7, 1.0], so the targets stay inside the sigmoid's range.

dantp-ai avatar Aug 09 '17 12:08 dantp-ai
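
A minimal sketch of that suggestion, assuming a PyTorch discriminator whose final layer is a sigmoid; the names `D`, `real_batch`, and `fake_batch` are hypothetical placeholders, not from the original thread. Because the real labels are drawn from [0.7, 1.0], every target remains inside the sigmoid's [0, 1] range, and binary cross-entropy still applies since it accepts soft (non-binary) targets.

```python
import torch
import torch.nn as nn

bce = nn.BCELoss()  # expects inputs and targets in [0, 1]; soft targets are fine

def discriminator_loss(D, real_batch, fake_batch):
    # One-sided label smoothing (Trick 6): sample "real" targets uniformly
    # from [0.7, 1.0] instead of using a hard 1.0.
    real_labels = torch.empty(real_batch.size(0), 1).uniform_(0.7, 1.0)
    # Fake targets stay at hard 0.0.
    fake_labels = torch.zeros(fake_batch.size(0), 1)

    loss_real = bce(D(real_batch), real_labels)
    loss_fake = bce(D(fake_batch), fake_labels)
    return loss_real + loss_fake
```

Note that keeping targets at or below 1.0 is what lets the sigmoid output stay: with labels up to 1.2, as in the question, the sigmoid could never reach the target and the loss would push the logits toward infinity.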