MirrorGAN
Why is the GLU called relu?
Thank you for the code. I was wondering: is there a reason why the activation of one of the dense layers is named relu, even though it is actually a GLU? It really confused me. Cf. line 274 of the model:
self.relu = GLU()
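
For context, a GLU (gated linear unit) splits its input along the channel dimension and uses a sigmoid of one half to gate the other half, so it halves the channel count; it is not an elementwise nonlinearity like ReLU, which makes the attribute name misleading. Below is a minimal sketch assuming the AttnGAN-style GLU definition that MirrorGAN's codebase appears to follow (the exact implementation in the repo may differ slightly):

```python
import torch
import torch.nn as nn

class GLU(nn.Module):
    """Gated linear unit: split the channel dimension (dim 1) in half
    and gate the first half with a sigmoid of the second half.
    Unlike ReLU, the output has half as many channels as the input."""
    def forward(self, x):
        nc = x.size(1)
        assert nc % 2 == 0, 'channel dimension must be even'
        nc = nc // 2
        return x[:, :nc] * torch.sigmoid(x[:, nc:])

# `self.relu = GLU()` therefore builds a GLU, not a ReLU; only the
# attribute name says "relu". Note the halved output width:
x = torch.randn(4, 8)       # batch of 4, 8 channels in
print(GLU()(x).shape)       # torch.Size([4, 4]) -- 4 channels out
```

In other words, the naming seems to be just a leftover label (perhaps from an earlier ReLU-based version of the layer); the module assigned to `self.relu` behaves as a GLU.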