progressive-growing-torch
Weight normalization layer
(https://github.com/torch/nn/blob/master/WeightNorm.lua)
normalizes the weights of the previous layer dynamically. This still needs to be implemented.
Can I simply use torch nn.WeightNorm for the equalized learning rate described in the paper?
https://github.com/stormraiser/GAN-weight-norm/tree/master/torch/modules
Tested torch nn.WeightNorm, but it seems to harm training.
I found https://github.com/stormraiser/GAN-weight-norm — maybe we can use this instead.
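Note that the equalized learning rate in the PGGAN paper is not the same as standard weight normalization: weights are initialized from N(0, 1) and then scaled at runtime by the per-layer He constant c = gain / sqrt(fan_in), so every layer sees a comparable effective learning rate. A minimal NumPy sketch of that scaling (function and variable names are my own, not from this repo):

```python
import numpy as np

def equalized_lr_scale(fan_in, gain=np.sqrt(2.0)):
    # He-style runtime scaling constant: w_hat = w * c.
    # Scaling at runtime (instead of baking c into the init) is what
    # equalizes the effective learning rate across layers.
    return gain / np.sqrt(fan_in)

# Example: conv layer with 3 input channels and a 3x3 kernel.
fan_in = 3 * 3 * 3
w = np.random.randn(16, 3, 3, 3)        # weights drawn from N(0, 1)
w_hat = w * equalized_lr_scale(fan_in)  # scaled weights used in the forward pass
```

In a Torch module this scaling would be applied in updateOutput on every forward pass, whereas nn.WeightNorm reparameterizes w as g * v / ||v||, which is a different normalization and may explain the training degradation observed above.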