WDSR-pytorch
An issue with weight_norm
`torch.nn.utils.weight_norm` performs the crucial weight re-parametrization during training, but does it also happen during testing? If so, shouldn't the weight norm be removed before proceeding to testing?
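For context, here is a minimal sketch of what I mean, using a plain PyTorch conv layer rather than this repo's exact modules (the layer and shapes below are just illustrative assumptions). PyTorch's `torch.nn.utils.remove_weight_norm` folds the `g * v / ||v||` re-parametrization back into a single `weight` tensor and removes the forward pre-hook, so the outputs stay the same:

```python
import torch
import torch.nn as nn

# Toy conv layer wrapped with weight_norm (illustrative only,
# not taken from the WDSR-pytorch code).
conv = torch.nn.utils.weight_norm(nn.Conv2d(3, 16, 3, padding=1))

x = torch.randn(1, 3, 8, 8)
with torch.no_grad():
    y_before = conv(x)

# Fold g * v / ||v|| back into a plain `weight` tensor and
# delete the re-parametrization hook.
torch.nn.utils.remove_weight_norm(conv)
with torch.no_grad():
    y_after = conv(x)

print(torch.allclose(y_before, y_after, atol=1e-6))  # True
```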