
PyTorch implementation of Wide Activation for Efficient and Accurate Image Super-Resolution (CVPR Workshop 2018)

2 WDSR-pytorch issues

`[epoch: 1/300] lr: : 0it [00:00, ?it/s] train: 0%| | 0/16000 [00:00`

`torch.nn.utils.weight_norm` performs the crucial weight re-parametrization during training, but does it also apply during testing? If so, shouldn't it be removed before proceeding to testing?
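For reference, the re-parametrization installed by `weight_norm` is a forward pre-hook, so it stays active in `eval()` mode as well; calling `torch.nn.utils.remove_weight_norm` folds the `g`/`v` parameters back into a single weight tensor without changing the output. A minimal sketch (the layer shapes are arbitrary, chosen only for illustration):

```python
import torch
import torch.nn as nn
from torch.nn.utils import weight_norm, remove_weight_norm

# weight_norm re-parametrizes weight as g * v / ||v||, splitting the
# original weight into weight_g (magnitude) and weight_v (direction).
conv = weight_norm(nn.Conv2d(3, 8, 3, padding=1))
print(hasattr(conv, "weight_g"), hasattr(conv, "weight_v"))

# The re-parametrization runs in eval mode too (it is a forward pre-hook).
conv.eval()
x = torch.randn(1, 3, 16, 16)
with torch.no_grad():
    y_before = conv(x)

# remove_weight_norm folds g and v back into a plain weight tensor,
# removing the per-forward recomputation; the output is unchanged.
remove_weight_norm(conv)
with torch.no_grad():
    y_after = conv(x)
print(torch.allclose(y_before, y_after, atol=1e-6))
```

So removal before inference is an optional optimization (it avoids recomputing the normalization on every forward pass), not a correctness requirement.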