pytorch-vdsr
VDSR (CVPR2016) pytorch implementation
Hello! I retrained with the provided train.h5 using the hyperparameters in main_vdsr.py, but after 50 epochs the results from eval.py are almost identical to bicubic. What could be the cause? Test results: Processing Set5_mat/baby_GT_x2.mat Processing Set5_mat/head_GT_x2.mat Processing Set5_mat/butterfly_GT_x2.mat Processing Set5_mat/woman_GT_x2.mat Processing Set5_mat/bird_GT_x2.mat Scale= 2 Dataset= Set5 PSNR_predicted= 33.69038816258724 PSNR_bicubic= 33.69039381292539 It takes average 7.9746935844421385s for processing Processing Set5_mat/bird_GT_x3.mat Processing Set5_mat/head_GT_x3.mat...
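
For reference, here is a minimal sketch (not the repo's exact eval.py) of how Set5 PSNR is typically computed in this setting: on the Y channel only, in the [0, 255] range, with a border shaved off. The function name and border handling are illustrative assumptions.

```python
import numpy as np

def psnr_y(pred, gt, shave_border=0):
    """PSNR between two Y-channel images in [0, 255]; border pixels are excluded."""
    if shave_border > 0:
        pred = pred[shave_border:-shave_border, shave_border:-shave_border]
        gt = gt[shave_border:-shave_border, shave_border:-shave_border]
    diff = pred.astype(np.float64) - gt.astype(np.float64)
    rmse = np.sqrt(np.mean(diff ** 2))
    if rmse == 0:
        return float("inf")
    return 20 * np.log10(255.0 / rmse)

# PSNR_predicted being essentially equal to PSNR_bicubic suggests the network
# output barely differs from its bicubic input, i.e. the predicted residual is
# close to zero (e.g. training diverged or the checkpoint was not loaded).
```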

Hi, can you provide the training dataset you used to achieve the VDSR PyTorch results?
I'd like to ask: what environment do I need to set up to run this code?
According to https://github.com/peterjc123/pytorch-scripts (see the `Using Examples` section), if you set `--threads=0` manually instead of the default `opt.threads=1`, the issue will be resolved by forcing the data to...
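
For context, a minimal sketch of what that change amounts to in a typical PyTorch DataLoader setup (the dataset below is a placeholder standing in for the repo's DatasetFromHdf5 training set): with `num_workers=0`, data is loaded in the main process rather than in worker processes, which sidesteps the Windows multiprocessing problems that pytorch-scripts describes.

```python
import argparse
import torch
from torch.utils.data import DataLoader, TensorDataset

parser = argparse.ArgumentParser()
parser.add_argument("--threads", type=int, default=1,
                    help="data loading workers; set 0 on Windows to load in the main process")
opt = parser.parse_args()

# Placeholder dataset standing in for the repo's DatasetFromHdf5("data/train.h5").
training_set = TensorDataset(torch.randn(8, 1, 41, 41), torch.randn(8, 1, 41, 41))

training_data_loader = DataLoader(dataset=training_set,
                                  num_workers=opt.threads,  # 0 disables worker processes
                                  batch_size=4, shuffle=True)

for lr_batch, hr_batch in training_data_loader:
    pass  # training step would go here
```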
I have a question. I've looked at many of your implementations, and you always train on the Y channel of YCbCr, and you also compute PSNR only on the Y channel. Now suppose I have an image with no ground truth to compare against, and you upscale only Y by 2x or 4x; what happens to the remaining Cb and Cr channels? Bicubic upscaling? Then the edges won't line up. Following this approach, how do you get an RGB image in the end? Why not just feed RGB in and get an RGB output?
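
For illustration, a minimal sketch of the usual Y-channel SR pipeline, assuming a trained `model` that maps a bicubic-upscaled Y tensor to a super-resolved Y (as VDSR does): only Y goes through the network, Cb and Cr are upscaled with plain bicubic interpolation, and the three channels are merged back and converted to RGB. The helper name `super_resolve_rgb` is illustrative, not part of the repo.

```python
import numpy as np
import torch
from PIL import Image

def super_resolve_rgb(img_rgb, model, scale=2):
    y, cb, cr = img_rgb.convert("YCbCr").split()
    target_size = (img_rgb.width * scale, img_rgb.height * scale)

    # VDSR operates on an already bicubic-upscaled image, so Y is first resized
    # to the target size and the network then refines it.
    y_bic = y.resize(target_size, Image.BICUBIC)
    y_in = torch.from_numpy(np.asarray(y_bic, dtype=np.float32) / 255.0)[None, None]
    with torch.no_grad():
        y_out = model(y_in).clamp(0, 1).squeeze().numpy() * 255.0
    y_sr = Image.fromarray(y_out.astype(np.uint8), mode="L")

    # Cb/Cr carry little high-frequency detail, so plain bicubic upscaling is
    # generally considered good enough for the chroma channels.
    cb_sr = cb.resize(target_size, Image.BICUBIC)
    cr_sr = cr.resize(target_size, Image.BICUBIC)

    return Image.merge("YCbCr", [y_sr, cb_sr, cr_sr]).convert("RGB")
```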
After running demo.py and eval.py, the PSNR values are not the same. What is the reason?
Hi, dear Jiu, thank you for sharing the code. I used the given data generation code to generate the corresponding .h5 file for the 291 images, but when running main_vdsr.py, the...
I generated the augmented data with the MATLAB script and the 291 images, and the resulting train.h5 is about 15 GB. In the paper the training procedure "takes roughly 4 hours on GPU...