
Problem with SwinIR Urban100 & Manga109 replication

Open: YuchuanTian opened this issue 1 year ago · 1 comment

While replicating SwinIR classical SR x3 and x4 on the DIV2K dataset, I encountered a PSNR drop on the Urban100 (U100) and Manga109 (M109) test sets compared to the results reported in the paper.

All training runs use the original DIV2K dataset, without LMDB or patch preprocessing.

SR x3

Only the following configs in options/swinir/train_swinir_sr_classical.json are changed:

opt['scale']=3
opt['datasets']['train']['H_size']=144
opt['netG']['upscale']=3
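
For concreteness, a minimal sketch of applying these edits programmatically (assuming the option file parses as plain JSON once // comments are stripped, which is what KAIR's own option parser does; the output filename below is hypothetical):

```python
import json
import re

SCALE = 3
LR_PATCH = 48  # must match --training_patch_size at test time

with open("options/swinir/train_swinir_sr_classical.json") as f:
    # KAIR option files may contain // comments; strip them naively
    # (this would also mangle any "//" inside string values).
    raw = re.sub(r"//.*", "", f.read())
opt = json.loads(raw)

opt["scale"] = SCALE
opt["datasets"]["train"]["H_size"] = LR_PATCH * SCALE  # 144 for x3
opt["netG"]["upscale"] = SCALE

# Hypothetical output path for the patched config.
with open("train_swinir_sr_classical_x3.json", "w") as f:
    json.dump(opt, f, indent=2)
```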

The test command is:

python main_test_swinir.py --task classical_sr --scale 3 --training_patch_size 48

The results are:

| SR x3 (PSNR) | Set5 | Set14 | B100 | U100 | M109 |
| --- | --- | --- | --- | --- | --- |
| Paper | 34.89 | 30.77 | 29.37 | 29.29 | 34.74 |
| Replication | 34.89 | 30.75 | 29.35 | 29.22 | 34.66 |

SR x4

Only the following configs in options/swinir/train_swinir_sr_classical.json are changed:

opt['scale']=4
opt['datasets']['train']['H_size']=192
opt['netG']['upscale']=4
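
(In both runs, H_size is deliberately 48 × scale, so the LR training patch stays at 48 pixels, consistent with the --training_patch_size 48 flag passed to the test script below. A trivial check:)

```python
# H_size (HR crop) must be the LR patch size times the scale factor;
# both configs above keep the LR patch at 48.
for scale, h_size in [(3, 144), (4, 192)]:
    assert h_size == 48 * scale
```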

The test command is:

python main_test_swinir.py --task classical_sr --scale 4 --training_patch_size 48

The results are:

| SR x4 (PSNR) | Set5 | Set14 | B100 | U100 | M109 |
| --- | --- | --- | --- | --- | --- |
| Paper | 32.72 | 28.94 | 27.83 | 27.07 | 31.67 |
| Replication | 32.74 | 28.98 | 27.82 | 26.94 | 31.49 |
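
For context, the PSNRs above follow the standard classical-SR protocol: computed on the Y channel of YCbCr, with a border of scale pixels cropped, as in the SwinIR paper. A minimal self-contained sketch of that metric (KAIR's own utilities provide equivalents; the function names here are just illustrative):

```python
import numpy as np

def rgb_to_y(img):
    # Y channel (ITU-R BT.601, as in MATLAB's rgb2ycbcr) of an 8-bit RGB image.
    img = img.astype(np.float64)
    return 16.0 + (65.481 * img[..., 0]
                   + 128.553 * img[..., 1]
                   + 24.966 * img[..., 2]) / 255.0

def psnr_y(sr, hr, scale):
    # PSNR between two uint8 RGB images on Y, cropping a `scale`-pixel border.
    y_sr = rgb_to_y(sr)[scale:-scale, scale:-scale]
    y_hr = rgb_to_y(hr)[scale:-scale, scale:-scale]
    mse = np.mean((y_sr - y_hr) ** 2)
    return float("inf") if mse == 0 else 20.0 * np.log10(255.0 / np.sqrt(mse))
```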

This is all the more strange because I could successfully replicate the DF2K experiments with almost the same configs (but with a training patch size of 64). What might be the problem? Thank you very much, and thanks for open-sourcing this wonderful repo!

YuchuanTian · Jan 16 '23