DiffBIR

optimizer selection

xuanxu92 opened this issue 3 months ago · 6 comments

Hi, thanks for this excellent work. I noticed that in the code of train-stage1.py, line 106, the optimizer is AdamW:

```python
opt = torch.optim.AdamW(
    swinir.parameters(), lr=cfg.train.learning_rate, weight_decay=0
)
```

However, in Section 4.1 (Implementations) of the paper you mention that Adam is used. Could you please clarify whether it is Adam or AdamW?
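As an aside, since the code passes `weight_decay=0`, the two optimizers should behave identically in this configuration: AdamW differs from Adam only in applying weight decay directly to the weights (decoupled) rather than folding it into the gradient as an L2 penalty. A minimal pure-Python sketch of a single update step (hypothetical helper functions, not the repo's code) illustrating this:

```python
# Hypothetical sketch: one update step of Adam (L2-style weight decay
# added to the gradient) vs AdamW (decoupled weight decay), showing the
# two coincide exactly when weight_decay (wd) is 0.
import math

def adam_step(p, g, m, v, t, lr=1e-4, b1=0.9, b2=0.999, eps=1e-8, wd=0.0):
    g = g + wd * p                      # Adam: L2 penalty enters the gradient
    m = b1 * m + (1 - b1) * g           # first-moment estimate
    v = b2 * v + (1 - b2) * g * g      # second-moment estimate
    m_hat = m / (1 - b1 ** t)           # bias correction
    v_hat = v / (1 - b2 ** t)
    p = p - lr * m_hat / (math.sqrt(v_hat) + eps)
    return p, m, v

def adamw_step(p, g, m, v, t, lr=1e-4, b1=0.9, b2=0.999, eps=1e-8, wd=0.0):
    p = p - lr * wd * p                 # AdamW: decay applied directly to weights
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    p = p - lr * m_hat / (math.sqrt(v_hat) + eps)
    return p, m, v

p1, _, _ = adam_step(0.5, 0.1, 0.0, 0.0, t=1, wd=0.0)
p2, _, _ = adamw_step(0.5, 0.1, 0.0, 0.0, t=1, wd=0.0)
print(p1 == p2)  # → True: with weight_decay=0 the updates are identical
```

So for stage-1 training as written, the Adam-vs-AdamW distinction may not matter numerically, though the paper and code would still ideally agree.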

Thanks!

xuanxu92 commented May 07 '24 19:05