edm

Cannot reproduce the result in Table 2.

Zyriix opened this issue 1 year ago • 5 comments

I ran the following training command on 8 V100 GPUs:

torchrun --standalone --nproc_per_node=8 train.py --outdir=training-runs
--data=datasets/cifar10-32x32.zip --cond=0 --arch=ddpmpp

This gives me an FID of 2.08169, which is still far from the FID reported in the paper (1.97).
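For reference, FID in this repo is computed in two steps: generate 50,000 images with fixed seeds, then compare their Inception statistics against a reference. The sketch below follows my reading of the EDM README; the snapshot and reference paths are placeholders, so double-check the flags against the repo before running.

```shell
# 1) Sample 50,000 images with fixed seeds (deterministic given the seeds).
torchrun --standalone --nproc_per_node=8 generate.py --outdir=fid-tmp \
    --seeds=0-49999 --subdirs --network=<path-to-snapshot.pkl>

# 2) Compute FID against the precomputed CIFAR-10 reference statistics.
torchrun --standalone --nproc_per_node=8 fid.py calc --images=fid-tmp \
    --ref=<path-to-cifar10-32x32.npz> --num=50000
```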

I think this may be caused by the random seeds (the weights are randomly initialized in the code). Would it be possible to share the seed used to reproduce the results in Table 2?

Any suggestions?

Zyriix avatar Nov 12 '23 04:11 Zyriix

I observed that the FID of EDM fluctuates significantly across checkpoints. Did you report the lowest FID among checkpoints?

Newbeeer avatar Nov 12 '23 15:11 Newbeeer

I observed that the FID of EDM fluctuates significantly across checkpoints. Did you report the lowest FID among checkpoints?

I found this noted in some papers that try to retrain EDM. I got the lowest FID (2.04053) at around checkpoint 160k. I think the fluctuation is caused by the low EMA rate and large learning rate.

Zyriix avatar Nov 13 '23 14:11 Zyriix

@Zyriix To calculate FID we have to first generate, say, 50,000 samples. The random seed influences the generation of those samples, and hence the FID value, I guess.
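For intuition on why the seed matters: FID is the Fréchet distance between a Gaussian fit to the generated Inception features and one fit to the reference features, so resampling the 50,000 images shifts the empirical mean/covariance and therefore the score. A rough numpy-only sketch (not the repo's fid.py, which uses a matrix square root routine; the feature arrays here are stand-ins):

```python
import numpy as np

def frechet_distance(mu1, cov1, mu2, cov2):
    """Frechet distance between two Gaussians: the core of FID.
    FID = ||mu1 - mu2||^2 + Tr(C1 + C2 - 2 (C1 C2)^{1/2})."""
    diff = mu1 - mu2
    # Tr((C1 C2)^{1/2}) equals the sum of sqrt of eigenvalues of C1 @ C2,
    # which are real and non-negative for PSD inputs.
    eigvals = np.linalg.eigvals(cov1 @ cov2)
    covmean_trace = np.sum(np.sqrt(np.maximum(eigvals.real, 0.0)))
    return float(diff @ diff + np.trace(cov1) + np.trace(cov2)
                 - 2.0 * covmean_trace)

rng = np.random.default_rng(0)
feats_a = rng.normal(size=(50_000, 8))  # stand-in for Inception features
feats_b = rng.normal(size=(50_000, 8))  # same distribution, different "seed range"
mu_a, cov_a = feats_a.mean(0), np.cov(feats_a, rowvar=False)
mu_b, cov_b = feats_b.mean(0), np.cov(feats_b, rowvar=False)

print(frechet_distance(mu_a, cov_a, mu_a, cov_a))  # ~0: identical statistics
print(frechet_distance(mu_a, cov_a, mu_b, cov_b))  # small but non-zero
```

Even though both feature sets come from the exact same distribution, the second distance is non-zero purely because of sampling noise, which is why different seed ranges give slightly different FID values.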

yuanzhi-zhu avatar Nov 13 '23 15:11 yuanzhi-zhu

@Zyriix To calculate FID we have to first generate, say, 50,000 samples. The random seed influences the generation of those samples, and hence the FID value, I guess.

Most works use the same sampling seeds, just like this repo: 0~49999, 50000~99999, 100000~149999, and report the lowest FID of the three.

Zyriix avatar Nov 18 '23 02:11 Zyriix

--arch=ncsnpp performs much better than the VP architecture.

Liyuan-Liu avatar Jan 29 '24 18:01 Liyuan-Liu