
MSE obtained is far higher than reported results

Open eugenelet opened this issue 3 years ago • 4 comments

Hi Authors,

I ran the code using the default configuration on Moving MNIST by directly executing python3 main.py.

The final MSE obtained after 1000 epochs is around 75.26, which is far higher than the 24.4 reported in the paper. Is there anything I'm missing here? Thanks!
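For context, here is a sketch of how I'm computing the MSE, assuming the common Moving MNIST convention of summing squared error over pixels and averaging over frames and sequences (the repo's evaluation code is the authoritative definition; the shapes below are illustrative, not PhyDNet's actual tensors):

```python
import numpy as np

def seq_mse(pred, target):
    # One common Moving MNIST convention: squared error summed over
    # pixels per frame, then averaged over frames and sequences.
    # pred, target: (batch, frames, H, W) with values in [0, 1]
    per_frame = ((pred - target) ** 2).sum(axis=(2, 3))  # sum over pixels
    return per_frame.mean()  # average over frames and batch

# toy example: constant 0.1 error on 10 predicted 64x64 frames
pred = np.zeros((2, 10, 64, 64))
target = np.full((2, 10, 64, 64), 0.1)
print(seq_mse(pred, target))  # 0.01 * 64 * 64 = 40.96
```

If the eval code averages over pixels instead of summing, the numbers come out 4096x smaller, so it's worth checking which convention the script uses before comparing to the paper.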

Eugene

eugenelet avatar Apr 26 '21 05:04 eugenelet

Hi @eugenelet , to improve reproducibility, I have uploaded an improved version of PhyDNet with separate encoders and decoders (more details in the paper from the CVPR OmniCV workshop 2020: https://openaccess.thecvf.com/content_CVPRW_2020/papers/w38/Le_Guen_A_Deep_Physical_Model_for_Solar_Irradiance_Forecasting_With_Fisheye_CVPRW_2020_paper.pdf). I have also uploaded the pretrained model, which attains MSE=24.19 (better than in the CVPR paper). In particular, we found that batch size has a crucial impact on performance; we fixed it at 16 for this model. Best, Vincent

vincent-leguen avatar Apr 28 '21 20:04 vincent-leguen

Hi @vincent-leguen , thanks for releasing the pretrained model and the updated configuration of the code. I'll re-run the code from scratch to validate the reported performance. This is an interesting field to contribute to.

Eugene

eugenelet avatar May 03 '21 09:05 eugenelet

Hi @vincent-leguen , for the recently updated code, should I run it using the default configs, i.e. python3 main.py? At epoch 1000, I obtained an MSE of 38.62, which is still off from the reported results.

eugenelet avatar May 05 '21 05:05 eugenelet

> Hi @vincent-leguen , for the recently updated code, should I run it using the default configs, i.e. python3 main.py? At epoch 1000, I obtained an MSE of 38.62, which is still off from the reported results.

I find their model is sensitive to batch size. You should make sure your batch size is 16; I think the GroupNorm layers may be the cause. When I set the batch size to 16, it worked well.
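One note on the GroupNorm guess: GroupNorm's forward pass uses only per-sample statistics, so unlike BatchNorm its outputs do not change with batch size; any batch-size sensitivity would come through optimization dynamics (gradient noise, effective learning rate) rather than the normalization statistics themselves. A quick NumPy sketch with made-up shapes, not PhyDNet's actual layers:

```python
import numpy as np

def group_norm(x, num_groups, eps=1e-5):
    # Normalize within each sample's channel groups; x: (N, C, H, W).
    n, c, h, w = x.shape
    g = x.reshape(n, num_groups, c // num_groups, h, w)
    mean = g.mean(axis=(2, 3, 4), keepdims=True)
    var = g.var(axis=(2, 3, 4), keepdims=True)
    return ((g - mean) / np.sqrt(var + eps)).reshape(n, c, h, w)

def batch_norm(x, eps=1e-5):
    # Normalize each channel using statistics across the whole batch.
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

rng = np.random.default_rng(0)
big = rng.standard_normal((16, 8, 4, 4))
small = big[:2]  # same first two samples, smaller batch

# GroupNorm: per-sample statistics, so outputs match across batch sizes
gn_big, gn_small = group_norm(big, 4), group_norm(small, 4)
print(np.allclose(gn_big[:2], gn_small))  # True

# BatchNorm: statistics depend on the batch, so outputs differ
bn_big, bn_small = batch_norm(big), batch_norm(small)
print(np.allclose(bn_big[:2], bn_small))  # False
```

So fixing the batch size at 16 matters here for training dynamics, not because GroupNorm computes anything differently at other batch sizes.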

toddwyl avatar Jul 16 '21 06:07 toddwyl