self-supervised-multi-task-aesthetic-pretraining

Why does your public pretrained model give the same results as a model trained from scratch?

Open ducanhnguyen-lab opened this issue 2 years ago • 1 comment

[Screenshot: training curves for two runs (blue and grey lines)]

Your method doesn't improve on NIMA at all. Can you explain this?

ducanhnguyen-lab · Aug 24 '22 07:08
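A note on how "improvement" is usually measured here: NIMA-style aesthetic models are typically compared by the Spearman rank correlation (SRCC) between predicted and ground-truth mean scores on the test split. Below is a minimal sketch of that comparison; the score arrays are hypothetical placeholders, and scipy is assumed to be available.

```python
# Minimal sketch: comparing two runs by Spearman rank correlation (SRCC),
# the metric typically reported for NIMA-style aesthetic models.
# The arrays below are hypothetical placeholders, not real results.
import numpy as np
from scipy.stats import spearmanr

ground_truth = np.array([5.2, 6.1, 4.8, 7.0, 5.5])      # mean opinion scores of the test images
preds_pretrained = np.array([5.0, 6.3, 4.9, 6.8, 5.4])  # predictions after pretraining + finetuning
preds_scratch = np.array([5.1, 5.9, 5.0, 6.5, 5.6])     # predictions of the from-scratch baseline

srcc_pretrained, _ = spearmanr(ground_truth, preds_pretrained)
srcc_scratch, _ = spearmanr(ground_truth, preds_scratch)
print(f"SRCC pretrained: {srcc_pretrained:.3f} | SRCC from scratch: {srcc_scratch:.3f}")
```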

Hi, sadly this issue contains next to no information for me to understand what is going on: I don't know what the blue and the grey lines represent, or what exactly you did to get these results. I also can't match either line to our results or to NIMA's, as our reported numbers as well as the numbers in the original NIMA paper are very different from all the scores visible in your screenshot. What I can say, though, is that if you take a look at our paper and the reported numbers, we also do not claim a huge performance improvement over NIMA. Without further information it is impossible for me to determine what the issue at hand might be, but given the discrepancy between your results and the results originally reported for NIMA, it might even be an issue with the training process in general.

janpf · Aug 26 '22 13:08
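One common cause of a "pretrained" run matching a from-scratch run is that the checkpoint is never actually applied to the model. Below is a minimal diagnostic sketch, assuming PyTorch, a stand-in torchvision backbone, and a hypothetical checkpoint path; it is not this repository's actual loading code.

```python
# Minimal sketch: verify that pretrained weights are really loaded instead of random init.
# Assumptions: PyTorch + torchvision, and a hypothetical checkpoint file "pretrained_checkpoint.pth".
import torch
import torchvision

def param_fingerprint(model: torch.nn.Module) -> float:
    """Sum of absolute parameter values; changes noticeably if the weights are replaced."""
    return sum(p.detach().abs().sum().item() for p in model.parameters())

model = torchvision.models.mobilenet_v2()  # randomly initialized stand-in backbone
before = param_fingerprint(model)

state = torch.load("pretrained_checkpoint.pth", map_location="cpu")  # hypothetical path
# Some checkpoints nest the weights, e.g. under a "state_dict" key; unwrap if needed.
missing, unexpected = model.load_state_dict(state, strict=False)
after = param_fingerprint(model)

print(f"fingerprint before/after loading: {before:.2f} / {after:.2f}")
print(f"missing keys: {len(missing)}, unexpected keys: {len(unexpected)}")
```

If the fingerprint barely changes, or most keys come back as missing/unexpected, the finetuning effectively starts from random weights, which would produce exactly the symptom described in this issue.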