nnUNet
best model may not be saved correctly in multi-class segmentation
Hi FabianIsensee, thank you so much for your great work; it helps a lot. I found an issue with how model_best is saved in multi-class segmentation. In my case there are two foreground classes. One is easy to recognize, call it A; its metric is high, for example 0.9. The other is difficult, call it B; its metric is lower, for example 0.6. The mean metric is then 0.75. But when B's metric is sometimes None, the mean metric becomes 0.9, which is higher than the normal mean. The best model then gets saved at an epoch where B's metric is None, and it is never replaced in later, normal epochs, because that 0.9 is never beaten.
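The inflation can be reproduced with a tiny sketch (illustrative only, not nnUNet's actual code; it assumes the mean is taken over the classes that have a value, i.e. None/NaN entries are skipped, like a NaN-ignoring mean):

```python
def mean_ignoring_none(values):
    """Average per-class Dice scores, skipping None entries
    (mirrors how a NaN-ignoring mean behaves)."""
    present = [v for v in values if v is not None]
    return sum(present) / len(present)

# Normal epoch: both classes evaluated.
print(mean_ignoring_none([0.9, 0.6]))   # 0.75

# Epoch where class B is absent everywhere: its Dice is None,
# so the mean collapses to class A's score alone.
print(mean_ignoring_none([0.9, None]))  # 0.9
```

Once that 0.9 enters the "best metric" comparison, every later epoch with a genuine mean around 0.75 loses against it.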
Hey I see where this problem comes from. Unfortunately I don't see a good solution to this. What do you propose we do in this case? Simply not update the moving average?
The thing is that this is a super rare case. For this to happen, the model would have to not see some class in the entire validation set and also correctly not predict it. Are you certain that everything is OK with your training?
> Hey I see where this problem comes from. Unfortunately I don't see a good solution to this. What do you propose we do in this case? Simply not update the moving average?
Yes, the simple solution is to not update the average in that case. I haven't thought of a better way.
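The suggested fix could look roughly like the sketch below (hedged: `update_moving_average`, the `ema` argument, and the smoothing factor `alpha` are illustrative names, not nnUNet internals). The idea is to skip the exponential-moving-average update entirely whenever any per-class metric is missing, so a spuriously inflated mean can never become the "best" value:

```python
def update_moving_average(ema, per_class_metrics, alpha=0.9):
    """Update the EMA of the mean foreground Dice, but leave it
    untouched if any class metric is missing (None or NaN)."""
    if any(m is None or m != m for m in per_class_metrics):  # m != m catches NaN
        return ema  # skip the update for this epoch
    mean_dice = sum(per_class_metrics) / len(per_class_metrics)
    if ema is None:
        return mean_dice  # first valid epoch initializes the EMA
    return alpha * ema + (1 - alpha) * mean_dice

ema = None
ema = update_moving_average(ema, [0.9, 0.6])   # initializes to 0.75
ema = update_moving_average(ema, [0.9, None])  # skipped, stays 0.75
```

Skipping an epoch merely delays the EMA by one step, whereas averaging over the remaining classes silently changes what the metric measures.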
> The thing is that this is a super rare case. For this to happen, the model would have to not see some class in the entire validation set and also correctly not predict it. Are you certain that everything is OK with your training?
The training looks normal, but class B is very small; sometimes its result is 0, sometimes None, sometimes around 0.7.
0 = false positives were predicted (class was not present during validation), OR the class was present and was not predicted
none = class was not present and no false positives were predicted
0.7 = class was present and was predicted
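These three cases can be encoded in a per-class Dice that distinguishes "absent and correctly not predicted" (None) from a genuine score of 0 (a sketch over flat binary masks, not nnUNet's implementation):

```python
def dice_per_class(pred, gt):
    """Dice for one class over flat binary masks.
    Returns None when the class is absent from gt AND no false
    positives were predicted; 0.0 when there are only errors;
    otherwise the usual 2*TP / (2*TP + FP + FN)."""
    tp = sum(1 for p, g in zip(pred, gt) if p and g)
    fp = sum(1 for p, g in zip(pred, gt) if p and not g)
    fn = sum(1 for p, g in zip(pred, gt) if not p and g)
    if tp == fp == fn == 0:
        return None  # class not present, no false positives
    return 2 * tp / (2 * tp + fp + fn)

print(dice_per_class([0, 0], [0, 0]))        # None
print(dice_per_class([1, 0], [0, 0]))        # 0.0 (false positive only)
print(dice_per_class([1, 1, 0], [1, 0, 0]))  # ≈ 0.67 (tp=1, fp=1, fn=0)
```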
I recommend using the latest/final checkpoint anyway, unless there is a really good reason not to.