batchgenerators
Error in augment_gamma with retain_stats
In `augment_gamma` there is the option to retain the mean and standard deviation of the image before transformation. The current implementation (see below) does not restore the mean `mn` and standard deviation `sd` correctly:
https://github.com/MIC-DKFZ/batchgenerators/blob/d88358459931188bbc2e65ca86f5f7cd47c26c20/batchgenerators/augmentations/color_augmentations.py#L116-L117
As a result, the mean value of `data_sample[c]` will be `mn / (data_sample[c].std() + 1e-8) * sd` and not `mn`.
The correct way to do it would be:
```python
data_sample[c] = data_sample[c] - data_sample[c].mean()
data_sample[c] = data_sample[c] / (data_sample[c].std() + 1e-8) * sd
data_sample[c] = data_sample[c] + mn
```
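The difference can be demonstrated with a small NumPy sketch. This is not the actual `augment_gamma` code, just a reconstruction of the restoration step as described above: the former order of operations adds `mn` back before rescaling by `sd`, so the mean ends up scaled to `mn / (std + 1e-8) * sd` instead of `mn`.

```python
import numpy as np

rng = np.random.default_rng(0)
sample = rng.normal(loc=5.0, scale=2.0, size=10_000)
mn, sd = sample.mean(), sample.std()  # statistics saved before the transform

# Stand-in for a gamma-like transform that changes the statistics.
transformed = (sample - sample.min()) ** 0.7

# Former (buggy) restoration: mn is added back *before* rescaling,
# so the final mean is mn / (std + 1e-8) * sd rather than mn.
buggy = transformed - transformed.mean() + mn
buggy = buggy / (buggy.std() + 1e-8) * sd

# Corrected restoration: center, rescale to sd, then shift to mn.
fixed = transformed - transformed.mean()
fixed = fixed / (fixed.std() + 1e-8) * sd
fixed = fixed + mn

print("target mean:", round(mn, 3))
print("buggy mean: ", round(buggy.mean(), 3))
print("fixed mean: ", round(fixed.mean(), 3))
```

Both versions restore the standard deviation; only the corrected order of operations also restores the mean.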
Hi Lucas, you are right. This is of course incorrect! Thank you for pointing it out. I am extremely busy this week and will be on vacation after. Once I am back I will fix this. Best, Fabian
Hi Lucas, I have now fixed this in the master but will hold off with creating a new version until I have it evaluated experimentally (just to be safe). If you would like to use batchgenerators with your fix you can just install it from github instead of using pip. Once again, thanks! Best, Fabian
Hi Fabian,
Thank you for fixing this quickly!
I suspect that most of the time this was used with `mn=0`, in which case the former implementation happened to be equivalent to the new one. So I would be surprised if it affects the experimental results, but I agree it is better to be safe!
Best,
Lucas
Keep in mind that in nnU-Net, preprocessing will normalize the entire image to have (most of the time, not always) mean 0 and std 1. During training we only show small patches, which may have very different statistics. So there may be a difference, but I still suspect that the outcome is not going to change. We will know more after my vacation. Best, Fabian
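The point about patch statistics can be illustrated with a small sketch (hypothetical data, not nnU-Net code): even after a whole image is normalized to mean 0 and std 1, a small patch cropped from it can have a clearly nonzero mean, so `mn` is not necessarily 0 for the patches seen during training.

```python
import numpy as np

rng = np.random.default_rng(42)
image = rng.normal(size=(256, 256))
image[:64, :64] += 3.0  # a locally bright region, e.g. a lesion

# Normalize the entire image, as nnU-Net preprocessing typically does.
image = (image - image.mean()) / image.std()

# A small patch from the bright region has its own statistics.
patch = image[:32, :32]
print("image mean/std:", image.mean(), image.std())  # ~0 and ~1 by construction
print("patch mean/std:", patch.mean(), patch.std())  # mean is clearly nonzero
```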