Kaggle-Carvana-Image-Masking-Challenge
Got 0.995 with 1024 × 1024
I have changed the batch size to 1; any suggestions?
I got my 0.996 result with batch size = 2. Training becomes more unstable with small batch sizes, and I think BatchNorm can't work at all with batch size = 1.
You could try: a) Removing the BatchNorm layers from the 1024 U-Net; this may also allow training with a larger batch size. b) Using RMSprop with lr = 0.0001 instead of SGD. I find it converges faster and to a lower loss than SGD. On the 128 U-Net this took me from 0.990 up to 0.991, and I'm now testing it with the 1024 U-Net.
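A minimal Keras-style sketch of both suggestions, assuming a U-Net similar to the one in this repo. The names `get_unet_1024`, `bce_dice_loss`, and `dice_coeff` are stand-ins for whatever the repo actually defines, not its real API:

```python
from keras.layers import Conv2D, Activation
from keras.optimizers import RMSprop

def conv_block_no_bn(x, filters):
    # Suggestion (a): the usual double 3x3 conv block, but with the
    # BatchNormalization layers dropped so that very small batch sizes
    # (1 or 2) no longer destabilize training.
    x = Conv2D(filters, (3, 3), padding='same')(x)
    x = Activation('relu')(x)
    x = Conv2D(filters, (3, 3), padding='same')(x)
    x = Activation('relu')(x)
    return x

# Suggestion (b): compile with RMSprop at lr = 0.0001 instead of SGD.
model = get_unet_1024()          # hypothetical builder for the 1024 U-Net
model.compile(optimizer=RMSprop(lr=1e-4),
              loss=bce_dice_loss,     # placeholder for the repo's loss
              metrics=[dice_coeff])   # placeholder for the repo's metric
```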
cool
I got 0.996 with batch size = 2, but how can I improve further?