Spikes in training loss
I'm wondering if you have encountered this recurring spike in the training loss, and why it might occur.
The spike does appear to occur upon processing the last batch of each epoch, but I don't think it has to do with an uneven batch size, because the last batch is dropped. The data is also shuffled, so it can't be any particular examples causing this. It's not clear, though, why processing the last batch would produce these spikes.
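
For context, here is a minimal sketch of the setup I'm describing. This uses a hypothetical PyTorch-style pipeline with placeholder data and a placeholder objective, not the actual SimCLR code, and just adds per-step loss logging so the spikes can be lined up against epoch boundaries:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical stand-ins for the real SimCLR data and model, just to make
# the epoch-boundary behavior observable.
data = torch.randn(1024, 32)
dataset = TensorDataset(data)

# shuffle=True reshuffles every epoch; drop_last=True discards the final
# partial batch, matching the setup described above.
loader = DataLoader(dataset, batch_size=128, shuffle=True, drop_last=True)

model = nn.Linear(32, 16)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

global_step = 0
for epoch in range(3):
    for (batch,) in loader:
        loss = model(batch).pow(2).mean()  # placeholder objective, not NT-Xent
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        # Per-step logging: if the spikes always land on the last step of an
        # epoch, the issue is tied to epoch boundaries rather than to the data.
        print(f"epoch={epoch} step={global_step} loss={loss.item():.4f}")
        global_step += 1
```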
Thanks.
@slala2121 I'm actually experiencing the exact same behavior, and I'm also shuffling + dropping the last batch. Did you happen to figure out the reason or resolve this?