`logit_scale` and resume
I'm noticing that `logit_scale` will steeply change direction on resume. Probably not a huge deal since it stays fairly close to 100, but worth tracking this issue in case others encounter it and it's indicative of some other problem with how we clip?
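For context, here is a minimal sketch of the kind of post-step clamp being discussed. The exact call in the open_clip train loop may differ; `clamp_logit_scale` and `max_scale` are illustrative names, not the library's API:

```python
import math

import torch


def clamp_logit_scale(model: torch.nn.Module, max_scale: float = 100.0) -> None:
    """Clamp the (log-space) logit_scale parameter after an optimizer step.

    logit_scale is stored as a log value and exponentiated at loss time, so
    clamping it to [0, ln(max_scale)] keeps the effective multiplier <= max_scale.
    """
    with torch.no_grad():
        model.logit_scale.clamp_(0, math.log(max_scale))
```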

@mitchellnw that's interesting, I haven't observed that before in previous runs. I went back and checked some old resumes, and even in the overlap (where there were logs before a mid-epoch crash), the resumed part that overlaps was pretty much the same, within about +/- 0.2-0.3.
I wonder if there is something specific to G (larger model, closer to the edge of stability?). I have, though, seen sudden drops from 100 to the high 80s without any sort of resume, followed by recovery back to 100.
Closing, because the hypothesis is that this relates to a filesystem issue which should not affect most users.
Here's one hypothesis for what's going on. Look at the graphs for `logit_scale` and samples/s towards the end of training -- the dips in `logit_scale` occur towards the end of the SCI. Perhaps this is when things are "most random", because at the beginning of the SCI it's more likely that the batch consists of many images from the same shard, especially when the batch size is 160k.

Therefore, maybe on resume the amount of randomness was less, leading to these jumps up in `logit_scale` (the model wants to be more confident because it is more overfit -- it's seen batches like this before).
@mitchellnw coming back to this one, I don't feel that explanation makes sense: `logit_scale` dips should have no correlation with the end of the SCI in terms of dataset randomness.
Each dataloader worker process within each train process (1 per GPU) samples, with replacement, a shard to read from, then reads the samples from that shard and shuffles them within a smaller shuffle buffer. There should be no noteworthy difference in data distribution that relates to the position within the checkpoint interval, even if the samples across the shards weren't shuffled (I believe @rom1504 said at some point that they were).
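A rough sketch of that per-worker sampling scheme, to make the argument concrete. This is illustrative only: the actual pipeline is built on webdataset, and the names (`resample_shards`, `shuffle_buffer`, `worker_stream`, `read_shard`) and the buffer size are assumptions for the example, not open_clip's real API:

```python
import random
from typing import Any, Callable, Iterable, Iterator, List, Sequence


def resample_shards(shards: Sequence[str], rng: random.Random) -> Iterator[str]:
    # Each worker independently picks shards with replacement, forever.
    while True:
        yield rng.choice(shards)


def shuffle_buffer(samples: Iterator[Any], buffer_size: int, rng: random.Random) -> Iterator[Any]:
    # Keep a fixed-size buffer; emit a random buffered element as new ones arrive.
    buf: List[Any] = []
    for sample in samples:
        if len(buf) < buffer_size:
            buf.append(sample)
            continue
        idx = rng.randrange(len(buf))
        yield buf[idx]
        buf[idx] = sample
    # Flush whatever remains (never reached here, since shards resample forever).
    rng.shuffle(buf)
    yield from buf


def worker_stream(
    shards: Sequence[str],
    read_shard: Callable[[str], Iterable[Any]],  # placeholder shard reader
    buffer_size: int = 5000,
    seed: int = 0,
) -> Iterator[Any]:
    # One stream per dataloader worker: sample a shard with replacement,
    # read it sequentially, then shuffle within a relatively small buffer.
    rng = random.Random(seed)

    def samples() -> Iterator[Any]:
        for shard in resample_shards(shards, rng):
            yield from read_shard(shard)

    return shuffle_buffer(samples(), buffer_size, rng)
```

The point relevant to the discussion above is that neither stage keys on step count or checkpoint boundaries, so the data distribution a worker emits should not drift systematically within a checkpoint interval.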