Consider checking in CI that we don't commit large binary blobs
Hm, this PR added a 25 MB binary blob to the shared git repository. This is less than ideal: git is pretty bad at dealing with large files, in the sense that everyone now needs to download an extra 25 MB when cloning the repo. That seems like a rather large cost, considering we have to pay it indefinitely (even if we remove the file, it'll still be in the history).
I suggest being more careful with large files in the future. Perhaps we should add a CI check that there isn't anything larger than 0.5 MB in the repo?
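A check along those lines could be quite small. Here's a rough sketch (the function name, the 0.5 MB limit, and the use of a plain `find` are all illustrative; a real CI job would probably iterate over `git ls-files` instead, so only tracked files are counted):

```shell
#!/bin/sh
# Hypothetical CI guard: fail if any file in a directory tree exceeds a
# byte limit. Names and the limit are illustrative, not an agreed policy.
check_sizes() {
    # $1 = directory to scan, $2 = limit in bytes
    # `-size +Nc` matches files strictly larger than N bytes.
    big=$(find "$1" -type f -size +"$2"c)
    if [ -n "$big" ]; then
        printf 'files over size limit:\n%s\n' "$big"
        return 1
    fi
}

# Example usage in CI: flag anything over 0.5 MB (512 * 1024 bytes).
#   check_sizes . 524288 || exit 1
```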
Originally posted by @matklad in https://github.com/near/nearcore/issues/6778#issuecomment-1139689651
For the future, xz or bzip2 instead of gzip might improve the situation. Though at the end of the day, if adding a 25 MB file is the best way to test something, we shouldn't shy away from adding it (especially since 25 MB is nothing compared to other things developers need to download).
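As a quick sanity check of the compressor choice, one can compare the three on the actual blob before committing. A sketch (the fixture here is 1 MiB of zeros purely for illustration; real test data will compress far less dramatically, and the relative ranking can differ):

```shell
#!/bin/sh
# Compare gzip vs xz output sizes on a sample file. This only shows how
# to measure; it does not claim xz always wins on real data.
head -c 1048576 /dev/zero > fixture.bin
gzip -9 -c fixture.bin > fixture.bin.gz
xz -9 -c fixture.bin > fixture.bin.xz
wc -c fixture.bin fixture.bin.gz fixture.bin.xz
```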
Yeah, agreed that this shouldn't be an absolute rule. Though, in such cases, we should put considerable thought into whether this is indeed the "best way".
I wouldn't call 25 MB nothing -- a slow git clone is a rather perceptible bump for new contributors. But the reason I feel relatively strongly about this is that, as far as I know, this isn't something that can be fixed later. Repo size grows monotonically, unlike, e.g., compilation times, which can always be improved with some effort.