Hashes of 4.39.4 have changed
It looks like the hashes of the 4.39.4 release have changed. Was this a malicious act?
```
/root/python_requirements.txt (line 727)):
    Expected sha256 106caf6167c4597556b31a8d9175a3fdc0356fdcd70ab19973c3b0d4c893c461
         Expected or dba8d7cdb8e2bac1b3da28c5ed5960de09e59a2fe7e63bb73f5a59e57b0430d2
              Got        6581d950a47980cee84c3f453a366c31e30b5e508d6c9c7ece7b73924e63fe34
```
Looking at PyPI, all the .whl files were uploaded recently (May 23), but the source files have stayed the same since the release of 4.39.4 on May 10.
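For the record, this can be cross-checked against PyPI's JSON API, which reports the upload time and sha256 digest of every file in a release. A minimal sketch (assuming the API keeps its current field names):

```python
import json
import urllib.request

# PyPI's JSON API lists every file of a release with its upload time and digests.
URL = "https://pypi.org/pypi/fonttools/4.39.4/json"

with urllib.request.urlopen(URL) as resp:
    release = json.load(resp)

for dist in release["urls"]:
    print(dist["packagetype"], dist["filename"])
    print("  uploaded:", dist["upload_time_iso_8601"])
    print("  sha256:  ", dist["digests"]["sha256"])
```

If the dates above are right, this should show the wheels with a May 23 upload time while the sdist keeps its May 10 date and original digest.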
yes, I did upload new wheel files yesterday (#3116) for the latest 4.39.4 release. They were built and uploaded by Github Actions from https://github.com/fonttools/fonttools-wheels, which checks out the fonttools repository as a submodule at the latest git tag. I did not make any changes to the source distribution, so it is expected that for the current 4.39.4 release the source distribution is the same as before; only new binary wheels were uploaded. Is that undesired for some reason? I don't see where the issue is coming from. Who/what tool alerted you that this might be "malicious"?
I generate hashes for python dependencies like this:
```
pip-compile -v --generate-hashes --output-file=/tmp/python_requirements.txt --pip-args=--no-cache-dir /tmp/python_requirements.in
```
https://github.com/jazzband/pip-tools#using-hashes
That ensures I always grab exactly the dependency artifacts I expect. Every major package manager does the same: yarn.lock, package-lock.json, composer.lock, to mention just a few examples.
The pip install of the requirements alerted me that the downloaded files did not match the pinned hashes. This is how it looks in my python_requirements.txt:
```
fonttools==4.39.4 \
    --hash=sha256:106caf6167c4597556b31a8d9175a3fdc0356fdcd70ab19973c3b0d4c893c461 \
    --hash=sha256:dba8d7cdb8e2bac1b3da28c5ed5960de09e59a2fe7e63bb73f5a59e57b0430d2
    # via matplotlib
```
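For context, the check pip performs in hash-checking mode boils down to computing the sha256 of each downloaded artifact and comparing it against the pinned values. A rough sketch of that comparison (the filename below is a placeholder, not necessarily what pip actually downloaded):

```python
import hashlib
from pathlib import Path

# Hashes pinned in python_requirements.txt for fonttools==4.39.4 (sdist + pure-python wheel).
PINNED = {
    "106caf6167c4597556b31a8d9175a3fdc0356fdcd70ab19973c3b0d4c893c461",
    "dba8d7cdb8e2bac1b3da28c5ed5960de09e59a2fe7e63bb73f5a59e57b0430d2",
}

def sha256_of(path: Path) -> str:
    """Return the hex sha256 digest of a downloaded artifact."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Placeholder: whichever wheel or sdist was fetched for the pinned requirement.
artifact = Path("fonttools-4.39.4-py3-none-any.whl")
digest = sha256_of(artifact)
if digest not in PINNED:
    raise SystemExit(f"hash mismatch for {artifact.name}: got {digest}")
```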
If you change the wheel files it would make more sense to actually generate a new release, IMO.
Fair enough. I thought it'd be ok since there were no actual code changes, just a different way things get packaged. OK, you may argue that the presence of native code (the cython extensions), where previously there were only pure python modules, does warrant a version bump. I'll remove the wheel files from the current release and will make a new release soon with those. Thanks for the heads up 👍
Deleted. There is a problem though: next time I make a release there's going to be a little lag between the time the sdist and pure python wheels get uploaded (from this fonttools/fonttools repo's CI upon pushing a git tag) and the time it takes for the other wheel-builder repository to build the dozen native wheels and upload them to PyPI... It is conceivable that one may install the pure python wheel first, freeze its hash, then later on attempt to download it again and get the new native wheel with a different hash... Maybe I need to change things such that I build all wheels and sdist packages from a single CI (the new fonttools-wheels repository) instead of a couple here and the rest over there; that way I can upload everything in one go.
Hmm. I haven't published anything on PyPI before, but is there maybe a pre-release mechanism of sorts that lets you hold off on publishing the newest version, and only publish the release once all wheels are built and uploaded?
Maybe we don't need a separate repository just to have a distinct CI setup. I think I can add a new Github Actions yaml file in this same repository and set it up such that it does not run at every push but only upon git-tagged commits. That way I don't make the regular CI slower (it takes almost an hour to build all of them!). I think I'll do that before I make the next release.
A pre-release is a different release (e.g. 4.39.5b1) that pip will not download unless requested with the --pre option; you'd still have to make a new final release following it, which will be distinct from it. The problem is that I want to do the upload in one go, not at different times, so I need to do it from the same CI run instead of splitting it into two with a time lag between them. I think I know what to do, thanks.
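To illustrate the pre-release behaviour mentioned above, here is a small sketch using the packaging library (which pip relies on for PEP 440 version handling); it only shows why a pre-release such as 4.39.5b1 stays invisible to a plain pin and would still require a separate final release:

```python
from packaging.version import Version  # pip vendors this library for PEP 440 versions

beta = Version("4.39.5b1")  # a pre-release, as in the example above
final = Version("4.39.5")   # the final release that would have to follow it

# pip skips pre-releases by default; they are only considered with --pre
# (or when the requirement explicitly names a pre-release version).
print(beta.is_prerelease)   # True
print(final.is_prerelease)  # False
print(beta < final)         # True: the final release sorts after its pre-releases
```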