timezonefinder
Package C extension to wheel
Existing published wheels do not include the native C extension `inside_polygon_ext`. The default distribution can be quite slow if the library consumer does not install Clang, GCC, and ffi-dev to build the dependencies. To clarify: this is unrelated to Numba support.
Added an `auditwheel` check to the GitHub Actions workflow to check extension compatibility with the manylinux specification. It will probably need to be adjusted based on the `auditwheel` check results.
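Such a step could look roughly like this (a sketch, not the exact workflow from this PR; the step name follows the discussion below, while the wheel path and the platform tag are assumptions):

```yaml
- name: Audit wheel for manylinux compatibility
  run: |
    pip install auditwheel
    # Report which platform tag the built wheel actually qualifies for:
    auditwheel show dist/*.whl
    # Optionally retag the wheel and vendor external shared libraries into it:
    auditwheel repair dist/*.whl --plat manylinux2014_x86_64 -w wheelhouse/
```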
How to verify C extension support: `assert TimezoneFinder.using_clang_pip()`. Checked it on a local project with a private PyPI repository.
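A more generic way to check whether an installed package actually ships a compiled extension is to look for shared-object files next to its sources. The helper below is hypothetical (not part of timezonefinder's API) and only a heuristic:

```python
# Hypothetical helper: heuristically detect whether an installed package
# ships compiled extension files (.so on Linux/macOS, .pyd on Windows).
import importlib.util
import pathlib


def has_native_extension(package: str) -> bool:
    """Return True if compiled extension files sit next to the package sources."""
    spec = importlib.util.find_spec(package)
    if spec is None or spec.origin is None:
        # Package not installed, or a built-in module without a file path.
        return False
    pkg_dir = pathlib.Path(spec.origin).parent
    return any(pkg_dir.glob("**/*.so")) or any(pkg_dir.glob("**/*.pyd"))


print(has_native_extension("json"))  # False: stdlib json is pure Python
```

For timezonefinder specifically, the `TimezoneFinder.using_clang_pip()` assertion above remains the authoritative check, since it reports whether the library actually loaded the extension.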
Thanks for your contribution.
I have some questions:
- you included `*.so` files, even though only `.x` and `.h` files are under version control (https://github.com/jannikmi/timezonefinder/tree/master/timezonefinder/inside_poly_extension). Please explain how this helps. At which points are the `.so` files being created?
- in which case does the "Audit wheel for manylinux compatibility" step of the GHA workflow fail? (the `auditwheel show` command)
- is it required to run `auditwheel repair`? https://pypi.org/project/auditwheel/
I see some remaining TODOs:
- [ ] separate build test job (independent of the publish/deploy job) in the GHA workflow
- [ ] add Changelog entry
- [ ] bump package version number
Thank you for the review.
- The binary files are packaged into the resulting binary wheel during `poetry build --no-interaction` in the Deploy workflow. If the proposed string is omitted, only source files will be included.
- Speaking of `auditwheel`, it's not enough to rely only on the exit code, since the diagnostics are not machine-readable. So the repair step is usually needed.
- Since the current binary build is based on ubuntu-latest, the resulting build won't be manylinux-compatible. I've built my version against the `quay.io/pypa/manylinux2014_x86_64` Docker image, which contains many prebuilt Python interpreters, so I was able to produce per-Python-version wheels like `timezonefinder-6.5.0.post100-cp311-cp311-manylinux_2_17_x86_64.whl` for 3.11.
- Test and deployment workflows should be separated. To isolate the complexity of auditwheel, wheel repair, multiple Python versions, and auto-testing them, I suggest using the PyPA-provided solution cibuildwheel, which allows adding a GHA build step wrapped in a build matrix to automatically produce multiple binary wheels. A good example is here. I can draft such a workflow in this PR.
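The "proposed string" referred to above is not quoted in this thread. For illustration only, a Poetry `include` entry that forces prebuilt `.so` files into the binary wheel could look like this (path and exact syntax are assumptions):

```toml
[tool.poetry]
# Hypothetical: ship prebuilt shared objects in binary wheels only,
# while keeping them out of the sdist.
include = [
    { path = "timezonefinder/inside_poly_extension/*.so", format = "wheel" },
]
```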
@adamchainz @ringsaturn @joshuadavidthomas Can anyone jump in and help me out here? I am currently too busy to read into this, but I don't want to leave this valid PR pending. Thanks a lot
Thanks for reaching out, however I'm too busy to handle this. I suggest getting in touch with https://github.com/hugovk, who has helped ujson's development a lot: https://github.com/ultrajson/ultrajson/issues/343.
I can make an initial implementation of cibuildwheel next week, but it will be complicated to test it properly.
> @adamchainz @ringsaturn @joshuadavidthomas Can anyone jump in and help me out here? I am currently too busy to read into this, but I don't want to leave this valid PR pending. Thanks a lot
@jannikmi Appreciate the ask, and I wish I could lend a hand, but I have only ever worked with pure Python packages and have never written a line of C code in my life. I'm not sure how much help I can be. 🫤
> I can make an initial implementation of cibuildwheel next week, but it will be complicated to test it properly.
Thanks. I am sorry, this topic is so far from my area of expertise that I cannot be of much help.
What would you need to test the changes?
Sorry for the delay, I got sick. I added support for cibuildwheel. Now the pipeline will create an sdist (no binaries inside) and a set of binary wheels with a prebuilt clang-pip extension for each Python version. All of them are uploaded to PyPI.
I've added a temporary `verify` stage just to check what will be published to PyPI, since the actual `deploy` step is executed only for the master branch. See the experimental setup here.
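A minimal cibuildwheel job could be sketched like this (a sketch only; the action versions, job name, and OS matrix are assumptions, not the exact setup used in this PR):

```yaml
make-wheels:
  strategy:
    matrix:
      os: [ubuntu-latest, macos-latest, windows-latest]
  runs-on: ${{ matrix.os }}
  steps:
    - uses: actions/checkout@v4
    # Builds (and can test) one wheel per CPython version, using
    # manylinux images on Linux so the wheels are PyPI-compatible:
    - uses: pypa/cibuildwheel@v2.16
    - uses: actions/upload-artifact@v4
      with:
        name: wheels-${{ matrix.os }}
        path: wheelhouse/*.whl
```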
Thanks a lot for your work!
Unfortunately I had to disable the `verify` step, because it fails in PRs from fork repositories.
Any idea how to solve this? It would be nice to test the publishing steps even in "external" PRs.
> Unfortunately I had to disable the `verify` step, because it fails in PRs from fork repositories. Any idea how to solve this? It would be nice to test the publishing steps even in "external" PRs
Yes, it doesn't allow publishing even to TestPyPI from PRs. It's a proper security measure, I think.
Since we've verified that all files are gathered by the last step, you can easily drop the `verify` step; it is not needed anymore.
We need to drop "test-publish" and the dependency on this step in `needs: [test, make-wheels, make-sdist, test-publish]`.
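The dependency cleanup above might look like this in the workflow file (a sketch; the job names are taken from the discussion, the remaining keys are assumptions):

```yaml
deploy:
  # "test-publish" removed from the dependency list:
  needs: [test, make-wheels, make-sdist]
  # publish only from the master branch, never from (fork) PRs:
  if: github.ref == 'refs/heads/master'
  runs-on: ubuntu-latest
```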