Run wheel builds selectively?
Given that the wheel CI builds take a while, I wonder if we could limit how often we run these. For example:
- Only run if certain files (like `setup.py`) change
- Run only when requested (like with a PR label)
- Run only on `main` and/or tags (a rough trigger sketch follows this list)
- Test some subset of wheel builds and only test all in specific contexts
- ?
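For illustration, the `main`/tags option might look something like this for the wheel workflow's triggers (a minimal sketch; the workflow name and tag pattern are assumptions, not the current numcodecs setup):

```yaml
# Sketch: only trigger the full wheel workflow for pushes to main,
# version tags, or a manual run (names/patterns are illustrative).
name: Wheels
on:
  push:
    branches: [main]
    tags: ["v*"]
  workflow_dispatch:
```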
We need not do all or any of these. They are just some thoughts.
Curious what people think 🙂
cc @henryiii (in case you have thoughts on what others do typically or on ways we might speed up the builds)
Another option is cron, but for my part I'll add that I'm not great at keeping up with non-PR action failures. (And cronjobs get disabled after 6(?) months with no actions)
Yeah and issues can pile up if not run regularly
Just trying to find a balance, since at the moment it seems we are waiting ~2 hours for the Linux wheel build, which is kind of long
I usually only run via workflow_dispatch or {github releases|tags} (depending on the project - I like releases, but sometimes tags are needed or better). For main / stable workflows, I'll have wheel builds on stable only. I'll sometimes add a very small subset of runs (one wheel each on each OS, for example) into the normal testing workflow. Generally, wheel building is pretty expensive to serve as your main testing; traditional testing is better, and (full) wheel builds should target releases. For example, see https://github.com/scikit-hep/boost-histogram/blob/339306d388d7f112afb4e6f619c52ab8252c4f6a/.github/workflows/tests.yml#L89-L114.
And the full build is at https://github.com/scikit-hep/boost-histogram/blob/339306d388d7f112afb4e6f619c52ab8252c4f6a/.github/workflows/wheels.yml
(this is also another reason to use configuration like https://github.com/scikit-hep/boost-histogram/blob/339306d388d7f112afb4e6f619c52ab8252c4f6a/pyproject.toml#L74-L86 instead of putting everything in environment variables)
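For reference, that "small subset in the regular test workflow" pattern might look roughly like this with cibuildwheel (the job name, action versions, and wheel selection here are assumptions, not the boost-histogram or numcodecs configuration):

```yaml
# Sketch: build one wheel per OS as a smoke test inside the normal test
# workflow; the full matrix stays in the release-only wheels workflow.
build_one_wheel:
  strategy:
    matrix:
      os: [ubuntu-latest, windows-latest, macos-latest]
  runs-on: ${{ matrix.os }}
  steps:
    - uses: actions/checkout@v4
    - uses: actions/setup-python@v5
      with:
        python-version: "3.x"
    - run: python -m pip install cibuildwheel
    - run: python -m cibuildwheel --output-dir wheelhouse
      env:
        # Build only a single CPython version per platform (illustrative choice).
        CIBW_BUILD: "cp311-*"
```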
Thanks for the pointers here Henry! 😄
Looking more at the builds here, it seems we are covering architectures that probably require emulation in CI (like ppc64le and aarch64), 32-bit builds (I don't know that we have tested those on CI or otherwise outside of the wheel builds), as well as other libc implementations (like musl).
Given that the first two likely require emulation, we haven't actively tested on 32-bit, and there probably isn't much changing in Numcodecs that would be sensitive to the libc implementation, I'm wondering if we could restrict regular builds to x86_64 with glibc and only run the others when releasing. If we want additional coverage for things like ppc64le and aarch64, I would suggest adding dedicated jobs for them (outside of the wheel builds, and ideally without emulation for better overall CI runtime).
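As a sketch of what that restriction could look like with cibuildwheel (assuming the wheel job distinguishes release builds via the event type; the exact selectors are illustrative):

```yaml
# Sketch: regular runs build only glibc x86_64 wheels; the emulated and
# extra targets are enabled only for release builds.
- uses: docker/setup-qemu-action@v3
  # QEMU is only needed when the emulated architectures are built.
  if: github.event_name == 'release'
- name: Build wheels
  run: python -m cibuildwheel --output-dir wheelhouse
  env:
    CIBW_ARCHS_LINUX: ${{ github.event_name == 'release' && 'x86_64 aarch64 ppc64le' || 'x86_64' }}
    CIBW_BUILD: ${{ github.event_name == 'release' && '*' || '*-manylinux_x86_64' }}
```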
Thoughts? 🙂
One way to provide optionality here would be to allow labeling PRs, like cloudpickle does ( https://github.com/cloudpipe/cloudpickle/pull/339 ). That would let us run additional tests when relevant.
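A sketch of what that gating could look like (the label name is hypothetical):

```yaml
# Sketch: skip the wheel jobs on PRs unless a specific label is present.
# The workflow needs to listen for label events for this to re-trigger.
on:
  pull_request:
    types: [opened, synchronize, reopened, labeled]

jobs:
  build_wheels:
    if: >-
      github.event_name != 'pull_request' ||
      contains(github.event.pull_request.labels.*.name, 'ci build wheels')
```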
Going to start trimming the build matrix. Submitted PR ( https://github.com/zarr-developers/numcodecs/pull/320 ) to do that. Will evaluate based on that whether more trimming or other changes are needed.
A path check like this one in Dask would also work
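Something along these lines, assuming the wheel workflow is only relevant when build-related files change (the path list is illustrative):

```yaml
# Sketch: only trigger the wheel workflow when files that affect the
# compiled extensions change.
on:
  pull_request:
    paths:
      - "setup.py"
      - "pyproject.toml"
      - "numcodecs/**"
      - ".github/workflows/wheel.yaml"
```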