Added CI for pre-built wheels
Hi @davisking and the team,
First of all, I would like to thank you for your work on this library. It has been incredibly helpful for my projects.
However, there's one feature I feel is missing: pre-built wheels. It would be really convenient to install the dlib library without dealing with build dependencies. I’ve noticed that there are several forks or related repositories that set up CI pipelines to build wheels and upload them to PyPI under different names. The issue is that these forks are hard to maintain and often become outdated.
One such example is https://github.com/alesanfra/dlib-wheels, created by @alesanfra. I used this as a basis for my pull request, where I adapted the CI pipeline with slight modifications to make it compatible with the original dlib repository.
An additional benefit of this PR is that the CI pipeline now runs tests across different Python versions (currently 3.7 to 3.12) and various OS/architecture configurations (Linux, Windows, and macOS, with both Intel x86_64 and ARM64 builds on macOS).
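For anyone who wants to reproduce a slice of that matrix locally, a rough sketch of driving cibuildwheel from the shell could look like the following; the version selectors and the import smoke test are illustrative, not necessarily what the PR's workflow uses:

```
# Build CPython 3.7-3.12 wheels for the current platform and run a minimal
# smoke test in each build environment (on Linux this needs Docker for the
# manylinux containers). Selectors and test command are illustrative only.
python -m pip install cibuildwheel

export CIBW_BUILD="cp37-* cp38-* cp39-* cp310-* cp311-* cp312-*"
export CIBW_TEST_COMMAND='python -c "import dlib; print(dlib.__version__)"'

# Wheels are written to ./wheelhouse by default.
python -m cibuildwheel --output-dir wheelhouse
```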
The only aspect I am uncertain about is disabling GUI support (`DLIB_NO_GUI_SUPPORT=ON`). My preference is to have wheels without GUI support to reduce their size, but if you prefer to provide pre-built packages with GUI support, I can modify the setup to build wheels both with and without GUI support.
The final step, which is not included in this PR, is setting up automated publishing to upload the wheels to PyPI when a release is created. This could be added in a future PR if you decide to further automate the process.
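To make that future step concrete, a minimal sketch of the upload could look like the snippet below; it assumes the wheels from all CI jobs have been collected into a single directory and that a PyPI token is provided as `PYPI_API_TOKEN` (both are assumptions, not part of this PR):

```
# Hypothetical publish step, run only when a release tag is created.
python -m pip install build twine

# Add a source distribution next to the collected wheels.
python -m build --sdist --outdir wheelhouse .

# Upload everything with an API token; PYPI_API_TOKEN is a placeholder name.
python -m twine upload --non-interactive -u __token__ -p "$PYPI_API_TOKEN" wheelhouse/*
```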
Thank you in advance for your feedback and comments!
We can do this, but I just don't want to be the one to deal with it :)
So I would put it in a separate repository. There are just too many build options for people's platforms. Lots of people use the gui stuff, or use AVX or NEON instructions, or use cuda. Probably most dlib users use one of those and these binaries don't include those features. It would just turn into being asked for those too.
This is why I don't quite understand how Conan can be maximally useful. Like the default builds of ffmpeg or onnxruntime on there are never gonna be what I want.
Hi everybody,
As mentioned by @SfinxCZ I'm the maintainer of the repo https://github.com/alesanfra/dlib-wheels, which builds this distribution of dlib: https://pypi.org/project/dlib-bin.
We had a similar discussion a few years ago; you can find it in PR #2338, where we agreed to keep the repos separate. If anyone is interested in building a different set of wheels, I'm more than happy to accept PRs. Also, if anyone is interested in helping me maintain that repo in order to have faster releases, I'm open to accepting new maintainers; just reach out to me privately.
Sweet, that's the best way :)
Like one thing that is annoying is you can't do `pip install dlib` and have that grab precompiled binaries, since most dlib users really do want some of these special options. And if `pip install dlib` did that, the main dlib project would be flooded with people asking stuff like "why isn't dlib using the GPU?" or "why is it so slow?" etc.
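For context, the build most of those users end up doing is a from-source one where they switch on the features for their machine; the flags below are existing dlib CMake options, shown only as a rough example:

```
# Typical source build with hardware-specific features enabled.
# DLIB_USE_CUDA needs a CUDA toolkit and cuDNN, USE_AVX_INSTRUCTIONS needs an
# x86 CPU with AVX (USE_NEON_INSTRUCTIONS is the ARM counterpart), and the
# GUI stays enabled here by leaving DLIB_NO_GUI_SUPPORT off.
git clone https://github.com/davisking/dlib.git
cd dlib && mkdir build && cd build
cmake .. -DDLIB_USE_CUDA=1 -DUSE_AVX_INSTRUCTIONS=1 -DDLIB_NO_GUI_SUPPORT=0
cmake --build . --config Release
```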
Ah, I missed that issue—I'll be more thorough next time. :)
If you prefer to keep the two repos separate, that's perfectly fine. However, before you decide to close this PR, I'd like to share a few comments and suggestions.
As I mentioned in my initial comment, the benefits of this change extend beyond building binary wheels for distribution via PyPI. Another key advantage is that the pipeline would run tests in different environments and on various architectures. I've noticed several issues related to running the dlib package on macOS, as well as other challenges. While it's likely that some issues will still arise even with this pipeline in place, having an additional step to verify that the code works across all supported OSes and architectures could significantly enhance the user experience and reduce installation friction.
Additionally, if you have a similar or identical pipeline in both this repo and the dlib-bin repo, you'll be able to immediately identify if an issue in one pipeline (such as changes to CMake, adding new parameters, or refactoring setup.py) affects the other. This would allow you to address such issues right away.
In essence, this approach would make binary distribution a first-class citizen, even if `pip install dlib` continues to install from the source code.
Here are my humble suggestions:
- Add this pipeline to the `dlib` repo.
- @alesanfra could make the `dlib-bin` PyPI project accessible from this repository.
- I can add an additional step to build the binary wheels with the correct `dlib-bin` naming conventions (I would likely reuse some code from @alesanfra's repo).
- You would have both the source code with `dlib` and the binary wheels with `dlib-bin` available for every release. If you'd like to automate further, I can implement changes to upload these files directly to PyPI when a release tag is created.
- Mention `dlib-bin` as the official binary repo in the README file.
This way, you could retain the `pip install dlib` command as is, while ensuring that `pip install dlib-bin` remains up to date.
Testing on more platforms is fine. I just don't like how complicated and magical looking it is though. E.g.

```
MATRIX_INCLUDE=$(
  {
    cibuildwheel --print-build-identifiers --platform linux --arch x86_64 | grep cp | jq -nRc '{"only": inputs, "os": "ubuntu-latest"}' \
    && cibuildwheel --print-build-identifiers --platform macos --arch x86_64 | grep cp | jq -nRc '{"only": inputs, "os": "macos-13"}' \
    && cibuildwheel --print-build-identifiers --platform macos --arch arm64 | grep cp | jq -nRc '{"only": inputs, "os": "macos-14"}' \
    && cibuildwheel --print-build-identifiers --platform windows --arch AMD64 | grep cp | jq -nRc '{"only": inputs, "os": "windows-latest"}'
  } | jq -sc
)
```

is a bunch of string parsing that depends on particular string outputs that inevitably change. And then it falls on me to puzzle out why some dependency changed and how to fix the CI.
So if this stuff can be done with simple explicit clear commands, like the current `build_python.yml` that would be better. Also makes it more clear what it is doing.
Also don't disable the gui :)
I've changed the matrix definition. It's now defined explicitly instead of being generated by the cibuildwheel command.
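As an illustration only (not the exact change in the PR), an explicit matrix boils down to hard-coding the handful of build-identifier/runner pairs instead of parsing cibuildwheel's output:

```
# Hypothetical explicit matrix: the same information the jq pipeline produced,
# written out by hand so it is obvious what runs where. Identifiers follow
# cibuildwheel's naming; only the CPython 3.12 entries are shown.
MATRIX_INCLUDE='[
  {"only": "cp312-manylinux_x86_64", "os": "ubuntu-latest"},
  {"only": "cp312-macosx_x86_64",    "os": "macos-13"},
  {"only": "cp312-macosx_arm64",     "os": "macos-14"},
  {"only": "cp312-win_amd64",        "os": "windows-latest"}
]'
```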
One more question regarding the GUI: if someone does not want to use the GUI functionality, is it still possible to install and import the dlib library in environments where the GUI dependencies are not installed? E.g. a Docker image without a GUI.
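One way to check that empirically is a quick container smoke test; this is just a sketch, and the base image, Python version, and availability of a matching `dlib-bin` wheel are assumptions:

```
# Install a prebuilt wheel in a minimal, GUI-less image and try the import.
# If no matching wheel exists, pip falls back to a source build, which this
# slim image cannot do, so the command would fail for that reason instead.
docker run --rm python:3.12-slim sh -c \
    'pip install --quiet dlib-bin && python -c "import dlib; print(dlib.__version__)"'
```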
Warning: this issue has been inactive for 40 days and will be automatically closed on 2024-11-14 if there is no further activity.
If you are waiting for a response but haven't received one it's possible your question is somehow inappropriate. E.g. it is off topic, you didn't follow the issue submission instructions, or your question is easily answerable by reading the FAQ, dlib's official compilation instructions, dlib's API documentation, or a Google search.
Notice: this issue has been closed because it has been inactive for 45 days. You may reopen this issue if it has been closed in error.