Bump GitHub Action pypa/cibuildwheel
This should help fix today's CI failures.
Edit: Actually, it doesn't help. Quay is currently down with 502 errors (confirmed by status.redhat.com).
Still, it's a good idea to update!
Unlike the previous version, the new version of pypa/cibuildwheel attempts to build free-threading wheels, and fails. We do need to fix that failure, but for now let's disable free threading.
I created issue #212 to address free threading compatibility issues.
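For reference, a sketch of how the free-threaded builds could be skipped in the workflow for now (the `cp313t-*`/`cp314t-*` patterns are cibuildwheel's build identifiers for free-threaded CPython; check them against the cibuildwheel version in use):

```yaml
# In the cibuildwheel step: skip free-threaded build identifiers.
env:
  CIBW_SKIP: "cp313t-* cp314t-*"
```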
It looks like the pkg-config executable is expected and missing on Windows. Not sure how this wasn't the case before...
After bumping cibuildwheel, CMake cannot find the pkg-config executable on Windows:
CMake Error at C:/Program Files/CMake/share/cmake-4.1/Modules/FindPackageHandleStandardArgs.cmake:227 (message):
  Could NOT find PkgConfig (missing: PKG_CONFIG_EXECUTABLE)
Call Stack (most recent call first):
  C:/Program Files/CMake/share/cmake-4.1/Modules/FindPackageHandleStandardArgs.cmake:591 (_FPHSA_FAILURE_MESSAGE)
  C:/Program Files/CMake/share/cmake-4.1/Modules/FindPkgConfig.cmake:108 (find_package_handle_standard_args)
  CMakeLists.txt:35 (find_package)
I cannot find a way to fix this error, even after finding these:
- Error:Could NOT find PkgConfig (missing: PKG_CONFIG_EXECUTABLE)
- How to install pkg config in windows?
This might be an issue with scikit-build-core.
The pkg-config installation did not seem to be picked up
Here we go!
Running before_all...
+ choco install -y --no-progress --no-color cmake>=3.28
It seems the build.yml file that you were editing does not correspond to the shared_build that is being run there.
As for whether it is needed or not: it seems PKG_CONFIG_PATH is already populated, pointing to the Python library. Maybe check whether pkg-config is installed there, or whether you should be installing it there instead of via choco.
I understand that the pkg-config executable itself is missing.
I can see that PKG_CONFIG_PATH is defined and points to the location of Python .pc files (C:\hostedtoolcache\windows\Python\3.12.10\x64/lib/pkgconfig) but that environment variable doesn't seem related to the availability of the pkg-config executable.
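To illustrate the distinction (a Python sketch, not project code): PKG_CONFIG_PATH only tells pkg-config where to look for .pc files, while the pkg-config *executable* itself is located via a PATH-style search, which `shutil.which` mimics.

```python
# Sketch: PKG_CONFIG_PATH has no influence on whether the pkg-config
# executable is found; only the directories on the search path matter.
import os
import shutil
import stat
import tempfile

def locate_executable(name, path_dirs):
    """Mimic executable lookup: search only the given directories."""
    return shutil.which(name, path=os.pathsep.join(path_dirs))

with tempfile.TemporaryDirectory() as d:
    # Create a fake pkg-config executable in a temporary directory.
    exe = os.path.join(d, "pkg-config")
    with open(exe, "w") as f:
        f.write("#!/bin/sh\necho 1.9.5\n")
    os.chmod(exe, os.stat(exe).st_mode | stat.S_IXUSR)
    # Setting PKG_CONFIG_PATH does not make the executable discoverable...
    os.environ["PKG_CONFIG_PATH"] = "/some/prefix/lib/pkgconfig"
    assert locate_executable("pkg-config", ["/nonexistent"]) is None
    # ...only having its directory on the search path does.
    assert locate_executable("pkg-config", [d]) == exe
```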
I'll try build.yml → verify_shared_build.yml as suggested.
I can see that PKG_CONFIG_PATH is defined and points to the location of Python .pc files
Ah right, I confused it with the PKG_CONFIG_EXECUTABLE. Hmm, indeed making that point to the build-isolated path can be quite tricky. One thing that would help is for pkgconfig to expose a project.entry-points."cmake.root" with entry point name PkgConfig (case-sensitive) so that PkgConfig_ROOT is populated with an appropriate hint that could be used.
Otherwise constructing the appropriate hint could be quite tricky, at which point using the "system" pkg-config would be more reliable.
From the scikit-build-core perspective we cannot do much other than hard-coding a hint for that package just in case, which may not be a bad idea. @henryiii you reckon there would be downsides to doing so?
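As a sketch of the entry-point idea above (the group name follows the comment's `cmake.root` suggestion; the module value is an assumption about how pkgconf lays out its files):

```toml
# In pkgconf's (or pkgconfig's) pyproject.toml: expose an entry point in the
# "cmake.root" group so that PkgConfig_ROOT can be populated as a hint.
[project.entry-points."cmake.root"]
PkgConfig = "pkgconf"  # assumed: the module whose install dir provides the hint
```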
hard-coding a hint for that package
Just to be clear, are you referring to the build dependency https://pypi.org/project/pkgconf/?
Originally pkgconfig, but the same applies to pkgconf. In that case it would be much easier to construct the appropriate hint, since everything is under site-packages.
Thanks! I am unfamiliar, do you mind explaining what exactly needs to be done to provide that hint?
@DimitriPapadopoulos I thought of an idea when reading Cristian's comment about how the issue is pip's pseudo-virtual environment used for builds (which the maintainers want to change but don't have time to). Can you please try setting the build-frontend option to build[uv]? https://cibuildwheel.pypa.io/en/stable/options/#build-frontend
I understand that for Conda (if the issue exists there?) we might not want to build with UV but we can at least prevent the main pipeline from having hacks. What do you think?
Here's an example of how to integrate UV into the workflow https://github.com/pypa/cibuildwheel/pull/2630
Tried your suggestion. I don't know enough about this part of the build process to be useful.
Very close! Like I wrote in the example I linked, I think you just need this part i.e. install UV on macOS and Windows:
- name: Install UV
  if: runner.os != 'Linux'
  uses: astral-sh/setup-uv@v7
Try to add SKBUILD_LOGGING_LEVEL: "DEBUG" env var to get some more debug logs
Could it be that this needs to be updated to test the CMake PKG_CONFIG_EXECUTABLE variable:
cmake/SetSystemLibIfExists.cmake
function (SetSystemLibIfExists)
  set(ENV{PKG_CONFIG_PATH} "$ENV{PKG_CONFIG_PATH};$ENV{CONDA_PREFIX}/Library/lib/pkgconfig;$ENV{CONDA_PREFIX}/lib/pkgconfig")
  if(CMAKE_SYSTEM_NAME STREQUAL "Windows")
    set(PKG_CONFIG_EXECUTABLE "${PKG_CONFIG_EXECUTABLE};--msvc-syntax;--dont-define-prefix")
  endif()
  pkg_check_modules(VENDORED_AS_SYSTEM_LIB IMPORTED_TARGET GLOBAL ${VENDORED_LIBRARY_PKG_CONFIG}>=${VENDORED_LIBRARY_PKG_CONFIG_VERSION})
endfunction()
This variable should be set by the FindPkgConfig module, and I did use find_package(PkgConfig REQUIRED) - getting old is no fun
Argh, Windows is always a challenge. The CMAKE_PREFIX_PATH is as we would expect.
2025-10-26 11:17:54,917 - scikit_build_core - DEBUG - SITE_PACKAGES: C:\Users\runneradmin\AppData\Local\Temp\build-env-9dei9xy0\Lib\site-packages
But the executables are installed in Scripts instead of bin so they are not discoverable
set(Python_EXECUTABLE [===[C:/Users/runneradmin/AppData/Local/Temp/build-env-9dei9xy0/Scripts/python.exe]===] CACHE PATH "" FORCE)
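The Scripts-vs-bin split can be checked from Python itself (a small diagnostic sketch; `sysconfig` reports the layout of the running interpreter):

```python
# Sketch: console-script executables land in <prefix>/Scripts on Windows
# and <prefix>/bin on POSIX; sysconfig exposes the directory in use.
import os
import sysconfig

scripts_dir = sysconfig.get_path("scripts")
print(scripts_dir)                     # e.g. .../Scripts on Windows
print(os.path.basename(scripts_dir))   # "Scripts" on Windows, "bin" on POSIX
```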
Best option is to fix it in pkg-config package. You could add a workaround like
if(SKBUILD AND WIN32)
  foreach(prefix IN LISTS CMAKE_PREFIX_PATH)
    cmake_path(APPEND prefix "Scripts" "pkg-config.exe" OUTPUT_VARIABLE test_pkg_config)
    if(EXISTS "${test_pkg_config}")
      set(PKG_CONFIG_EXECUTABLE "${test_pkg_config}")
      break()
    endif()
  endforeach()
endif()
I don't know enough about scikit-build-core or CMake to find where to add the above code.
It might be easier to fix pkg-config :smile: Where is pkg-config downloaded from? conda-forge? Which exact pkg-config package are we talking about?
I don't know enough about scikit-build-core or CMake to find where to add the above code.
The snippet above can fit just before the find_package(PkgConfig).
It might be easier to fix pkg-config 😄 Where is pkg-config downloaded from?
Seems like I should have said pkgconf, I am bound to confuse them.
But as I think about this again, it does not really make sense why it failed when you choco install it. Are the choco binaries not installed somewhere on PATH? Maybe you could test it outside CMake with a plain pkg-config call.
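Testing it outside CMake could be as simple as the following sketch for the CI step (this assumes bash, which GitHub Actions runners provide even on Windows):

```shell
# Check whether pkg-config is discoverable via PATH, independent of CMake.
if command -v pkg-config >/dev/null 2>&1; then
  echo "found: $(command -v pkg-config)"
  pkg-config --version
else
  echo "pkg-config is not on PATH"
fi
```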
And then there is another mystery, is the find_package(PkgConfig) even needed to be a requirement? Afaict it is used to either pick up a system library or bundle a library, but I do not see the instructions that would actually install the devel files that pkg-config would look for. Am I missing something here @ofek?
@LecrisUT Full transparency: @MementoRC rewrote the build system to be far more maintainable but I am unfamiliar with CMake and haven't had time to read the learning resources that @henryiii shared. I have almost no knowledge of how builds work here off the top of my head without looking deeply.
I don't know enough about scikit-build-core or CMake to find where to add the above code.
You could also combine it in: cmake/SetSystemLibIfExists.cmake, I think this is the only place we reference the missing variable on windows
but I do not see the instructions that would actually install the devel files that pkg-config would look for
The whole verify_shared flow was added because in conda-forge we use the C libsecp256k1 conda package that does have its xxx.pc file and we skip vendoring that C lib within the coincurve conda package.
Did I understand your question correctly? I am not an expert on CMake either; I proceeded by trial and error, and once I got something working I cleaned it up and opened a PR. There are certainly many opportunities to improve.
On another note, the shared flow is not necessary for this repo/artifacts, but we opted to add it to preempt potential issues with the conda-forge recipe (I was indeed more focused on coincurve as a conda package)
You could also combine it in:
cmake/SetSystemLibIfExists.cmake
Not quite. It must be before the find_package(PkgConfig), but that could be moved there also.
The whole verify_shared flow was added because in conda-forge we use the C libsecp256k1 conda package that does have its xxx.pc file and we skip vendoring that C lib within the coincurve conda package.
Ok, that explains verify_conda_build, but not verify_shared_build. If that is meant to test with a system dependency outside of the conda ecosystem, then I recommend using the OS-native package manager (or the closest there is), i.e. apt/dnf install xxx-devel for Linux, Homebrew for macOS, and vcpkg for Windows. Then you do not have to do this dance, as CMake actually tends to take at least these special environments into account.
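Sketched as workflow steps (the package names here are assumptions that would need checking against each package manager's index):

```yaml
- name: Install libsecp256k1 (Linux)
  if: runner.os == 'Linux'
  run: sudo apt-get update && sudo apt-get install -y libsecp256k1-dev
- name: Install libsecp256k1 (macOS)
  if: runner.os == 'macOS'
  run: brew install secp256k1
- name: Install libsecp256k1 (Windows)
  if: runner.os == 'Windows'
  run: vcpkg install secp256k1
```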
But it would probably make more sense for now to skip these CI jobs and take a crack at fixing them in a different PR, since this has got quite off-topic from the original PR title.
And then there is another mystery, is the find_package(PkgConfig) even needed to be a requirement
Do you mean that we only need to execute pkg-config.exe, and thus could simply find it on the path since it is installed via choco? Correct, but I think that the issue may arise when we build coincurve on conda-forge - I don't remember
but I think that the issue may arise when we build coincurve on conda-forge - I don't remember
But conda CI has a whole different workflow file that does not fail or use choco or such. It does not cover macOS or Windows, but it would be better to move them to that file instead, in order to get a more accurate build that matches the conda environment, i.e. using the packages from conda, not from choco or from pip.
To that end, cibuildwheel would not be a good tool to test the conda integration, because it uses the PyPI packages, not the conda ones. To be absolutely compatible, you would need to duplicate build-system.requires into the conda environment file. I have mostly given up on conda packaging apart from maintaining one package, but have they not provided any tools to run conda-forge CI for upstream? Even Homebrew provides that functionality these days.
@LecrisUT First, correct, I mixed-up the verify_conda and verify_shared. The latter is effectively to verify that if a user has a custom dynamic libsecp256k1 installed, coincurve works well with it
Agreed, we don't use cibuildwheel in the verify_conda_build:
uses: conda-incubator/setup-miniconda@v3
The whole conda surfaced because of my mix-up, I guess we don't have to consider it
I agree that the verify_shared CI should be skipped for Windows to prioritize completing the current PR - it is not needed for conda (though find_package(PkgConfig) maybe is, but we use pkg-config rather than pkgconf)
dependencies:
- cffi >=1.3.0
- cmake
- libsecp256k1
- pkg-config
I could investigate this in the recipe when I get around to it.
I don't understand what you mean by "conda-forge CI for upstream".
There are scripts in conda-forge that make the build differ between the upstream build instructions and the downstream ones. There are automations that allow you to instead mimic the flow one-to-one; Homebrew provides it via their GitHub action and you just need an equivalent recipe.rb file, and other environments do the same, like Fedora, where I am mostly involved.
We build the package using pip install and run the tests on the package installed into a minimal test environment
Note that building with --no-build-isolation is crucial if there are differences between PyPI packages and conda ones. Most pure-Python packages would not have such a difference, but compiled ones do, and cffi is one of them.
The latter is effectively to verify that if a user has a custom dynamic libsecp256k1 installed, coincurve works well with it
Well, it is more complicated. What it is installed by makes a difference. That project can be built either via CMake or via Autotools, so consumers should try to pick up both methods in order. The best thing is to just try to build from the major package managers and see what they provide and in what form (shared/static library, for example). Stuff like choco I do not have experience with, but I do not believe it has surpassed vcpkg as the default source of Windows dependencies, and similarly for mingw and I forget another one. If you are packaged in those environments, that changes the story though.