pyyaml
platform-independent wheels
Related to #43. A platform-independent pure-Python wheel on PyPI would be very useful for dependency resolution in Linux projects.
PyYAML-?.??-py35-none-any.whl
Ultimately a manylinux wheel using libyaml would be ideal, but I understand the complications so this might be a fair compromise that's simple to implement.
I concur, it would be great to have a platform-agnostic wheel as a fallback when no platform-specific wheel is available.
c.f. https://github.com/yaml/pyyaml/issues/43
The shared library is a binary compiled against a specific architecture and Python version; there's no standard for shipping all of them in one wheel. That's why manylinux1 wheels exist: the project ships a whole bunch of them and the client (pip) selects which one to download and install.
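To illustrate the tag-matching mechanism described above: per PEP 427, a wheel filename encodes compatibility tags that pip matches against the running interpreter and platform. A minimal sketch (the helper name is made up for illustration):

```python
# Sketch: pulling the compatibility tags out of a wheel filename (PEP 427).
# Filename format: name-version[-build]-python_tag-abi_tag-platform_tag.whl
def parse_wheel_tags(filename):
    stem = filename[: -len(".whl")]
    python_tag, abi_tag, platform_tag = stem.split("-")[-3:]
    return python_tag, abi_tag, platform_tag

# A pure-Python wheel works on any interpreter ABI and platform:
print(parse_wheel_tags("PyYAML-5.3.1-py3-none-any.whl"))
# → ('py3', 'none', 'any')

# A manylinux wheel is pinned to a CPython version and architecture:
print(parse_wheel_tags("PyYAML-5.3.1-cp38-cp38-manylinux1_x86_64.whl"))
# → ('cp38', 'cp38', 'manylinux1_x86_64')
```

pip only installs a wheel whose three tags all match the target environment, which is why a `py3-none-any` wheel would act as a universal fallback.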
There's also the sdist (source distribution): a .tar.gz-packed source tree with build scripts and some metadata. It's not pre-compiled at all.
So your request is either duplicate of #43 or I don't understand what you want.
"generic" wheels are only shipped for projects with no binary dependencies, which is not pyyaml's case at all.
I think what's being asked here is to produce generic wheels that don't use the LibYAML bindings and are therefore not platform-specific, as a fallback for when no more specific (e.g. manylinux) wheels are available. This would be useful for packaging tools and distribution pipelines that only support wheels and can't build source distributions.
Oh, so pure-python implementation, then? This would probably not make sense once manylinux1 wheels are out...
I think it would still make sense because not every possible platform is covered even when manylinux wheels are available. So the generic wheels would act as a fallback.
@webknjaz to elaborate on @rdb's point: if --without-libyaml is used (and you do not have Cython installed when building), the result is a pure-Python wheel.
$ pyenv local 3.8.3
$ python3 -m venv venv && . ./venv/bin/activate
$ python -m pip install -U wheel setuptools
$ python setup.py --without-libyaml -q bdist_wheel
$ ls -1 dist
PyYAML-5.3.1-py3-none-any.whl
Note that this is not a py2.py3 universal wheel because of the existence of lib/ versus lib3, so the result would be two pure-Python wheels.
See some comments from https://github.com/yaml/pyyaml/pull/407#issuecomment-639217354 on this - while it is possible to build a pure-Python wheel with --without-libyaml bdist_wheel, I'm wondering if that would have the unintended side effect of users unknowingly pulling down the pure-Python wheel from PyPI when they had previously fetched the sdist and built with libyaml bindings by default.
@bsolomon1124 it might work if all the pure-Python bits go into a yaml-slow package, with yaml depending on yaml-slow and adding only the binary-accelerated parts.
In pyodide you can micropip.install() a package only if it has a pure python wheel.
I want to install a package that requires pyyaml and am stumbling over the fact that pyyaml does not have a pure-Python wheel available for download on PyPI.
I think most of us are still of the opinion that the presence of a pure-Python wheel under the pyyaml package on PyPI causes many more problems than it solves. Users on platforms without binary wheels that have forever silently built the libyaml extension from the sdist would suddenly and silently be forced onto a slow pure-Python version on their next upgrade, with the requisite performance penalties and/or import failures, depending on how "hard" they depend on the extension.
I'd argue that splitting the package into separate Python + extension packages doesn't really solve the problem either; it's a matter of deps and discoverability, and a whole lot of new added complexity with two very tightly-coupled packages that arguably aren't very discoverable. Someone's getting hosed depending on what the dependent package's requirements specify, and whether pyyaml defaults to libyaml or not; any existing package that just depends on pyyaml isn't going to "just work" under something like pyodide, since someone would have to know to switch the dependency to the pure-Python subpackage. If we went the other way, anyone that didn't explicitly install pyyaml_fast is now slow and/or broken. A completely standalone pure-python-only package would solve some of those problems, but introduces others, since Python packaging doesn't really handle top-level package conflicts (i.e., now we have two things providing a yaml top-level package; path ordering becomes a complicating factor, bleh).
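To make the path-ordering hazard above concrete: Python imports the first matching top-level package it finds on sys.path, so if two distributions both provide `yaml`, whichever install location sorts earlier silently shadows the other. A standalone sketch using a throwaway package name (`demo_pkg` is hypothetical):

```python
import pathlib
import sys
import tempfile

# Simulate two installs that both provide the same top-level package.
tmp = pathlib.Path(tempfile.mkdtemp())
for site, marker in [("site_a", "A"), ("site_b", "B")]:
    pkg = tmp / site / "demo_pkg"
    pkg.mkdir(parents=True)
    (pkg / "__init__.py").write_text(f"MARKER = {marker!r}\n")

# sys.path ordering decides which copy is imported; the other is
# silently shadowed, even if it's newer or has the C extension.
sys.path.insert(0, str(tmp / "site_b"))
sys.path.insert(0, str(tmp / "site_a"))  # earlier entry wins

import demo_pkg
print(demo_pkg.MARKER)  # → 'A'
```

With real packages the "losing" copy isn't even an error, which is what makes these conflicts so hard to diagnose.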
The needs for pure-Python wheels in pyyaml have so far been niche enough that it seems like the problem needs to be solved elsewhere (private index, upstream fixes to those envs to allow install from an sdist, whatever)- the likelihood of breaking a huge swath of users is just too great to accommodate it on PyPI IMO.
Recently Twisted split their binary and pure-Python wheels, and the feedback has been great so far.
Give it a few releases for breaking API/ABI changes on the native extensions and people that, e.g., have the pure-Python part installed in an OS package and the native part installed with --user; there's a special hell to sorting out mismatched bits there, and making the runtime resilient to it. We had similar problems with pyyaml recently when we moved the top-level _yaml package that hosts the extension into a subpackage; anyone that was doing import _yaml to probe for the presence of the extension was now able to pick up a stale version of the old top-level package and extension from another place on their path that tries to load the new version of the pure-Python bits.
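For what it's worth, a probe that asks the yaml package itself avoids the stale-`_yaml` trap described above: PyYAML sets the `yaml.__with_libyaml__` flag at import time. A hedged sketch (the function name is made up, and it degrades gracefully when pyyaml isn't installed at all):

```python
def have_libyaml():
    """Return True if PyYAML is installed and was built with the
    libyaml C extension, False otherwise."""
    try:
        import yaml
    except ImportError:
        return False
    # PyYAML sets this flag itself, so this keeps working even if the
    # extension module moves again; safer than "import _yaml" probes.
    return bool(getattr(yaml, "__with_libyaml__", False))

print(have_libyaml())
```

Callers can then pick `yaml.CSafeLoader` or `yaml.SafeLoader` based on the result instead of importing the extension module directly.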
I'm not saying it's impossible, but the !/$ factor seems awfully low for the added complexity to build/test/release and the new potential runtime hassles it exposes (especially since there are no $ involved and all of us are working on this out of selfish desire to keep it working for our own needs ;) ).
How about trialing the pattern in a wholly new project namespace like psycopg3 are doing?
I use yaml for config files mostly. Not performance critical at all. Maybe json is a better choice. And json is in Python's standard library. Added bonus: good documentation. When I look at https://pyyaml.org/wiki/PyYAMLDocumentation occasionally, I'm bewildered.