pygmo2
The PyPI site doesn't contain the source tarball
https://pypi.org/project/pygmo/#files
I am not familiar with PyPI best practices. Are we supposed to bundle the source within the existing binary wheel or should we have a separate package for the source?
No, there should be a regular source tarball uploaded there. You can take a look at how other packages there do it.
I do not understand, though: what should the tarball contain?
The tarball should contain the same directory that has setup.py at the top.
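That layout is easy to see by building an sdist for a throwaway package (the names here are placeholders, not pygmo's real layout):

```python
# Build an sdist for a minimal dummy package and list its contents, to show
# that the tarball's top-level directory is exactly the one holding setup.py.
import os, subprocess, sys, tarfile, tempfile

tmp = tempfile.mkdtemp()
pkg = os.path.join(tmp, "demo")
os.makedirs(pkg)
with open(os.path.join(pkg, "setup.py"), "w") as f:
    f.write("from setuptools import setup\nsetup(name='demo', version='0.1')\n")

subprocess.run([sys.executable, "setup.py", "sdist"], cwd=pkg, check=True)
sdist = os.path.join(pkg, "dist", "demo-0.1.tar.gz")
# Entries look like "demo-0.1/setup.py", "demo-0.1/PKG-INFO", ...
print(tarfile.open(sdist).getnames())
```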
I do not think this model applies to pygmo. pygmo is a set of bindings for a C++ project with many third-party libraries linked to it, and the setup.py (wheel_setup.py) lives here, together with the various CI files and tools: https://github.com/esa/pygmo2/tree/master/tools
It would not make much sense to tarball this directory.
Nor would it make sense to tarball the directory structure that is created during the various CI runs and that ultimately produces the wheel: that structure is platform dependent and, without the various libraries (.so, .dll, .dylib) used by pygmo, not very useful.
The pagmo project is pre-installed; there is no need to bundle it.
But I am not insisting; this isn't too important. Please close it if you see fit.
Closing this, as the pybind11 library (.so) cannot be packaged as source, as I understand this suggestion. A binary wheel must be created instead, as is done now; see #117.
Closing this, as the pybind11 library (.so) cannot be packaged as source, as I understand this suggestion.
The pybind11 library (.so) is pre-installed and is not meant to be packaged by pygmo2.
I don't understand how this prevents tarball distribution.
If we also run python setup.py sdist, the tarball will contain a pygmo.so compiled for one specific architecture only and linked against specific library names (and versions).
When a user then uses such a tarball on their system, the library will likely be incompatible, as the architecture may differ. And even if we produced one tarball per architecture, the libraries that pygmo.so links against would have different versions on the final host system.
This is in fact the issue solved by producing binary wheels and repairing them, so that all libraries are also packaged inside the binary file and are guaranteed to be compatible.
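The architecture dependence is visible directly in the extension-module suffix Python expects, which encodes both the interpreter and the platform (a quick illustration):

```python
import sysconfig

# The suffix every compiled extension module must carry on this interpreter,
# e.g. ".cpython-311-x86_64-linux-gnu.so" on Linux or ".cp311-win_amd64.pyd"
# on Windows. A pygmo .so built on one platform is not even importable on
# another, which is why a tarball containing binaries is not portable.
suffix = sysconfig.get_config_var("EXT_SUFFIX")
print(suffix)
```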
@darioizzo The source tarball is not supposed to contain any binaries. It should only contain the code that is sufficient to rebuild binaries, not binaries themselves.
Ok, but is that not already available on GitHub, like here: https://github.com/esa/pygmo2/releases ? I might be confused, but are you suggesting to also put that tarball on PyPI?
That's what most projects are doing, for example https://pypi.org/project/pybind11/#files
https://github.com/esa/pygmo2/releases is auto-generated from git on the fly by GitHub.
I am not familiar with best practices on PyPI, so I trust you there. So it would be enough (during a release build) to tar the cloned pygmo2 repo and then upload it via twine to PyPI?
If you confirm, @yurivict, we can easily do that for the next releases.
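If that is the plan, the release step would be little more than this sketch (the directory, version number, and file contents are placeholders; in the real CI the directory would come from a fresh git clone of the pygmo2 repo):

```python
# Sketch of the proposed release step: tar the cloned repository, then upload
# the tarball with twine. A stub directory stands in for the clone here.
import os, tarfile, tempfile

version = "2.19.0"                       # hypothetical release version
tmp = tempfile.mkdtemp()
src = os.path.join(tmp, f"pygmo2-{version}")
os.makedirs(src)                         # stands in for the cloned repository
with open(os.path.join(src, "setup.py"), "w") as f:
    f.write("# placeholder\n")

out = os.path.join(tmp, f"pygmo2-{version}.tar.gz")
with tarfile.open(out, "w:gz") as tf:
    tf.add(src, arcname=f"pygmo2-{version}")
# The upload itself would then be:  twine upload pygmo2-2.19.0.tar.gz
print(tarfile.open(out).getnames())
```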
@yurivict in order to produce working sdist packages we need to use something like scikit-build (https://scikit-build.readthedocs.io/en/latest/index.html), with which I had some success in other CMake-based projects. Unfortunately, with the pending removal of distutils in Python 3.12, scikit-build is being deprecated in favour of https://github.com/scikit-build/scikit-build-core, which however is not ready yet. Thus, we plan to add sdist support when the waters have calmed a bit.
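Once scikit-build-core stabilizes, the sdist setup would boil down to something like the following pyproject.toml. This is only a guess at the eventual configuration (the version number is a placeholder), not what pygmo2 ships today:

```toml
# Hypothetical pyproject.toml using scikit-build-core as the build backend:
# pip would then run CMake locally when installing from the sdist.
[build-system]
requires = ["scikit-build-core"]
build-backend = "scikit_build_core.build"

[project]
name = "pygmo"
version = "2.19.0"   # placeholder
```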
If you confirm, @yurivict we can easily do that for the next releases .....
Yes, a simple tarball of the cloned directory is what people put there.
To summarize:
@bluescarni's solution (using sdist properly, in connection with scikit-build-core used in setup.py) would ensure that after typing something like pip install pygmo2-sdist.tar.gz, the pygmo2 code is compiled locally on the user's machine (provided the user has installed all dependencies, including pagmo). For this we must wait until September, when the scikit-build-core project should be stable and usable long term.
@yurivict's solution only uploads the cloned repository. I could easily add this to the current manylinux CI pipelines as well. The user would then have to manually untar it and call cmake, make and make install in a build directory.
Frankly, though, I fail to see much use for either solution, as users who must compile the code are anyway better off doing so from a checkout of the GitHub repository.
Added as per @yurivict's suggestion; it should appear in the next release.