
Upper bound on noarch python packages

Open hmaarrfk opened this issue 3 years ago • 4 comments

Your question:

I would like to suggest we start adding an upper bound on noarch python packages.

The reason is that the python standard library has been in flux for a while and this is causing packages that declare themselves as noarch: python to have to retroactively apply repodata patches when a new version of python comes out.

It would be great if in addition to the lower bound, we also had infrastructure that would trigger a rebuild for python packages that are noarch when a new version comes out.

The latest problem we faced is with:

  • https://github.com/googleapis/proto-plus-python/pull/329/files#diff-11135ba0cd87468dca27e3a55d229e49647ff84f55d04301239ead576bd25743R61 which is included in their 1.22.2 release https://github.com/googleapis/proto-plus-python/releases/tag/v1.22.2

which caused problems for: https://github.com/conda-forge/google-cloud-pubsub-feedstock/pull/54

It would have been great if our infrastructure had limited proto-plus-python to Python 3.6 through 3.10 until a rebuild for 3.11 was explicitly triggered, putting the responsibility in the package maintainer's hands.
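For illustration, here is a hedged sketch of what such a capped `noarch: python` recipe fragment could look like. The bound shown is hypothetical; today conda-forge recipes typically declare only a lower bound on `python`:

```yaml
# Hypothetical meta.yaml fragment: a noarch package that caps the Python
# range it was actually tested against, instead of leaving it open-ended.
build:
  noarch: python

requirements:
  host:
    - python >=3.6
    - pip
  run:
    # The upper bound would be bumped by an explicit rebuild once the
    # maintainer verifies the package against the new Python release.
    - python >=3.6,<3.11
```

Under this scheme the solver would simply never offer the package on Python 3.11 until the maintainer relaxed the cap, rather than requiring a repodata patch after breakage is discovered.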

hmaarrfk avatar Jan 21 '23 21:01 hmaarrfk

Do we actually know that the majority of noarch packages are NOT forward compatible?

dopplershift avatar Jan 23 '23 20:01 dopplershift

Think this would cause a lot of churn from constantly moving these upper bounds. While it may catch some edge cases, it would also create a lot more traffic from users requesting the latest Python version, which also seems undesirable.

jakirkham avatar Jan 23 '23 20:01 jakirkham

The thing is that the current solution of using repodata patches can only be described as reactive.

It is largely untestable, and we keep adding complexity to the review process for these patches. We have also effectively "centralized" the management of "fixes", with core becoming more involved in helping downstream packages correct incompatibilities caused by future packages they have no control over.
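To make the "reactive" nature of repodata patches concrete, here is a minimal sketch, in the spirit of conda-forge's patch-generation scripts, of retroactively capping the `python` bound on already-published builds of a package. The package name, filenames, versions, and bound are all illustrative, not the actual patch that was applied:

```python
def patch_repodata(repodata):
    """Return patch instructions that cap `python` for hypothetical
    proto-plus builds published before Python 3.11 support existed."""
    instructions = {"patch_instructions_version": 1, "packages": {}}
    for fn, record in repodata.get("packages", {}).items():
        if record["name"] != "proto-plus":
            continue
        new_depends = []
        changed = False
        for dep in record["depends"]:
            # Replace the open-ended lower bound with a capped range.
            if dep == "python >=3.6":
                dep = "python >=3.6,<3.11"
                changed = True
            new_depends.append(dep)
        if changed:
            # Only the overridden metadata goes into the patch.
            instructions["packages"][fn] = {"depends": new_depends}
    return instructions
```

Note that this can only run after the breakage is known: the original artifacts stay wrong forever, and the patch must be maintained and reviewed out-of-band, which is exactly the complexity being described above.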

Given a large enough set of dependencies, one can almost guarantee that code written for Python 3.5 won't run on Python 3.12, given the large number of deprecations over the years.

Given that we have Python 3.11 today, the correct upper bound may be "<3.12", but I don't think it should be totally unbounded by default.

hmaarrfk avatar Jan 28 '23 15:01 hmaarrfk

Sure, but, again, I have yet to see any evidence provided for how wide a problem this is. I maintain plenty of packages, and have yet to encounter it with Python versions. That doesn't mean it's not happening, but it's all speculation at this point; there's no data to try to understand what the cost/benefit ratio is. I don't at all agree that it's remotely guaranteed that every package is going to be affected by some deprecation between 3.5 and 3.12.

We're now looking at 12-month Python version releases, and one of the keys to that being sustainable is having so many noarch packages that don't need to be manually upgraded.

dopplershift avatar Feb 02 '23 02:02 dopplershift