                        Can pyreadiness look at more than the classifiers?
The JSON API includes the requires_python metadata for each package.  If that were included, more packages would be marked as ready for 3.10 (for example).
As one data point, pip will happily install Jinja2 into 3.10, but Jinja2 is marked as not ready for 3.10 because it doesn't mention the 3.10 classifier.  It has requires_python: ">=3.6" and so could be marked as ready.
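Roughly, the proposed check could look like this minimal sketch, using the `requests` and `packaging` libraries (the JSON endpoint and the `requires_python` field are real; `ready_by_requires_python` is a made-up name for illustration):

```python
import requests
from packaging.specifiers import SpecifierSet

def ready_by_requires_python(package: str, python_version: str) -> bool:
    """Return True if the package's requires_python admits python_version."""
    info = requests.get(f"https://pypi.org/pypi/{package}/json").json()["info"]
    spec = info.get("requires_python")  # e.g. ">=3.6", or None if unset
    if not spec:
        return False  # no metadata; fall back to the classifier check
    return SpecifierSet(spec).contains(python_version)

print(ready_by_requires_python("Jinja2", "3.10"))  # True: Jinja2 declares ">=3.6"
```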
Thanks for the issue! Yes, this should be possible, and I would merge a PR that adds this.
What about packages which have an open-ended requires_python: ">=3.6" saying 3.6+ but don't yet actually support 3.10 (or it's unknown)?
Here's my reasoning for using only python_requires in Pallets. With classifiers, I have to add a new line and make a new, potentially otherwise-empty release just to indicate that I support the new Python. With python_requires, I make a new release only to fix compatibility if needed, which is rare. I would also potentially have to make multiple releases if I wanted to indicate that the previous X.Y-1, etc. releases also supported the latest Python.
It seems that other projects have come to the same conclusion, that keeping the classifiers up to date is noisy/busy work that doesn't really benefit actively maintained/compatible projects.
We ran our tests against 3.10.0rc months ago and addressed any test failures then. I don't think we actually had to make any new releases, just fix some tests.
It's sort of weird though, because technically, python_requires >= 3.6 doesn't indicate support the same way a classifier would. If a project becomes unmaintained and does use something that gets deprecated/breaks after a few releases, it would still appear to "support" the latest version. But at that point you get into trying to determine if a project is still maintained, run its tox config against the given Python version, etc.
I do see at the top of the site that it says "don't explicitly support Python 3.10" (emphasis added). To indicate the potential inaccuracy of checking python_requires, maybe the site should use more colors than just green/white (a sketch follows the list):
- green: explicit support detected via classifier
- green with /// shading: implicit support detected via python_requires. The color key at the top could mention that packages with this color probably have support if they were actively maintained around the time of the release.
- white: unknown support, missing classifier and python_requires
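In code, that proposal might look something like this rough sketch (`support_status` is a made-up name; it assumes the `classifiers` and `requires_python` fields from the JSON API):

```python
from packaging.specifiers import SpecifierSet

def support_status(classifiers: list[str], requires_python: str | None,
                   version: str = "3.10") -> str:
    """Map a package's metadata to one of the three proposed colors."""
    if f"Programming Language :: Python :: {version}" in classifiers:
        return "green"       # explicit support via classifier
    if requires_python and SpecifierSet(requires_python).contains(version):
        return "green ///"   # implicit support via requires_python
    return "white"           # unknown support
```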
Are there any projects in the top 360 that are known to be unmaintained and break with 3.10? Perhaps a list of (name, version) pairs could be maintained that would override detection and mark them as red/unsupported on any version page >= the given one. The policy could be to not accept a PR to the list until X months after a release, to give projects time to update. If a project becomes maintained again, a PR can remove it from the list.
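A minimal sketch of such an override table (names and shapes hypothetical; isodate is the example that comes up later in this thread):

```python
# The first Python version each package is known to break on; pages for that
# version or later would show the package as red regardless of its metadata.
OVERRIDES = {
    "isodate": (3, 10),  # hypothetical entry, based on a report later in this thread
}

def overridden_unsupported(package: str, python_version: tuple[int, int]) -> bool:
    first_broken = OVERRIDES.get(package)
    return first_broken is not None and python_version >= first_broken
```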
> green with /// shading: implicit support detected via python_requires. The color key at the top could mention that packages with this color will probably have support if they were actively maintained around the time of release.
This makes sense to me.
> Are there any projects in the top 360 that are known to be unmaintained and break with 3.10?
Seems unlikely, but it is hard to gauge whether a project is maintained or not.
> It seems that other projects have come to the same conclusion, that keeping the classifiers up to date is noisy/busy work that doesn't really benefit actively maintained/compatible projects.
312 of the top 360 (87%) have a 2.x or 3.x classifier.
> Are there any projects in the top 360 that are known to be unmaintained and break with 3.10?
Yes, isodate is one. There are others (I think in the top 360) which still test using nose (nose is unmaintained and doesn't work with 3.10), so it's still unknown whether those projects support 3.10.
To get an idea of how many packages don't yet support Python 3.10, I tried installing them all with pip. (Ubuntu pt1, pt2; macOS pt1, pt2; Windows pt1, pt2.)
Looking at the 319 packages not explicitly declaring support for Python 3.10 (as of Sunday), the following failed to install.
Ubuntu:
- pywavelets
- h5py
- pandas-gbq
- tensorflow
- tensorflow-addons
- pyarrow
- tensorflow-data-validation
- tensorflow-serving-api
- tfx-bsl
- datalab
- scikit-image
- tensorflow-model-analysis
- tensorflow-transform
- backports-zoneinfo
- azureml-dataprep
- numba
- azureml-core
- llvmlite
- torch
macOS:
- pandas
- pyarrow
- matplotlib
- scikit-learn
- tensorflow-serving-api
- mlflow
- lightgbm
- seaborn
- pywavelets
- scikit-image
- gensim
- tfx-bsl
- pandas-gbq
- tensorflow-transform
- tensorflow
- tensorflow-addons
- tensorflow-data-validation
- tensorflow-model-analysis
- datalab
- h5py
- xgboost
- backports-zoneinfo
- statsmodels
- azureml-dataprep
- numba
- azureml-core
- llvmlite
- torch
Windows:
- ipywidgets
- jupyter
- h5py
- gensim
- lxml
- nbclient
- notebook
- pandas
- psycopg2
- psycopg2-binary
- pandas-gbq
- pyarrow
- pywavelets
- scipy
- scikit-image
- lightgbm
- tensorflow
- tensorflow-addons
- seaborn
- tensorflow-data-validation
- tensorflow-serving-api
- datalab
- terminado
- tfx-bsl
- tensorflow-transform
- widgetsnbextension
- snowflake-connector-python
- tensorflow-model-analysis
- xgboost
- scikit-learn
- azure-identity
- msal-extensions
- mlflow
- nbconvert
- backports-zoneinfo
- statsmodels
- azureml-dataprep
- numba
- azureml-core
- llvmlite
- torch
In total, 42 unique packages.
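The experiment boils down to something like the following loop (a rough sketch only; the actual runs are the CI jobs linked above, and the package list is abbreviated here):

```python
import subprocess
import sys

# The 319 packages without a 3.10 classifier; abbreviated for illustration.
packages = ["pywavelets", "h5py", "pandas-gbq", "tensorflow"]  # ...

failed = []
for name in packages:
    # Attempt a normal pip install into the current (3.10) interpreter.
    result = subprocess.run(
        [sys.executable, "-m", "pip", "install", "--no-input", name],
        capture_output=True,
    )
    if result.returncode != 0:
        failed.append(name)

print(f"{len(failed)} of {len(packages)} packages failed to install")
```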
Of these 42, 27 have an open-ended requires_python which, under the proposed metric, would mark them as ready for Python 3.10 when they aren't. Only four explicitly prohibit 3.10 via requires_python:
- azure-identity
- azureml-core >=3.6,<3.9
- azureml-dataprep
- backports.zoneinfo >=3.6
- datalab
- gensim >=3.6
- h5py >=3.7
- ipywidgets
- jupyter None
- lightgbm
- llvmlite >=3.7,<3.10
- lxml >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, != 3.4.*
- matplotlib >=3.7
- mlflow >=3.6
- msal-extensions
- nbclient >=3.6.1
- nbconvert >=3.7
- notebook >=3.6
- numba >=3.7,<3.10
- pandas >=3.7.1
- pandas-gbq >=3.7
- psycopg2 >=3.6
- psycopg2-binary >=3.6
- pyarrow >=3.6
- PyWavelets >=3.5
- scikit-image >=3.7
- scikit-learn >=3.7
- scipy >=3.7,<3.10
- seaborn >=3.6
- snowflake-connector-python >=3.6
- statsmodels >=3.7
- tensorflow
- tensorflow-addons
- tensorflow-data-validation >=3.6,<4
- tensorflow-model-analysis >=3.6,<4
- tensorflow-serving-api
- tensorflow-transform >=3.6,<4
- terminado >=3.6
- tfx-bsl >=3.6,<4
- torch >=3.6.2
- widgetsnbextension
- xgboost >=3.6
Trusting an open-ended python_requires would declare that these 27 packages are ready for Python 3.10 when they don't even install for Python 3.10.
I can imagine someone wanting to work with pandas, SciPy, or TensorFlow checking this site, seeing it green for 3.10, and then being confused when the package won't even install.
And successful installation doesn't necessarily mean successfully running on Python 3.10. I expect more packages outside this list have runtime incompatibilities as well, such as isodate.
There are other metrics and heuristics that could be added. For example, check whether a wheel exists for the given Python version, and treat that the same as a classifier. Skimming the list, it looks like most packages that don't install probably have other install requirements, such as compilation toolchains. I'd also treat having no platform-specific wheels and only a generic (pure-Python) wheel as satisfying that metric.
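A rough sketch of that wheel heuristic, using the JSON API's file list for the latest release and the `packaging` library (`has_wheel_for` is a made-up name):

```python
import requests
from packaging.utils import parse_wheel_filename

def has_wheel_for(package: str, minor: int) -> bool:
    """Check the latest release's wheels for 3.<minor> or pure-Python support."""
    files = requests.get(f"https://pypi.org/pypi/{package}/json").json()["urls"]
    for f in files:
        if not f["filename"].endswith(".whl"):
            continue
        _, _, _, tags = parse_wheel_filename(f["filename"])
        for tag in tags:
            # Pure-Python wheel: treat as supporting every 3.x version.
            if tag.interpreter == "py3" and tag.platform == "any":
                return True
            # Binary wheel built for this exact CPython version.
            if tag.interpreter == f"cp3{minor}":
                return True
    return False
```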
I'd like to hear more ideas for solutions to the issue. I strongly prefer not having to update classifiers on my projects, for the reasons stated above. So they will forever be listed as "not Python 3.X ready" under the current metric, which is unfortunate.
FYI, the @pygame_org Twitter account is using this list based on classifiers to argue to the PSF that it should not promote .0 releases (or maybe even .1 releases) of Python: https://twitter.com/pygame_org/status/1584872593597042688. (And they blocked me for disagreeing that this release is not just like the last ones.)
My suggestion: mark any packages that do not have minor-version classifiers in a different color. Also check for wheels: anything shipping 3.11 binary wheels gets a green color too (I think this is easier with the JSON API? Haven't checked). This still misses some pure-Python libraries, but the colors plus the wheel check would really help.
Some examples that are still not 3.10 compatible according to the current logic: setuptools, six, importlib-metadata, zipp, click, flask, beautifulsoup4, and more.
FYI, you can't use Requires-Python metadata for this, because it was not intended[^1] to support capping, and capping it causes issues. It was only designed to let authors drop support for old releases, not to limit support for new releases. If you don't support Python 3.12, you can't add <3.12 to Requires-Python; it just causes the resolver to backtrack to older, uncapped releases. Numba, which never supports future Python versions, had to remove this as a method of capping.
[^1]: At least it was never implemented.
Here's my concrete suggestion, with three colors:
- Green + checkmark: N classifier or N binary wheel present.
- Red + X: neither of those, but an N-1 classifier or N-1 binary wheel present[^2].
- White + question mark: otherwise.
I didn't check whether any of those have abi3 wheels; those are harder to place.
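In code, assuming a `has_classifier` helper against the JSON API and the hypothetical `has_wheel_for` sketch from earlier in the thread, the rule might look like:

```python
import requests

def has_classifier(package: str, minor: int) -> bool:
    info = requests.get(f"https://pypi.org/pypi/{package}/json").json()["info"]
    return f"Programming Language :: Python :: 3.{minor}" in info["classifiers"]

def color(package: str, minor: int) -> str:
    # has_wheel_for() is the wheel check sketched earlier in this thread.
    if has_classifier(package, minor) or has_wheel_for(package, minor):
        return "green"  # checkmark: declared or built for 3.N
    if has_classifier(package, minor - 1) or has_wheel_for(package, minor - 1):
        return "red"    # X: supports 3.(N-1) but shows no sign of 3.N
    return "white"      # question mark: not enough information
```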
14 packages[^1] would be added by including wheels in addition to classifiers for 3.11. Only 25 (total) packages provide 3.10 wheels but not 3.11, compared to 36 that provide 3.11 wheels.
This doesn't turn packaging, setuptools, etc. green, but it at least fixes 14 of them and adds two layers of color.
I can implement that if it sounds reasonable.
[^1]: pyyaml, pandas, cffi, aiohttp, lxml, greenlet, scipy, cython, kiwisolver, ruamel-yaml-clib, zope-interface, pyzmq, torch, shapely
[^2]: This still often means a package just forgot to update, in the case of classifiers. It's much more of a real "red X" if there's no wheel. So maybe this could be red / light red?
I made this chart that shows the number of packages that support a particular Python version based on binary wheel filenames (limited to 3.9-3.11; I wanted a visual aid to help think about whether to upgrade a project I'm involved with to 3.9 or 3.10). The discussion here was helpful for this, thank you.
Caveats & notes: The underlying script makes use of the abi3 tag when it's present. I didn't know about this tag before reading this thread, and so how the ABI tag is used may be incorrect or insufficient. The underlying data is from something like two weeks ago. The chart doesn't auto-update at all.
> Red + X for neither of these but N-1 classifier or N-1 binary wheel present
For the chart, I adjusted this condition to "neither of these, but a classifier for a different minor version or a binary wheel for an earlier minor version present". Although the numbers are very few, some packages, like backports.zoneinfo, legitimately do not support the last few Python versions; if we look only at the previous version (N-1), they get classified as 'maybe' (white + question mark) rather than 'no' (red + X).