python-bigquery-pandas
DISCUSSION: policy for dependency updates that aren't covered by NEP 29
NEP 29, which the pandas community has also been following, defines a timeline for supported Python and NumPy versions.
It does not make corresponding recommendations for pandas, and certainly not for the other packages we depend on, such as google-cloud-bigquery, google-cloud-bigquery-storage, pydata-google-auth, google-cloud-core, google-api-core, google-auth, and google-auth-oauthlib.
If I interpret the NumPy algorithm correctly, we need to support a two-year window.
> all minor versions of `package-name` released in the prior 24 months from the anticipated release date, with a minimum of 3 minor versions of `package-name`
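To make that concrete, here's a rough sketch of the rule in Python. The function name is made up and I'm approximating a month as 30 days, so treat it as an illustration rather than a spec:

```python
import datetime

from packaging.version import Version


def minimum_supported_minor(release_dates, today=None, window_days=24 * 30, min_minors=3):
    """Return the oldest minor series we'd keep supporting under the rule above.

    release_dates maps a minor series (e.g. "1.11") to the date of its first
    release. Assumes newer series have later release dates.
    """
    today = today or datetime.date.today()
    cutoff = today - datetime.timedelta(days=window_days)

    # Newest minor series first.
    minors = sorted(release_dates, key=Version, reverse=True)

    # Everything released inside the window stays supported...
    supported = [minor for minor in minors if release_dates[minor] >= cutoff]
    # ...and we always keep at least `min_minors` minor series.
    while len(supported) < min(min_minors, len(minors)):
        supported.append(minors[len(supported)])

    return supported[-1]


# Made-up dates for a hypothetical package:
print(
    minimum_supported_minor(
        {
            "1.0": datetime.date(2018, 6, 1),
            "1.1": datetime.date(2019, 6, 1),
            "1.2": datetime.date(2019, 10, 1),
            "2.0": datetime.date(2020, 7, 1),
        },
        today=datetime.date(2021, 3, 1),
    )
)  # -> "1.1" (1.0 has aged out of the ~24-month window)
```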
We're actually just about caught up with that for the google-cloud-bigquery dependency.
Our minimum version is 1.11.x, which was released in April 2019 (https://github.com/googleapis/python-bigquery/blob/master/CHANGELOG.md#1110). We'd be able to start incrementing the minimum version to 1.12.0 in May.
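One way to double-check those dates without digging through the changelog is PyPI's JSON metadata, which records when each release was uploaded. A quick sketch:

```python
import requests

# Look up when each release of google-cloud-bigquery was uploaded to PyPI.
data = requests.get("https://pypi.org/pypi/google-cloud-bigquery/json").json()
for version in ("1.11.0", "1.12.0"):
    uploads = data["releases"].get(version, [])
    if uploads:
        # upload_time looks like "2019-04-..."; the date part is all we need.
        print(version, uploads[0]["upload_time"][:10])
```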
Thoughts? Is 2 years the right window?
Oh, and re: pandas, it looks like we're about there too. We require pandas>=0.23.2, which means if we did a release today I think we could bump the minimum version to 0.24.2, as it was released in March 2019 and we'd still be supporting >= 3 minor versions (0.24.x, 0.25.x, 1.0.x, 1.1.x, and 1.2.x).
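Just to sanity-check the arithmetic, here's where a 24-month window lands if we assume a (hypothetical) release in March 2021:

```python
import datetime

anticipated_release = datetime.date(2021, 3, 1)  # hypothetical release date
cutoff = anticipated_release - datetime.timedelta(days=24 * 30)
print(cutoff)  # lands in March 2019, right around when pandas 0.24.2 came out
```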
Yes, we did something similar in xarray: http://xarray.pydata.org/en/stable/installing.html#minimum-dependency-versions
We also have a tool for getting the relevant versions of dependencies, which may be useful if there are a number of them.
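Something along those lines could work for pandas-gbq too. The sketch below isn't the xarray tool, just the general idea: pull the first-release date of each minor series from PyPI and report the oldest one still inside the 24-month window (the 3-minor-version floor is left out for brevity):

```python
import datetime

import requests
from packaging.version import InvalidVersion, Version

DEPENDENCIES = [
    "pandas",
    "google-cloud-bigquery",
    "google-cloud-bigquery-storage",
    "pydata-google-auth",
    "google-cloud-core",
    "google-api-core",
    "google-auth",
    "google-auth-oauthlib",
]
CUTOFF = datetime.date.today() - datetime.timedelta(days=24 * 30)


def first_minor_releases(package):
    """Map each minor series of `package` to the date of its first PyPI upload."""
    data = requests.get(f"https://pypi.org/pypi/{package}/json").json()
    dates = {}
    for version_str, files in data["releases"].items():
        try:
            version = Version(version_str)
        except InvalidVersion:
            continue
        if version.is_prerelease or not files:
            continue
        minor = f"{version.major}.{version.minor}"
        uploaded = min(
            datetime.datetime.fromisoformat(f["upload_time"]).date() for f in files
        )
        dates[minor] = min(dates.get(minor, uploaded), uploaded)
    return dates


for package in DEPENDENCIES:
    dates = first_minor_releases(package)
    in_window = [m for m in sorted(dates, key=Version) if dates[m] >= CUTOFF]
    suggested = in_window[0] if in_window else max(dates, key=Version)
    print(f"{package}: oldest minor series inside the 24-month window is {suggested}")
```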