
Relaxing / Ignoring constraints during dependency resolution

Open stonebig opened this issue 4 years ago • 125 comments

What's the problem this feature will solve? Putting together some packages that have, by default, incompatible constraints.

Indeed:

  • constraints are often meant by the package maintainer as "we accept complaints for this known, focused set of packages"; you're on your own for support if you deviate, but deviating is not necessarily bad.
  • packages rarely focus on the same versions of complementary packages. ==> The new resolver may create more problems than it solves when trying to build an environment with a large set of packages.

Describe the solution you'd like Be able to voluntarily ignore some constraints

Wish:

  • we could provide "relax" rules to over-rule packages that are too strict (for our needs), because we know what we want: pip install Spyder --relax relaxrules.r, with the relax file below meaning: if you want PyQt5, it must be 5.14.x; if you want Jedi, it must be >=0.16
PyQt5~=5.14
Jedi>=0.16
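A minimal sketch of what the proposed semantics might look like. Note that `--relax` does not exist in pip; the function, its name, and the constraints below are all illustrative, assuming a relax rule simply replaces whatever specifier a package declares for the same project:

```python
def apply_relax_rules(declared, relax_rules):
    """Return declared requirements with relax rules taking precedence.

    declared    -- list of (project, specifier) pairs from package metadata
    relax_rules -- list of (project, specifier) pairs from the relax file
    """
    overrides = {name.lower(): spec for name, spec in relax_rules}
    return [(name, overrides.get(name.lower(), spec))
            for name, spec in declared]

# Illustrative constraints (not Spyder's actual metadata): the relax
# file widens the declared PyQt5 and Jedi specifiers.
declared = [("PyQt5", "<5.13"), ("Jedi", "==0.15.2")]
relax = [("PyQt5", "~=5.14"), ("Jedi", ">=0.16")]
print(apply_relax_rules(declared, relax))
# [('PyQt5', '~=5.14'), ('Jedi', '>=0.16')]
```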

Alternative Solutions Today:

  • I have to manually recompile "too strict" packages from source to work around this,
  • or I would have to build one virtualenv per package,
  • or I would only be able to use a specific Python distribution, with much older packages and Python version.

Additional context Maintaining WinPython

** Current pip check **

  • datasette 0.39 has requirement Jinja2~=2.10.3, but you have jinja2 2.11.2.
  • astroid 2.3.3 has requirement wrapt==1.11.*, but you have wrapt 1.12.1.
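The first of those messages is easy to verify by hand: per PEP 440, `~=2.10.3` means `>=2.10.3, ==2.10.*`, so jinja2 2.11.2 falls outside it. A simplified pure-Python sketch of the compatible-release check (plain numeric versions only; no pre-releases, epochs, or the separate `==1.11.*` wildcard operator):

```python
def compatible_release(version: str, spec: str) -> bool:
    """PEP 440 compatible-release check: "~=X.Y.Z" means ">=X.Y.Z, ==X.Y.*".

    Simplified sketch handling plain numeric release segments only.
    """
    v = [int(p) for p in version.split(".")]
    s = [int(p) for p in spec.split(".")]
    # Everything before the spec's last segment must match exactly...
    if v[:len(s) - 1] != s[:-1]:
        return False
    # ...and the version must be at least the spec version.
    return v >= s

print(compatible_release("2.11.2", "2.10.3"))  # False: datasette's pin is violated
print(compatible_release("2.10.5", "2.10.3"))  # True: a compatible bugfix release
```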

** Current workaround **

  • Spyder manually recompiled to accept PyQt5-5.14.2 (as pip doesn't have a "long term support" of PyQt5-5.12, so the fresher version is safer)

other wishes:

  • a basic GUI on Pip (tkinter or web) would still be nice, to have a better view of all the coming version conflicts.

stonebig avatar Apr 18 '20 11:04 stonebig

@stonebig Would you mind separating the other wishes into their own issues? It would be much easier to discuss them that way.

As for relaxing, I am honestly uncomfortable with having such an impactful feature handy for the general audience. I was by chance also having a similar discussion in the Pipenv tracker, and both the Pipenv case and yours are exactly the situations where I personally think the maintainers have a valid point in restricting the versions, and a user should not be able to override them easily. It's still useful to have this option somewhere, since indeed there are packages with incorrect metadata out there, but pip sits too low in the packaging management realm to implement the feature IMO.

uranusjr avatar Apr 18 '20 11:04 uranusjr

Ok, separating the other wishes in a few minutes. This is moved to another issue:

- having the beautiful "pipdeptree" features in standard pip:
    . a clear description of which packages need (or are needed by) which packages at which versions,
    . the possibility to get that programmatically as JSON answers.

stonebig avatar Apr 18 '20 11:04 stonebig

Thanks for filing this @stonebig! I've gone ahead and re-titled this issue to be more clearly scoped.

We have seen multiple groups of users express interest in a feature like this. @pfmoore @uranusjr and I have had this come up in our discussions during our work on the resolver, and we are aware of this user need.

We don't know how exactly this would work and what approach we'd be taking here -- we're gonna visit this specific topic at a later date, once the new resolver implementation is at feature parity with the current resolver.

pradyunsg avatar Apr 18 '20 16:04 pradyunsg

a basic GUI on Pip (tkinter or web) would still be nice, to have a better view of all the coming version conflicts.

This is a completely separate request, and can be built outside of pip and doesn't need to be built into pip. If someone wants to build this outside of pip and later propose bringing it into pip (with clear reasoning for why it can't live outside pip), that'd be perfect. I don't think pip's maintainers are going to be developing/integrating this into pip, and I welcome others to try to build such tooling on-top-of or outside of pip.

I think there has been a "pip GUI" project undertaken as part of IDLE in the past, but I don't have the time to take a look right now. :)

pradyunsg avatar Apr 18 '20 16:04 pradyunsg

I hope that in the new resolver project, easy-to-use functions will be provided to facilitate the emergence of a GUI project

stonebig avatar Apr 18 '20 19:04 stonebig

Building a distribution like WinPython is quite simple:

  • download in a dedicated directory all the wheels (and version) you want,
  • then pip install -r requirement.txt
  • then one by one, try to fix all the problems:
    • missing wheels,
      • pip download --dest,
      • or cgohlke site of wonders
      • or github/gitlab (/pip-forge one day ?)
    • non-existing wheels,
      • compile it yourself (often fails, e.g. for cartopy, or for recent Python versions)
      • raise issues to package maintainer
    • wheels whose beloved versions of dependencies mutually contradict
      • ask the maintainer to relax or upgrade their dependencies (a very slow process)
      • recompile it yourself without the annoying constraint,
      • fall back to an older version (with potential security issues, or known bugs fixed ages ago)
      • or drop the wheel.
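The "mutually contradicting pins" step above is mechanical enough to sketch. Assuming each wheel's metadata has been flattened into exact pins (the data below is illustrative, shaped like the pip check output quoted earlier), a toy conflict finder looks like this:

```python
def find_pin_conflicts(metadata):
    """metadata: {package: {dependency: pinned_version}}.
    Return dependencies that two packages pin to different versions."""
    pins = {}  # dependency -> {version: [packages that pin it]}
    for package, deps in metadata.items():
        for dep, version in deps.items():
            pins.setdefault(dep, {}).setdefault(version, []).append(package)
    return {dep: versions for dep, versions in pins.items()
            if len(versions) > 1}

# Illustrative data: two wheels pin wrapt to different exact versions.
metadata = {
    "astroid": {"wrapt": "1.11.2"},
    "some-other-wheel": {"wrapt": "1.12.1"},
    "datasette": {"jinja2": "2.10.3"},
}
print(find_pin_conflicts(metadata))
# {'wrapt': {'1.11.2': ['astroid'], '1.12.1': ['some-other-wheel']}}
```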

I dream of a way to reverse the problem:

  • showing package maintainers how their 'too restrictive' constraints make them incompatible with the rest of the world (hence a GUI, or a PyPI website feature?):
    • give your requirements.txt,
    • specify your "beloved" packages,
    • the site/GUI tells you what fits / what contradicts / what downgrades your package imposes
  • or add a third kind of dependency constraint to the wheel specification:
    • supported constraints (you can report a problem to the maintainers when you have this "set"),
    • a "support_requires" next to "install_requires" and "extras_require"?

stonebig avatar Apr 19 '20 05:04 stonebig

Just to note that, while I agree that over-restrictive requirements can be an issue¹, this is a fairly specialised use case. It's not that dependencies can't clash, but that putting together a Python distribution involves including (and managing) a lot of libraries that potentially have no "natural" reason to expect to be used together. So dependency clashes that the library maintainers haven't anticipated/haven't seen before are likely to be more common.

Using --no-deps and manually managing dependencies for problem packages is one option here. It's tricky without some means of identifying where the problems lie, though - we're hoping to give good error reporting for dependency clashes in the new resolver, but how to best express the information is something we don't really know yet, so that may be something that will need to be improved over time. (It might also be possible for a 3rd party tool to help here - dependency resolution and dependency graph analysis and visualisation are somewhat different problems, and separate tools may be able to focus on the different aspects of the problem.)

It's also entirely possible that pip could have options to ignore or relax certain dependency constraints. As a general problem, it could be hard to get a good UI for this (we're currently explicitly doing user research into what users want from the new dependency resolution - @ei8fdb you may want to invite @stonebig to get involved in that, if they aren't already). And I worry that while such a feature would be invaluable for specialists like @stonebig, it could easily be abused by naive users ("Cannot install X because Y is installed" - "just say --ignore-dependency=Z") and generate more confusion than it addresses - that's a further trade-off that we need to consider.

Sorry, there's no immediate answers in this, but hopefully it adds some context to the issue and explains what we're looking at when deciding how to address it.

I should also point out that this may not be something that makes it into the initial release of the resolver. Correct behaviour while satisfying the declared dependencies has to be the first priority, as I'm sure you'll understand. So --no-deps or recompiling with altered dependencies may remain the best answer for the short term.

¹ I've made the argument myself that libraries should avoid over-restricting dependencies.

pfmoore avatar Apr 19 '20 08:04 pfmoore

There are some more use cases outlined in https://github.com/python-poetry/poetry/issues/697

hauntsaninja avatar Jun 18 '20 06:06 hauntsaninja

The use cases noted in the poetry issues are good to have as examples of where strict dependency resolution can cause issues, but I'm in agreement with @sdispater that ignoring declared dependency data is very dangerous and not usually the right way to handle this issue.

If a project declares certain dependency data then there are three possibilities:

  1. They are correct, and using a different version of the dependency is going to cause errors.
  2. They are correct, but only certain types of usage will cause errors. It's not really an installer's job to make this judgement, but users who have reviewed the code in detail and can be sure that they will never hit the cases that cause errors may want to override this decision. This seems to me that it should be a fairly rare situation, and the users involved can be assumed to be expert (as they are willing to trust their analysis over the declared dependencies and the resolver's calculations).
  3. They are wrong, and you should file a bug against the project asking them to relax the dependency. Obviously, projects may not accept such a bug report, but then we're in the same situation as any other case where a bug gets left unfixed. Users can make their own local fix, or find a workaround.

In pip's case, pip install --no-deps and manually handling the process of installing the correct dependencies is an available approach for working around such issues. It's awkward, and not for the faint hearted, but IMO we don't want to make it too easy for people to ignore declared dependencies (for the same reason that heavy machinery has safety guards...)

If there is a genuine need for dependency data overrides, that pip has to address, then I would argue that the need is not limited to a single tool, and should be standardised - maybe a "local metadata override file", whose format is standardised and can be implemented by any tool that does dependency resolution (pip, poetry, pipenv, ...). This would mean that users can declare such overrides once, and not be tied to a single tool's implementation. It also means that any edge cases and potential risks can be identified and addressed once, rather than having every project go through the same process.

pfmoore avatar Jun 18 '20 08:06 pfmoore

Adding my use case from #8307 for ignoring a pinned sub-dependency when that dependency is the thing being developed locally.

In Jinja, I started pinning development dependencies with pip-compile from pip-tools. One of the development dependencies is Sphinx to build the docs. Sphinx has a dependency on (the latest release of) Jinja, so pip-compile adds a pin for Jinja. I want to provide one pip command for new contributors to set up their development environment with the pinned dev dependencies and Jinja in editable mode. I want this command to remain simple, so that new contributors can have an easy time getting started.

$ pip install -r requirements/dev.txt -e .

However, the pinned sub-dependency on Jinja takes precedence over the direct flag to install in editable mode, so Jinja2==2.11.2 is installed instead of Jinja2==3.0.0.dev0. This causes tests to fail, because they import the old version instead of the development version that new tests are written for.

I have a similar issue with Click. It has a dev dependency on pip-tools, which has a dependency on Click. A few new contributors were confused because pip list showed that Click was indeed installed, but tests were insisting that new things were not importable and failing.


I see -e . as a direct command to use the local version rather than any pinned version. I see -e . written out on the command line as more direct than a pinned sub-dependency pulled from a file. I don't see a legitimate case where the user asks for a local editable install but pip refuses because there's also a pinned dependency for that library. -e . is a direct request to develop at the local version, regardless of the fact that something might depend on a released version.

davidism avatar Jun 18 '20 15:06 davidism

If you don't mind, I'm going to leave the question of how you view -e . for now. I see your point, and it has some merits, but I want to explore your underlying use case a bit further before tackling solutions, so that I'm sure I understand it.

You say you have the latest production version of Jinja pinned in your requirements/dev.txt. But that says to me "in order to have a correct development environment set up, you must have Jinja2==2.11.2". That's clearly not the case, as it appears that the in-development version of Jinja works just as well (as otherwise your preferred outcome, that the local copy takes precedence, would cause failures). So why not have Jinja2>=2.11.2 in your requirements file? That surely gives you the expected outcome while still allowing installation of the in-development version?

I wonder if the problem here is that your workflow, or the tools you are using, are resulting in over-strict pinning, which means that having Jinja2>=2.11.2 as a requirement is harder than it needs to be. I can understand that, but I want to confirm if that is the limitation here, or if there's some more fundamental problem that I'm not understanding yet.

pfmoore avatar Jun 18 '20 15:06 pfmoore

Neither pip-tools nor Dependabot (which uses pip-tools) have the capability of doing anything but pinning exact dependencies. Both those projects are fairly common now, which is why I chose them. Plenty of other projects will be using them; I'm just in the rarer case that I develop projects that the projects I depend on depend on.

I'm not really clear what pip-tools could do here, since it's designed to pin exact versions. Jinja isn't a direct dependency of itself, all there is in the template file is Sphinx. Anything pip-tools does also needs to be understood by Dependabot, otherwise we lose automation. If you have any input about that, I opened an issue a while ago before I opened an issue here: jazzband/pip-tools#1150.

davidism avatar Jun 18 '20 17:06 davidism

Yup, I definitely agree with sdispater on the principle of the thing. The one thing I'll note is that it's easier for poetry to be principled than pip: currently the solution for all problems on that thread is to fall back to pip. Both hoping for timely upstream releases and using --no-deps are more painful (whether or not the user is expert), so not supporting an easy workaround should be seen as eating into pip's churn budget. Obviously, you're in a better place than I am to judge whether pip can afford that :-)

In terms of what standardised overrides could look like, the poetry issue also had some ideas. Someone linked to https://classic.yarnpkg.com/en/docs/selective-version-resolutions/#toc-why-would-you-want-to-do-this describing yarn's solution that could be useful to refer to.

hauntsaninja avatar Jun 18 '20 18:06 hauntsaninja

Neither pip-tools nor Dependabot (which uses pip-tools) have the capability of doing anything but pinning exact dependencies.

OK, cool. So (without making a comment on the importance of addressing this issue) I think it's fair to characterise this as asking for pip to provide a way to work around the limitations of pip-tools and/or Dependabot.

Thanks for clarifying.

currently the solution for all problems on that thread is to fallback to pip

Yes, but you have to remember, that what you're falling back to is relying on a buggy resolver in pip. Pip has never had a solution for this issue, all it's ever had is bugs that mean that people can get it to do things that aren't correct - and by failing to enforce constraints, pip has encouraged the community to think that ignoring constraints is OK, rather than being more accurate when specifying constraints.

And yes, this is sophistry, and the reality is that people do rely on pip's current behaviour. And we do take that very seriously. But we also have people complaining about the problems that pip's buggy resolver causes, and we have to balance the two priorities. It's hard to credibly say "we've decided that we won't fix known bugs because the buggy behaviour is useful"...

this should be seen as eating into pip's churn budget. Obviously, you're in a better place than I am to judge whether pip should afford that

Oh, boy, are we aware of that 🙂 In all seriousness, thanks for acknowledging that this is a difficult trade-off. One of the things we're looking at with the new resolver implementation is trying to bundle these sorts of things together, so there's one well-defined move to a more "correct"¹ behaviour, rather than a trickle of breakages that leave users with an extended but continually changing "getting there" phase. Hopefully that strategy will turn out OK. It won't please everyone, but I doubt anything can do that.

One irony here is that a lot of what we're doing is focused on making the bits that make up pip more reusable and standardised, so that building alternatives to pip is a viable idea. And a lot of the tolerance people have for churn is because there isn't really a good alternative to pip.

In terms of what standardised overrides could look like

Getting back to more practical considerations, thanks for the link to yarn. I don't know if any of the other pip devs have looked at yarn (I suspect someone has) but it's certainly worth seeing how they deal with this sort of thing.

For information, as part of the funded pip improvements work, we also have some dedicated UX specialists looking at how to improve pip's user interface, and this is one of the problems they will be working on (under issue #8452 linked above). So I'm sure they will be following up on this in some detail.

¹ Yes, "correct" is in the eye of the beholder, here, unfortunately.

pfmoore avatar Jun 18 '20 19:06 pfmoore

I think it's fair to characterise this as asking for pip to provide a way to work around the limitations of pip-tools and/or Dependabot.

While one way to demonstrate this issue is with these tools in their current state, the issue is with pip ignoring a command to install in editable mode, instead preferring a dependency resolution that is not useful.

davidism avatar Jun 18 '20 22:06 davidism

As I said, I was (deliberately, for the sake of understanding the use case) ignoring your view on how -e should be interpreted.

Let's put it this way. The behaviour you want is available right now if you were able to use >= requirements. But you can't use that type of requirement, so you have no options with existing tools.

As you suggest, one possible way of getting the behaviour you want without >= constraints would be to reinterpret -e as meaning "Install this and ignore all other requirements". However, please understand that this is not "pip's current behaviour". It may look similar, but what pip is actually doing at the moment, is picking one requirement to satisfy and ignoring all others. When the requirement picked is -e, you get behaviour that is useful to you, but when different types of constraints are involved, this results in broken installs. It's a known, long-standing issue that we have always described as a "bug", not as an implementation choice. We've fixed this bug in the new resolver, so that pip now takes into account all requirements equally. But in doing so, your convenient workaround for your problem, exploiting that bug to your advantage, no longer works.

Please understand, I'm not against the idea that if someone requests an explicit distribution file (whether a local directory, or a sdist or a wheel, whether with -e or without) then they want that to be installed. That makes perfect sense to me, and it's actually what the new resolver does. What's less obvious is how pip should react when given two contradictory requirements ("I want this file, but I also want a version that's different than what this file gives"). You're saying ignore version specifiers if an explicit file is given. Pip's new resolver says report the problem and ask the user to fix it. This discussion is about maybe giving the user a way to control the choice without needing to fix the sources, but leave the default as "report the problem".

pfmoore avatar Jun 19 '20 08:06 pfmoore

I don't know if any of the other pip devs have looked at yarn (I suspect someone has)

I hadn't. That's basically the same model as I've mentioned in discussions about/for dependency resolution "overrides" in pip.

FWIW, I do think we need to figure out how important it is for users, especially those who've been using pip's buggy behavior as a fallback to solve their dependency resolution woes, to have this override capability. All the ideas I've had for providing some mechanism to users are non-trivial to implement, and even if the functionality is well understood + implementable, I have no idea how we should expose this to users.

w.r.t. the churn budget, I think that's primarily what we'll learn about during the beta period, where we'll ask users to test the new resolver and help us figure out what to do on this topic (and others), all while keeping us from eating too much into our churn budget, since these are users clearly opting into testing beta functionality.

I do think we'll have to resolve this appropriately before making it the default (as indicated by where we've put this on the tracking board), and the understanding gained from user testing during the beta will be pretty important in that. :)

pradyunsg avatar Jun 19 '20 10:06 pradyunsg

For the purposes of being precise, I believe that an actionable version of @davidism's suggestion would be:

  • If an editable requirement is provided, pip should ignore any version specifier requirements for the same project.

Some possible variations on this:

  1. Rather than editables, extend this to all direct links (pip install my/local/project or pip install path/to/project.whl)
  2. Rather than silently ignoring version constraints, warn and ignore if version constraints that won't be satisfied are encountered.

There may be other possible variations with different trade-offs.
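Variation 2 is small enough to sketch. Assuming requirements are already parsed into (project, specifier) pairs, a "warn and ignore" rule could look like the following. Everything here is a hypothetical illustration, not pip's internals, and it simplifies by overriding every specifier for the editable project rather than only the unsatisfied ones:

```python
def drop_overridden_specifiers(requirements, editables):
    """requirements: list of (project, specifier) pairs;
    editables: set of project names requested as editable/direct installs.
    Returns (kept, warnings) per the 'warn and ignore' variation."""
    editable_names = {name.lower() for name in editables}
    kept, warnings = [], []
    for project, specifier in requirements:
        if project.lower() in editable_names:
            warnings.append(
                f"ignoring '{project}{specifier}' in favour of the "
                f"editable install of {project}")
        else:
            kept.append((project, specifier))
    return kept, warnings

kept, warnings = drop_overridden_specifiers(
    [("Jinja2", "==2.11.2"), ("Sphinx", "==3.0.4")], {"jinja2"})
print(kept)      # [('Sphinx', '==3.0.4')]
print(warnings)  # ["ignoring 'Jinja2==2.11.2' in favour of the editable install of Jinja2"]
```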

pfmoore avatar Jun 19 '20 11:06 pfmoore

@pfmoore thanks for your patience, your further explanations clarified things for me.

davidism avatar Jun 19 '20 14:06 davidism

@davidism Not at all, pip's resolver has been broken for so long that it's really hard to untangle what counts as "broken" and what is "behaviour that people need that accidentally worked". This isn't the only place where we'll need to look very carefully at the transition, and how we handle "urgent feature requests exposed because no-one realised that what they were relying on were bugs".

Getting good involvement from users like yourself is crucial to getting that transition right, so your help is much appreciated.

I just wish all of the people relying on undefined behaviour of pip were doing things as unreasonable as this: https://xkcd.com/1172/ - it'd be much easier to not worry about it 🙂

pfmoore avatar Jun 19 '20 15:06 pfmoore

Python's packaging tools in particular have fallen victim to Hyrum's Law, and this is really just another case of it. It is unlikely that the resolver lands without breaking some nonzero number of workflows/installs. The only thing we can really do is try to figure them out as much as possible beforehand, and work out which ones we do not plan to support, which ones we want to continue to support in the same way, and which ones we want to provide some new mechanism for supporting.

I suspect we're going to get a lot of noise at first once the resolver lands, but that's pretty much always the case when you go from nonstrict to strict behavior.

dstufft avatar Jun 19 '20 15:06 dstufft

Another use case here. PyPI does not have any way for multiple packages to "provide" the same resource. It is not uncommon on PyPI to have the same package packaged in different ways under multiple names, like:

  • psycopg2 and psycopg2-binary
  • opencv-python-headless and opencv-python

It should be possible to choose which one to use to fulfil a library requirement at the project level.
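What a "provides"-style lookup might mean in practice can be sketched as follows. The package pairs are real PyPI names, but the equivalence table and the function are hypothetical: nothing in today's metadata lets a project declare that it provides the same resource as another:

```python
# Hypothetical table of distributions that provide the same importable
# package; today this knowledge lives only in users' heads.
PROVIDERS = {
    "psycopg2": {"psycopg2", "psycopg2-binary"},
    "opencv-python": {"opencv-python", "opencv-python-headless"},
}

def requirement_satisfied(requirement, installed):
    """True if the requirement, or any equivalent provider, is installed."""
    candidates = PROVIDERS.get(requirement, {requirement})
    return bool(candidates & set(installed))

print(requirement_satisfied("psycopg2", ["psycopg2-binary", "numpy"]))  # True
print(requirement_satisfied("psycopg2", ["numpy"]))                     # False
```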

frankier avatar Jun 25 '20 05:06 frankier

Another example of yesterday, preparing a build of WinPython.

When I include latest and freshest possible Tensorflow, it asks me to:

  • downgrade to Scipy-1.4.1 (7 months old)
  • downgrade to numpy<1.19.0 (1 month old, removing a pile of technical debt)
pip check
tensorflow-cpu 2.3.0rc2 has requirement numpy<1.19.0,>=1.16.0, but you have numpy 1.19.1+mkl.
tensorflow-cpu 2.3.0rc2 has requirement scipy==1.4.1, but you have scipy 1.5.2.

with PIP of today:

  • I can recompile simple wheels, like Spyder:
    • Spyder doesn't want PyQt5-5.15, only PyQt5-5.12, because 5.15 is not available on conda-forge,
    • but I can test that it works well enough, and securely enough, for my narrow use case
  • I can't recompile complex wheels like Tensorflow, but pip lets me ignore Tensorflow's limitations, just warning me,

Dilemma created by the pip of tomorrow:

  • either I bow to the slowest package development cycle and don't use numpy-1.19.1 / Scipy-1.5.2:
    • dragging my feet with technical debt (not using new features/value created 7 months ago or more),
    • slowing upgrades when the Python cycle itself is accelerating, and the context is moving faster with the pandemic
  • or I drop some important packages that I can't relax with my bare hands.

stonebig avatar Jul 27 '20 07:07 stonebig

@di I think maybe you have relevant thoughts here?

brainwane avatar Jul 29 '20 20:07 brainwane

With regards to TensorFlow: for the scipy dependency, this is definitely over-constrained and will be removed in the next release, see:

  • PRs: https://github.com/tensorflow/tensorflow/pull/41865, https://github.com/tensorflow/tensorflow/pull/41866, https://github.com/tensorflow/tensorflow/pull/41867
  • Issues: https://github.com/tensorflow/tensorflow/issues/40884, https://github.com/tensorflow/tensorflow/issues/35709, https://github.com/tensorflow/tensorflow/pull/40789

For the numpy dependency, this version of numpy apparently has a breaking ABI change that the TensorFlow project is not prepared to migrate to, but should be eventually fixed. I filed https://github.com/tensorflow/tensorflow/issues/41902 to ensure the TensorFlow maintainers are aware, if you are currently using TensorFlow with numpy >= 1.19.0 please leave a 👍 there.

I think ultimately, instead of having pip be able to relax its constraints, we should embrace this friction as a forcing function to get projects with less-than-ideal dependency specifications to either fix them or work towards relaxing them, as it will improve the overall ecosystem.

di avatar Jul 30 '20 16:07 di

I hope you're right, and it would move towards a strictness more compatible with "conda"... yet pip is not a distro, so a "--relax" option would help soften the transition in the first year.

stonebig avatar Jul 30 '20 17:07 stonebig

It is nice for abstractions to have escape hatches for when they break down. Take for example Django's ORM, which --- at least in the early days --- as a design decision only covered 80% of use cases and encouraged "dropping down" to SQL for the remaining 20%. When people deny that the escape hatches should exist, it is often framed in moralistic terms: anything that does not use the abstraction correctly is wrong and should be fixed. A consenting-adults approach which allows escape hatches dispenses with simple moralistic arguments and instead seeks to provide maximum utility without attempting to dictate "best practices" which are supposed to apply against unseen and unknown contexts.

In this case, the escape hatch would:

  1. Allow papering over problems with the dependencies of individual packages;
  2. Allow papering over edge cases in the ecosystem, such as the shortcoming in dependency specification where no metapackage or "provides"-type mechanism exists, making it impossible to have multiple packages fulfil the same dependency name.

If the escape hatch is not added, it does not mean that the whole packaging ecosystem will magically improve. Instead, users faced with an unresponsive upstream will be forced to make their own ad-hoc escape hatches, such as adding manual instructions to READMEs, manual installation shell scripts, use of git submodules, and passive-aggressive forks which are almost immediately unmaintained.

frankier avatar Jul 30 '20 18:07 frankier

@frankier Thanks for sharing your thoughts.

As I see it, the "escape hatch" would be sticking with pip 20.2.

Some further thoughts:

The more lenient framework you have in mind makes sense for "victimless crimes" where no one other than the people involved is affected. However, pip's maintainers have to deal with support requests from users who get tangled up in incompatible dependencies and resolution conflicts. Also, the fact that we can't depend on the user's installation being consistent blocks the development of a lot of features which we, and many users, want. Check out the "We need to finish the resolver because so many other improvements are blocked on it" section there for several examples, such as adding an "upgrade-all" command to pip.

If you or others are volunteering to donate a bunch of money so that pip can hire multiple full-time maintainers, or you or others are donating your services to maintain your proposed "escape hatch" and/or respond to the user support queries pertaining to it, then please let us know, as that changes the equation! Currently, the only reason anyone's being paid to work on pip is that we wrote some grant proposals and got some money that will run out at the end of the year.

I'd also like to know of the unresponsive upstreams that have at least, say, 100+ users and that completely ignore those users telling them "the upcoming version of pip simply will not install your package". I think we'll learn more in the next few weeks of the beta to see how many of those there are. If there are scores of such packages then that will influence our decisionmaking -- and, I hope, help people and companies that depend on those packages decide to invest in and rejuvenate them.

Another use case here. PyPI does not have any way for multiple packages to "provide" the same resource. It is not uncommon on PyPI to have the same package packaged in different ways under multiple names, like:

* psycopg2 and psycopg2-binary

* opencv-python-headless and opencv-python

This seems to me like something that the upstreams could work on fixing on their side; what do opencv and psycopg2 say about the upcoming change to pip's dependency resolver?

It should be possible to choose which one to use to fulfil a library requirement at the project level.

Could you please file this as a separate issue so we can discuss it separately? Thanks!

brainwane avatar Jul 30 '20 19:07 brainwane

I think you already know that I don't have any resources to offer. Incidentally, it's long-tail projects, including those that have never received any funding, which would benefit most from this feature, while projects with 100+ users or backed by Google will surely adapt. You will receive very skewed information if you only ask libraries and upstream-level projects, since this issue is about giving more control to downstream projects. Upstream projects will either be responsive and not mind, or else not respond. Nevertheless, when framed as a matter of priorities, it's indisputable.

I have filed the issue about virtual packages here: https://github.com/pypa/pip/issues/8669

frankier avatar Jul 31 '20 05:07 frankier

We have a tough situation here and I'd love thoughts from @chrahunt @xavfernandez and other pip maintainers.

My current thinking: people who need an escape hatch within pip 20.3 should use --use-deprecated=legacy-resolver. Per the deprecation timeline they will then have three months (till pip 21.0 comes out in January) to get upstreams to get their houses in order.

brainwane avatar Sep 02 '20 14:09 brainwane