# pip-tools: Improve layered dependency workflow
## What's the problem this feature will solve?
I know this has been discussed a few times (e.g. #398), but the current solution for layered dependencies still doesn't work in all cases. For example, the following simple files cannot be compiled:

```
# base.in
requests
```

```
# dev.in
-c base.txt
moto
```
Running `pip-compile base.in` results in

```
certifi==2019.11.28  # via requests
chardet==3.0.4       # via requests
idna==2.9            # via requests
requests==2.23.0     # via -r base.in
urllib3==1.25.8      # via requests
```
Now `pip-compile dev.in` aborts with an error:

```
Could not find a version that matches idna<2.9,==2.9,>=2.5 (from -c base.txt (line 9))
Tried: 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.1, 2.0, 2.0, 2.1, 2.1, 2.2, 2.2, 2.3, 2.3, 2.4, 2.4, 2.5, 2.5, 2.6, 2.6, 2.7, 2.7, 2.8, 2.8, 2.9, 2.9
There are incompatible versions in the resolved dependencies:
  idna==2.9 (from -c base.txt (line 9))
  idna<2.9,>=2.5 (from moto==1.3.14->-r dev.in (line 2))
```
A solution exists, though: just use `idna==2.8`.
## Describe the solution you'd like
In my opinion, a solution to the layered dependencies problem requires compiling the outermost layer first: then the resolver has a chance to find a solution for all dependencies. A working solution is the following:
```
# base.in
requests
```

```
# dev.in
-r base.in
moto
```

```
# constraint.in
-c dev.txt
```
Now I can generate correct dependency sets with

```
pip-compile dev.in
pip-compile base.in constraint.in --output-file base.txt
```
The `constraint.in` file is necessary because otherwise the second `pip-compile` will generate versions that differ from those in `dev.txt`.
What do you think of this solution? How about adding a `-c/--constrain` option to `pip-compile` that acts as if I had added a `constraint.in` file like the one above? Then I could run

```
pip-compile dev.in
pip-compile -c dev.txt base.in
```

with the same effect as above.
Hello @MartinAltmayer,
Thanks for the feedback on the layered workflow! Actually, requirements can be compiled without `constraint.in` using the `-r base.txt` approach:
```
# base.in
requests
```

```
# dev.in
-r base.txt
moto
```

```
$ pip-compile base.in
$ pip-compile dev.in
```
The only downside is that `dev.txt` will contain all dependencies from `base.txt`. However, there is another way to compile with `-c base.txt`, where the `idna` package can be pinned manually to version `2.8` using the `--upgrade-package` option. See:
```
# base.in
requests
```

```
# dev.in
-c base.txt
moto
```

```
$ pip-compile base.in
$ pip-compile base.in --upgrade-package=idna==2.8  # pin manually to 2.8
$ pip-compile dev.in
```
What do you think?
Since `pip-tools` provides several ways to help pip's resolver, I'd prefer not to add a new option, but I'd love to hear other opinions.
Thanks for your quick response!
The `-r base.txt` approach gives the same error.
Manually pinning the version to 2.8 would certainly work. However, in large projects that could mean pinning many different dependencies to compatible versions. Isn't this exactly the work pip-compile should do for us?
I understand that you hesitate to add more options to pip-compile and would also be interested in other opinions or proposals.
Another funny way to get constraints behavior is to compile the "outer" one, then copy its output over the (potential or existing) output of the inner one.
```
# base.in
requests
```

```
# dev.in
-r base.in
moto
```

```
$ pip-compile dev.in
$ cp dev.txt base.txt
$ pip-compile base.in
```
Thanks @MartinAltmayer for sharing this issue - it's good food for thought.
The workflow for layered requirements operates "above" the scope and responsibility of `pip-tools`. For example, I use GNU Make to control `pip-tools` and build multiple requirements files. Others use tools like pip-compile-multi. The examples above use the shell to orchestrate the ordering of calls to `pip-tools`, the copying of files, etc.
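To make that concrete, here is a minimal Make sketch of that kind of orchestration (not from the original comment; the file names and flags are illustrative assumptions, and layering is expressed only through the `-c`/`-r` lines inside the `.in` files):

```make
# Sketch: rebuild any requirements .txt from its matching .in via pip-compile.
.PHONY: all
all: base.txt dev.txt

# dev.in references base.txt, so base.txt must be compiled first.
dev.txt: base.txt

%.txt: %.in
	pip-compile --output-file $@ $<
```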
In #398 we've been lucky enough to reach a consensus that `-c` constraints passed to `pip` are the official way to invoke `pip-tools` when compiling a layer within a multi-layer requirements system. This means that `pip-tools` can continue to focus on compiling a single requirements file at a time, and that responsibility for managing the separate layers remains outside of `pip-tools`' scope.
I recommend against implementing the suggested feature of adding `-c/--constrain` to `pip-tools`. At first glance, in the example above it is being used to implement a circular dependency, which hides the real solution to these layered builds. In these cases of conflict (and maybe in all cases, for safety / reliability), each layer may need to be rebuilt with the dependencies of the others included.
Instead, here's an example of a solution using `make`. It builds the dependencies for the entire system first, and then builds each layer constrained by that system. Given that both `base.in` and `dev.in` are now constrained by `system.txt` as follows:
```
# base.in
-c system.txt
requests
```

```
# dev.in
-c system.txt
moto
```
then this `all` recipe can successfully compile the example given above, leaving `idna==2.8` in `base.txt`:
```make
all:
	rm -f system.in
	echo "" > system.txt
	cat *.in > system.in
	# Build requirements for the whole system
	pip-compile system.in
	# Build requirements for each layer, constrained by the whole system
	pip-compile base.in
	pip-compile dev.in
```
Therefore, my guess is that this issue should be closed - not because it isn't a valid concern (it is), but because it's outside the scope of `pip-tools`' functionality.
My current wondering is whether this should be handled in documentation. :thinking:
Thanks, @jamescooke, for an excellent explanation! I had a similar intuition in mind but couldn't articulate my thoughts. Also, that `system.in` trick is cool!
> My current wondering is whether this should be handled in documentation.
I would refrain from opinionating the documentation too much.
For example, if we include that `system.in` trick in the docs, a lot of people will do it and potentially overcomplicate their setups. But in reality, one could also make an argument that `base.in` should be prioritized (that is, not compromised) in terms of dependency up-to-dateness, and that development dependencies should be kept in check manually.
Thanks @Ampretuzo :+1:

> I would refrain from opinionating the documentation too much.

I agree.
Just to clarify my last comment - I'm not suggesting that the documentation should be opinionated about a single way that `pip-tools` should be used. Instead, I think it might be helpful if it signposts particular ways that `pip-tools` could be used to solve common problems, in a recipe / cookbook manner.
Even then, there isn't a one-shot recipe that will solve all scenarios - especially when it comes to updating requirements. As with most things there are trade-offs, but using the documentation to highlight potential pitfalls can make the tool more user-friendly.
@jamescooke IIUC, your solution above forgets the pinned versions of any previously existing `base.txt` and `dev.txt`, since `system.txt` is generated afresh each time.
Now, suppose we have the following files:

```
# base.in
requests
```

```
# dev.in
moto
```

```
# base_with_c.in
-c system.txt
requests
```

```
# dev_with_c.in
-c system.txt
moto
```
Then we can do the following:

```
# (script to create the *_with_c.in files from the regular .in files)
pip-compile base.in dev.in -o system.txt
pip-compile base_with_c.in -o base.txt
pip-compile dev_with_c.in -o dev.txt
```
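The first line above leaves that helper script unspecified; one possible sketch (assuming the file names from this example) is a small shell loop that prepends the constraint line:

```
# Sketch: derive <name>_with_c.in from <name>.in by prepending "-c system.txt".
for f in base dev; do
    { echo "-c system.txt"; cat "$f.in"; } > "${f}_with_c.in"
done
```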
Of course, it would be nicer if one could get rid of the `*_with_c.in` files and simply do

```
pip-compile *.in -o system.txt
pip-compile -c system.txt base.in
pip-compile -c system.txt dev.in
```
Any solution allowing pip to selectively see a `-c` in a given input file would do the trick here, but a `-c` switch to `pip-compile` seems to be the simplest possibility.
> IIUC, your solution above forgets the pinned versions of any previously existing base.txt and dev.txt
Yep 👍🏻 - it's an omission on my part: I didn't state that I would have expected them to actually be deleted before updating. Sorry for that.
Back when I suggested the workaround in the comment above, my experience was that leaving `.txt` files in place when updating layered requirements created undue complexity when trying to reason about updates. In general, the Make recipes that I write usually have a `make clean` target which removes all `.txt` files, and this clean recipe is run before the packages are updated.
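Such a `clean` target might look like this (a minimal sketch, assuming the compiled `.txt` files sit alongside the `Makefile`):

```make
.PHONY: clean
clean:
	rm -f *.txt
```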
In the two years since writing that workaround, I've not hit a situation where I've needed to leave the `.txt` files in place when updating.
A couple of issues with using pip's constraint support:

- If you have a requirement with an extra, it fails (at least in v22.2.2) with:
  ```
  ERROR: Constraints cannot have extras
  ```
- It's not valid to use in `pyproject.toml` dependencies; you get an error that all dependencies must be in PEP 508 format.
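To illustrate the first point, this is roughly how the failure reproduces with plain pip (a sketch; `myapp` and the pinned requirement are made-up placeholders, while the error message is the one quoted above):

```
$ cat constraints.txt
requests[socks]==2.28.1

$ pip install -c constraints.txt myapp
ERROR: Constraints cannot have extras
```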
It's not pretty, but this is currently working for me with `pyproject.toml` and constraints:

```
# generate requirements.txt
pip-compile --generate-hashes --output-file requirements.txt --resolver backtracking --strip-extras pyproject.toml

# generate requirements-dev.txt for [dev] extras
echo "--constraint $(pwd)/requirements.txt" | \
  pip-compile --generate-hashes --output-file requirements-dev.txt --extra dev --resolver backtracking - pyproject.toml
```
Yet another workaround:

- I wanted to use `pyproject.toml`.
- I wanted to place all compiled requirements in a `requirements/` subfolder.
- I didn't want to clutter the root directory with `dev.in`, etc. just to define constraints.
One problem with @ipmb's approach is that the annotations will show up as

```
foo==1.0.0
    # via
    #   -r -
    #   bar
```

which makes it hard to trace where each requirement comes from.
Instead, I simply put a `dev.in` and a `test.in` in my `requirements/` subdirectory with the following content:

```
# ./requirements/test.in
-c base.txt
```

```
# ./requirements/dev.in
-c base.txt
-c test.txt
```
Commands:

```
pip-compile pyproject.toml --output-file=requirements/base.txt --resolver=backtracking
pip-compile pyproject.toml requirements/test.in --extra=test --output-file=requirements/test.txt --resolver=backtracking
pip-compile pyproject.toml requirements/dev.in --extra=dev --output-file=requirements/dev.txt --resolver=backtracking
```
> How about adding a `-c/--constrain` option to `pip-compile`, that acts like I added a `constraint.in` file above?
PR #1936 adds a `-c` option to `pip-compile`. Any tests and reviews would be much appreciated.