Survey: Best Practices for Updating requirements.in
Hello fellow pip-tools users,
I'm new to pip-tools and would appreciate it if you could share how you update your requirements.in file. I'm currently updating it manually for a small project, but I'm entertaining the idea of adopting a (hopefully simple) process to automatically or semi-automatically update requirements.in -- an automation I expect could be quite convenient for using pip-tools in large projects.
- How are you updating `requirements.in` in your projects?
- Is there a command, or sequence of commands, to list packages that have yet to be captured in `requirements.in` so that the user may consider their inclusion in `requirements.in`?
Speaking of automatically listing Python packages in an environment, I understand there is the vital task of "ignoring" packages that are purely dependencies while surfacing only "top-level" packages. E.g., conda-minify analyzes the dependency graph of a given Python environment and extracts only the "top-level" packages.
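For pip-based environments, a rough starting point (plain pip, not a pip-tools feature) is to list only the packages that no other installed package depends on:

```console
# Show installed packages that are not required by any other installed package
pip list --not-required --format=freeze
```

The output still needs manual curation (it can include build tools and the like), but it approximates the "top-level" set you might seed requirements.in with.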
Discussions here could possibly be fed into the documentation? I noticed #1794 calling for more detailed documentation 😄
I'm the maintainer of https://github.com/indico/indico and here's what I do:
> How are you updating requirements.in in your projects?
I think automating it is a terrible idea. You want to check the changelogs of your dependencies, especially in case of major version bumps.
Personally, what I do is run `pip-compile -U` every few weeks, then go through the updates, checking the individual changelogs. I then pin any updates I want to postpone to the previous version (in the requirements.in file) and compile again, of course. Eventually I install the new versions, run the tests, and possibly do some manual testing, depending on what I saw in the changelogs.
Even for a big project (about 70 lines in requirements.in) this is not too time-consuming, and I feel much more confident that I'm not introducing weird bugs by doing it that way.
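As a rough sketch, that periodic routine could look like this (the pinning step is manual, and the pytest call just assumes you have a test suite):

```console
# Upgrade all pins in requirements.txt to the latest allowed versions
pip-compile -U requirements.in

# Review the diff and read changelogs for anything that jumped
git diff requirements.txt

# For updates you want to postpone, add a bound in requirements.in
# (e.g. somepackage<3), then re-compile so requirements.txt respects it
pip-compile requirements.in

# Install the result and run the tests
pip-sync requirements.txt
pytest
```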
> Is there a command, or sequence of commands, to list packages that have yet to be captured in requirements.in so that the user may consider their inclusion in requirements.in?
I'd start with an empty one, add the dependencies I'm aware of (e.g. by checking imports), then compile and install (in an empty virtualenv). If things fail, it should ONLY be because of missing extras or some rare cases where a package is not imported directly but via entry points. Easy to fix.
And from that point on you NEVER install a new package manually - you ALWAYS add it to requirements.in, then run pip-compile, and only then install to your env from requirements.txt.
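Day to day that flow is only a few commands (a sketch; the package name is just an example):

```console
# 1. Declare the new dependency
echo "httpx" >> requirements.in

# 2. Re-resolve the full pinned set into requirements.txt
pip-compile requirements.in

# 3. Make the virtualenv match requirements.txt exactly
pip-sync requirements.txt
```

`pip-sync` also uninstalls anything no longer in requirements.txt, which keeps the environment honest.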
I appreciate you sharing how you use pip-tools, @ThiefMaster. After reading your comment, things make more sense. Prior to this, the unhelpful mental model I held as someone new to pip-tools was to rely mainly on conda (or another environment manager) and only turn to pip-tools when confronted with a need to distribute the Python environment (say via requirements.in, requirements.txt). I think the right mental model is to use pip-tools primarily by editing requirements.in and using pip-tools commands to install/remove packages (basically synchronizing the environment).
I have a similar question regarding best practices.
Under the readme:
> The pip-compile command lets you compile a requirements.txt file from your dependencies, specified in either pyproject.toml, setup.cfg, setup.py, or requirements.in.
So which is the preferred way for a Django project, a Django plugin, or a regular Python library?
- pyproject.toml
- setup.cfg
- setup.py
- requirements.in
Thank you
@simkimsia
I think most are in agreement that if you can avoid the setup.* files, do so, and that if you can use pyproject.toml, do so.
There is no consensus on my own preference, which is to always have a requirements.in and to script injecting its contents into pyproject.toml if that file exists (a rough sketch below).
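For what that injection could look like, here's a minimal sketch assuming tomlkit is available and the loose requirements belong under project.dependencies; the file names and the helper function are illustrative, not a pip-tools feature:

```python
from pathlib import Path

import tomlkit  # third-party TOML library that preserves formatting


def inject_requirements(req_in="requirements.in", toml_path="pyproject.toml"):
    """Copy the loose requirements from requirements.in into project.dependencies."""
    deps = [
        line.strip()
        for line in Path(req_in).read_text().splitlines()
        # Skip blanks, comments, and pip-tools directives like -c/-r
        if line.strip() and not line.lstrip().startswith(("#", "-"))
    ]
    doc = tomlkit.parse(Path(toml_path).read_text())
    doc["project"]["dependencies"] = deps
    Path(toml_path).write_text(tomlkit.dumps(doc))


if __name__ == "__main__":
    inject_requirements()
```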
Hi, I was also wondering about this but haven't come to a conclusion yet.
So for my projects, I like to install them in develop mode for developing (duh!), from setup.py or, more recently, pyproject.toml. I like to have my project code available as an absolute import.
So for adopting pip-tools, my impression is that I would now need three files:
- requirements.in: To define all dependencies without version
- requirements.txt: To define all dependencies with locked version
- pyproject.toml: To include the dependencies from requirements.txt
Does that make sense?
I've never worked with a larger project where you freeze requirements like that, so it seems a bit weird to spread everything over 3 files.
@aranvir
You probably wouldn't want to include the contents of requirements.txt in pyproject.toml, which should be as liberal as possible with its version requirements.
You also don't necessarily need requirements.in, as you can compile pyproject.toml to requirements.txt.
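For example, something like this should work (assuming a [project] table in pyproject.toml, and a dev extra for the second command):

```console
# Pin everything declared in [project.dependencies]
pip-compile -o requirements.txt pyproject.toml

# Pin the dev extra into a separate lock file
pip-compile --extra dev -o dev-requirements.txt pyproject.toml
```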
You may find it useful to use more files, for example a local-requirements.txt like:

```
-e .[dev]
-c requirements.txt
```
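If I read that right, you'd then install it directly with pip, since requirements files support -e and -c lines:

```console
# Editable install with the dev extra, constrained to the pinned versions
pip install -r local-requirements.txt
```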
Hi @AndydeCleyre, thanks for the feedback. I played around with it a bit and got a better feeling for it.
I think my question now is: how do I use this when building a wheel? If I only define high-level dependencies in pyproject.toml and have the pinned versions in requirements.txt, my build process will not consider them, right? Or does it depend on the build system? Or is it just the wrong use case, ergo: if I want to build a wheel, I cannot use pip-tools for pinning?