Install dependencies from pyproject.toml with -r, like from requirements.txt (again)
What's the problem this feature will solve?
If you don't want to install a package but you do want to install its dependencies, there's no option short of maintaining a second dependency list (a requirements.txt), when all the dependencies are already listed in pyproject.toml. This means people working on a library must either duplicate dependency lists (common) or generate on-demand requirements.txt files that aren't checked in (also common), when what we really need is a pip install -r ./pyproject.toml that works like pip install -r ./requirements.txt.
Currently, repos ship pyproject.toml files instead of requirements.txt files. The problem is that without a requirements.txt, there's no way to install just the requirements for these libraries as you work on them. Several repos have awkward systems trying to have it both ways, or simply force developers to write their own requirements.txt files if they want to develop the library. Examples:
- https://github.com/psf/black
- no requirements.txt file
- https://github.com/pydantic/pydantic
- no requirements.txt file
- https://github.com/pytest-dev/pytest
- no requirements.txt file
- https://github.com/pypa/pip (lol)
- no requirements.txt file
And then you have other repos that bend over backwards to support pip and pyproject.toml. FastAPI has one pyproject.toml file, and 7 requirements.txt files in its root directory. Yikes!
Describe the solution you'd like
Expected interface:
pip install -r ./pyproject.toml
pip install -r ./pyproject.toml[dev]
Result:
- install the dependencies defined in pyproject.toml
- install the dependencies defined in pyproject.toml, plus dev dependencies
The advantage here is that we don't need a separate requirements.txt file, or several, and we can as a community standardize on one place to have dependencies listed.
Alternative Solutions
--deps-only doesn't seem to work in this case
The original recommendation, adding pip-tools as an additional dependency, is not what we want either. The whole point is a working tool chain with batteries included:
$ pip install pip-tools
$ python -m piptools compile \
-o requirements.txt \
pyproject.toml
$ pip install -r requirements.txt
Don't forget the chatgpt oneliners:
python -c "import tomllib; deps = tomllib.load(open('pyproject.toml', 'rb'))['project']['dependencies']; open('requirements.txt', 'w').write('\n'.join(deps))"
Additional context
pyproject.toml is gaining traction as both a sane and popular way to define a Python library, a rare gem in any large community. However, with its popularity we're seeing standardization, and the new standard is to include pyproject.toml and to skip requirements.txt.
The issue I found on this subject, Support installing requirements from arbitrary keys in pyproject.toml #8049, is not what we're talking about, as pip install -r pyproject.toml --key tool.enscons.install_requires was already rejected, and anyway that interface is not the greatest.
https://github.com/pypa/pip/issues/11584 was opened, but closed in favor of https://github.com/pypa/pip/issues/8049. https://github.com/pypa/pip/issues/8049 was closed for reasons described in the thread.
We still have plenty of Python repos with only a pyproject.toml file, and we need to either generate our own requirements.txt files or we need pip to support installing those dependencies directly.
Code of Conduct
- [x] I agree to follow the PSF Code of Conduct.
I appreciate this is a feature you would like, but to gain any traction you need to explain:
- Why installing local packages with pip install . and with extras like pip install .[dev] doesn't work for you
- What use case you have that isn't described by the existing issues you link to; as best as I can tell your use case is the same as https://github.com/pypa/pip/issues/11584
The only thing I'm aware of that has changed between now and when those issues were created is dependency groups from pyproject.toml files are now supported: https://pip.pypa.io/en/stable/user_guide/#dependency-groups
This is installing dependencies, not the package!
You can install the package just fine, but if you want to install only the dependencies you need a requirements.txt file or other shenanigans. I'll update my issue description.
Yes, this is an already discussed use case in the linked issues.
The feature request for installing only dependencies is covered by https://github.com/pypa/pip/issues/11440, which is what https://github.com/pypa/pip/issues/8049 was closed in favor of. And https://github.com/pypa/pip/issues/11584, which this issue appears to be a duplicate of, was consolidated into https://github.com/pypa/pip/issues/8049.
Is there anything new here, or should we just mark this as a duplicate of https://github.com/pypa/pip/issues/11584?
For starters, I don't think anyone assumed pyproject.toml would be as popular as it is now, and the fractured dependency lists between pyproject.toml and requirements.txt are now a more pressing issue for pip to address. That has changed, and it appears to be a positive change: a good tool is gaining popularity and fixing issues with the Python ecosystem!
Second, https://github.com/pypa/pip/issues/11584 was closed as a subset, not as a duplicate, and it is not clear whether it was rejected on its own standing or whether the superset feature was closed for other reasons. Saying it was simply a duplicate is imprecise and muddies the issue. Since #11584 was considered a subset of a feature that was closed, we should reconsider this functionality on its own and give it its own ruling. Honestly, it's a very good idea that most people expect to 'just work' as pyproject.toml popularity continues to grow.
I don't think anyone assumed pyproject would be as popular as it is now
Not related to this issue, but pyproject.toml was proposed as the standard for projects; the aim was for 100% of projects to eventually use it, so from that perspective it's not surprising.
Second, https://github.com/pypa/pip/issues/11584 was closed as a subset, not as a duplicate, and it is not clear that it was rejected on its own standing
Fair enough, in terms of extracting requirements from a pyproject.toml there are a few main issues with adding this to pip as I know it:
This only works for completely static dependencies in a PEP 621-compliant configuration; if there are dynamic dependencies this will have to fail or result in installing the wrong thing. That's why https://github.com/pypa/pip/issues/11440 is considered a better solution: it can get the correct dependencies in all situations.
Users have multiple existing workarounds. For example, running install and then uninstall (e.g. pip install foo, pip uninstall foo). Or by extracting the dependencies themselves, e.g. python -c "import tomllib; print(*tomllib.load(open('pyproject.toml','rb'))['project']['dependencies'], sep='\n')" > .temp; pip install -r .temp, which could become a true one-liner if https://github.com/pypa/pip/issues/7822 is implemented.
Also, a proper UX would need to be designed; personally I don't think hijacking the existing -r flag is the best approach. Now that dependency groups have been implemented, I would expect a similar UX to extract dependencies and additional dependencies. Somebody needs to design and implement this in a way the maintainers are happy with.
FWIW I am not against this, but I do think actual real use cases should be provided. You've said that you want to install dependencies but not the project, but why? Why are the workarounds not sufficient? And why would this not be better served by https://github.com/pypa/pip/issues/11440? The justification is required partly because pip is currently an all-volunteer project, and maintainers only have a very limited amount of time to review PRs and support features.
For starters, I don't think anyone assumed pyproject would be as popular as it is now
On the contrary, pyproject.toml was defined as the standard for project metadata, so it's not surprising it's used everywhere. What has changed is that people are using it for projects that aren't intended to be built into a redistributable wheel. That's not the originally intended use, and as of now such usage hasn't been standardised (although there have been some discussions, such as here). Feature requests like this come mainly from that usage.
In situations like this, where there are no established standards, pip deliberately avoids innovating on behaviour. That sometimes means that we are slow to adopt emerging conventions, but that's fine - people who want more innovative approaches can use one of the existing workflow tools like uv, PDM, Poetry or hatch.
The problem with "simply" allowing pip to read dependencies from pyproject.toml is that the semantics of the [project.dependencies] metadata isn't necessarily appropriate for a standalone project. Restrictions that are needed when building a wheel can be problematic for a standalone project. For example, it's not possible to specify a dependency on another local project (in a directory on your machine) because that's not portable - but it's a common need for standalone projects. Hence the reason this needs to be properly discussed and agreed in a tool-independent way - it would be a nightmare if you had to specify such things one way for pip, a different way for uv, yet another way for PDM, etc...
I have no idea what the likely timescale is for standardisation. Tools like PDM, uv, Poetry and hatch have good real-world experience to bring to the discussions, but standardisation takes time, and everyone involved is a volunteer, so we're dependent on people having an interest in working on the subject. And right now, interest in metadata for standalone projects seems to have died down.
#11440 and #8049 were both opened a while ago and there have been at least two new PEPs, along with support in pip:
- PEP 735 – Dependency Groups in pyproject.toml
- PEP 751 – A file format to record Python dependencies for installation reproducibility
On the one hand, dependency groups are supposed to be coherent sets of dependencies that make sense as a specification of "things that should be installed simultaneously in at least some cases". They go beyond project extras and represent increased flexibility in pyproject.toml that can be useful for developers.
On the other hand, now we have lock files, and pip has experimental support for writing them and plans for installing from them. I generally agree that there are good reasons not to install directly from a list of abstract, unresolved dependencies given in a pyproject.toml file - even if that's exactly what you'd normally put directly on a pip command line.
What I'd like to see, therefore, is functionality to generate a lockfile from a dependency group (or from project dependencies or optional dependencies). Have the solver figure out what concrete dependencies are needed, and record them for later use. Then for people who do want to install directly from abstract dependencies, it would be more easily scriptable, without messing around with tools like jq etc. plus the lockfile itself would be a useful artifact.
I personally do have at least two use cases for this kind of installation:
- I personally really dislike editable wheels: there's too much argument over what should actually go in them, they don't really solve the problem of keeping the "installed metadata" up to date (and one rarely cares about that anyway IMX), and most of all they're just inefficient to work with. I'd rather just install the project dependencies in a project-development venv, and then add a .pth file for the project itself.
- Installing the build-system.requires in a persistent build environment would allow me to disable build isolation; again more efficient, works offline (even without #8057), etc.
My own inclination is that installing from pyproject.toml directly should throw when confronted with dynamic dependencies. The main reason --deps-only is unappealing is that the threat of build-time side effects is too high, even for --deps-only proposals, since any dynamic dependency resolution would by definition require spinning up build machinery that may or may not be installed, and may or may not otherwise modify the system.
Build systems that make use of dynamic = ['dependencies'] seem rare. Another unscientific poll of major Python libraries using pyproject.toml makes it look like static deps are the norm, and this change could let the community walk away from requirements.txt entirely:
- FastAPI (static deps)
- Black (static deps)
- Pytest (static deps)
- Pydantic (static deps)
- Django (static deps)
- Flask (static deps)
- Pandas (static deps)
- Scikit-learn (static deps)
- Scrapy (static deps)
- Sphinx (static deps)
- Matplotlib (static deps)
- Rich (static deps)
- Poetry (static deps)
Then there are exceptions to this rule, though still with no explicit dynamic dependencies declarations:
- pip (no dependencies block in pyproject.toml, has the build-project directory with requirements.txt files in it, although those are build time deps)
- Requests (no dependencies block, 'requires' in setup.py has static deps)
- Celery (no dependencies block, has a "requirements" folder with very dynamic deps defined in setup.py, and appears to be going through the steps of a migration to pyproject.toml)
- numpy (no dependencies block, but has a requirements folder chock-full of text files that are referenced dynamically)
It appears that 'version' and 'readme' are the most common dynamic fields, but I was at a loss to find dynamic = ['dependencies'].
I've written dynamic dependencies into libraries many times in the past, but the reason was always... so I could consolidate all dependency definitions into a ./requirements.txt file for when I want to install dependencies with pip! Hence my obvious opinionation on the subject.
I'll be honest, the python.org conversation about pyproject.toml and projects that aren't meant to generate a wheel gives me pause, as I am behind the curve on the intentions and rollout of pyproject.toml (I let my jaded side get the better of me, thinking it was another standardization attempt that wouldn't stick! Mea culpa). In practice, I have been running into exactly what is being discussed there: projects that are never expected to ship a wheel but are still trying to do the best-practices thing, using the modern pyproject.toml file, correctly or not, alongside requirements.txt!
Isn't "A file format to record Python dependencies for installation reproducibility" https://peps.python.org/pep-0751/ kind of what you want to achieve @tim-win? I believe support for it is coming, it's part of the standard, and a lot of tools already support it, so at any point you can "export" your dependencies from your project fairly easily. It's intended for "reproducible installs", so it's much closer to requirements.txt. That would leave things nicely separated: pyproject.toml is mainly targeted at development and allows for dynamicness of various sorts, while the PEP 751 standardized format is something projects could not only store for local development but also publish as the way to install the "golden" set of dependencies.
No! I'm working toward fewer requirements files, not more!
No! I'm working toward fewer requirements files, not more!
Sure, you can use a screwdriver to get a nail into the wall, but generally a hammer is better.
Build systems that make use of dynamic = ['dependencies'] seem rare
Unfortunately the vast majority of build systems are not public; they live in private organizations' repos, so looking at how build systems work for popular PyPI packages and concluding that this is the most common case isn't valid.
It appears that 'version' and 'readme' are the most common dynamic fields, but I was at a loss to find dynamic = ['dependencies'].
The examples you have cited are effectively using dynamic = ['dependencies'], but they are using the setuptools.build_meta:__legacy__ backend to source the metadata rather than explicitly listing the keys as dynamic, as defined by PEP 517: https://peps.python.org/pep-0517/#source-trees
But even if static dependencies are 99.99% of all cases, why should this be implemented instead of https://github.com/pypa/pip/issues/11440? Which would solve 100% of all cases, unless there is some use case I am missing?
I'm a user of the tool stack and maintainer of a few projects of medium - high complexity. I would like a syntax that's simple, obvious and easy to work with, and I'd like to declare my dependencies in one place and have that serve the whole build chain and dev lifecycle. For me, simple and obvious is way better than semantically consistent and fiddly.
pip install --dependencies
pip install --optional feature1 --optional feature2
pip install --optional *
pip install --group devtools --group compliance
pip install --group *
The proposed syntax, pip install --only-deps ".[feature1, feature2]", to me is a bit less clear. It's worth remembering, a lot of times the build system is set up by a dev who's not familiar with the tool and needs to get it done in a hurry - making it as easy as possible to get right will save countless hours of dev time and frustration!
For my projects, I have duplicated dependencies into a group called "core", removed the versions and used a constraints file.
pip install --group core --constraint=constraints.txt
Simple enough, and with no versions the blocks don't change that much. As an extra benefit I can share a constraints file across a set of related packages.
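Sketching out that layout (a hypothetical example; the package names and group contents are illustrative, not taken from a real project):

```toml
# pyproject.toml
[project]
name = "demo"
dependencies = ["requests>=2.31", "click>=8"]

[dependency-groups]
# Unpinned duplicates of the runtime deps; exact pins live in constraints.txt,
# which can be shared across a set of related packages.
core = ["requests", "click"]
devtools = ["pytest", "ruff", {include-group = "core"}]
```

With a constraints.txt pinning requests and click, pip install --group devtools --constraint=constraints.txt then installs the dev tooling plus constrained runtime deps, without building or installing the project itself.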
Maybe we can think of pip install -r pyproject.toml[test] as a variant of pip install -e .[test].
So we get:
pip install -r pyproject.toml[test]
pip install -r --only-deps pyproject.toml[test]
pip install -r --only-build-deps pyproject.toml[test]
pip install -r sub_project/pyproject.toml[test]
There is now a community consensus to centralize project settings into pyproject.toml. Using -r pyproject.toml makes it clearer where the setup data comes from.
Problems with pip install -e .[test]
- Explicit is better than implicit.
- Can't install a sub-project in editable mode from pyproject.toml (without changing CWD)
@rexzhang I think the converging consensus is to use group dependencies, not optional dependencies, to specify non public dependencies such as dev, test, and type hinting: https://packaging.python.org/en/latest/specifications/dependency-groups/
Pip already supports installing dependency groups directly: https://pip.pypa.io/en/stable/cli/pip_install/#cmdoption-group
@rexzhang I think the converging consensus is to use group dependencies, not optional dependencies, to specify non public dependencies such as dev, test, and type hinting: https://packaging.python.org/en/latest/specifications/dependency-groups/
Pip already supports installing dependency groups directly: https://pip.pypa.io/en/stable/cli/pip_install/
~Thanks for your response; the problem I'm currently having is that I don't know how to refer from [dependency-groups] to [project.optional-dependencies].~
Sorry, I got it. In CI, install the application dependencies and the test dependencies by running pip install twice.
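For reference, the two-step CI install might look something like this (a sketch assuming a recent pip with --group support; the group name "test" is illustrative):

```shell
pip install .             # the application plus its runtime dependencies
pip install --group test  # the test-only group from [dependency-groups]
```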