Directory with an empty pyproject.toml file is built into an sdist+wheel by setuptools 64
setuptools version
setuptools==64.0.1
Python version
3.10.6
OS
RHEL 7
Additional environment information
No response
Description
A directory with an empty pyproject.toml file is built into an sdist+wheel by setuptools 64
Expected behavior
Setuptools should refuse to build a project that does not have a name and version.
How to Reproduce
- touch pyproject.toml
- tree
- cat pyproject.toml
- python -m build
Output
❯ touch pyproject.toml
❯ tree
.
└── pyproject.toml
0 directories, 1 file
❯ cat pyproject.toml
❯ python -m build
* Creating venv isolated environment...
* Installing packages in isolated environment... (setuptools >= 40.8.0, wheel)
* Getting dependencies for sdist...
running egg_info
creating UNKNOWN.egg-info
writing UNKNOWN.egg-info/PKG-INFO
writing dependency_links to UNKNOWN.egg-info/dependency_links.txt
writing top-level names to UNKNOWN.egg-info/top_level.txt
writing manifest file 'UNKNOWN.egg-info/SOURCES.txt'
reading manifest file 'UNKNOWN.egg-info/SOURCES.txt'
writing manifest file 'UNKNOWN.egg-info/SOURCES.txt'
* Building sdist...
running sdist
running egg_info
writing UNKNOWN.egg-info/PKG-INFO
writing dependency_links to UNKNOWN.egg-info/dependency_links.txt
writing top-level names to UNKNOWN.egg-info/top_level.txt
reading manifest file 'UNKNOWN.egg-info/SOURCES.txt'
writing manifest file 'UNKNOWN.egg-info/SOURCES.txt'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md
running check
warning: check: missing required meta-data: name
creating UNKNOWN-0.0.0
creating UNKNOWN-0.0.0/UNKNOWN.egg-info
copying files to UNKNOWN-0.0.0...
copying pyproject.toml -> UNKNOWN-0.0.0
copying UNKNOWN.egg-info/PKG-INFO -> UNKNOWN-0.0.0/UNKNOWN.egg-info
copying UNKNOWN.egg-info/SOURCES.txt -> UNKNOWN-0.0.0/UNKNOWN.egg-info
copying UNKNOWN.egg-info/dependency_links.txt -> UNKNOWN-0.0.0/UNKNOWN.egg-info
copying UNKNOWN.egg-info/top_level.txt -> UNKNOWN-0.0.0/UNKNOWN.egg-info
Writing UNKNOWN-0.0.0/setup.cfg
Creating tar archive
removing 'UNKNOWN-0.0.0' (and everything under it)
* Building wheel from sdist
* Creating venv isolated environment...
* Installing packages in isolated environment... (setuptools >= 40.8.0, wheel)
* Getting dependencies for wheel...
running egg_info
writing UNKNOWN.egg-info/PKG-INFO
writing dependency_links to UNKNOWN.egg-info/dependency_links.txt
writing top-level names to UNKNOWN.egg-info/top_level.txt
reading manifest file 'UNKNOWN.egg-info/SOURCES.txt'
writing manifest file 'UNKNOWN.egg-info/SOURCES.txt'
* Installing packages in isolated environment... (wheel)
* Building wheel...
running bdist_wheel
running build
installing to build/bdist.linux-x86_64/wheel
running install
running install_egg_info
running egg_info
writing UNKNOWN.egg-info/PKG-INFO
writing dependency_links to UNKNOWN.egg-info/dependency_links.txt
writing top-level names to UNKNOWN.egg-info/top_level.txt
reading manifest file 'UNKNOWN.egg-info/SOURCES.txt'
writing manifest file 'UNKNOWN.egg-info/SOURCES.txt'
Copying UNKNOWN.egg-info to build/bdist.linux-x86_64/wheel/UNKNOWN-0.0.0-py3.10.egg-info
running install_scripts
creating build/bdist.linux-x86_64/wheel/UNKNOWN-0.0.0.dist-info/WHEEL
creating '/tmp/pgedam/dist/tmpmpmy7cyv/UNKNOWN-0.0.0-py3-none-any.whl' and adding 'build/bdist.linux-x86_64/wheel' to it
adding 'UNKNOWN-0.0.0.dist-info/METADATA'
adding 'UNKNOWN-0.0.0.dist-info/WHEEL'
adding 'UNKNOWN-0.0.0.dist-info/top_level.txt'
adding 'UNKNOWN-0.0.0.dist-info/RECORD'
removing build/bdist.linux-x86_64/wheel
Successfully built UNKNOWN-0.0.0.tar.gz and UNKNOWN-0.0.0-py3-none-any.whl
This is the same underlying problem as https://github.com/pypa/setuptools/issues/2329, and my position is the same on this topic as https://github.com/pypa/setuptools/issues/2329#issuecomment-1038403752.
I'll reiterate: Setuptools should not build such projects, and should error out instead in situations with no build configuration (whether there's a setup.py or not) -- requiring both a name and a version to be specified.
The current behaviour is confusing and frustrating to explain to users who hit it, and setuptools 64 already breaks test configurations for projects (pip's test suite is failing, since we have a test asserting that no setup.py plus a pyproject.toml file with no build configuration should fail to build / work with editable installs). Preserving backwards compatibility of borderline nonsensical behaviour that causes user confusion, while not holding other areas of setuptools to the same level of scrutiny, is a bad idea IMO.
One temporary solution we found in the meantime was to pin the setuptools version in the pyproject.toml file to 63.0.0.
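For illustration, a minimal sketch of what that pin could look like in pyproject.toml (the wheel requirement and the exact specifier are assumptions; only the 63.0.0 version comes from the workaround above):
[build-system]
# Stopgap: keep the pre-64 behaviour by pinning the build backend.
requires = ["setuptools==63.0.0", "wheel"]
build-backend = "setuptools.build_meta"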
Since our last discussion, I've encountered other use-cases where it's been valuable not to have to specify a name or a version for a package. I was troubleshooting a case where I needed to replicate a user's experience with how the package metadata was generated (dist-info vs. egg-info) and I needed a minimal setuptools project against which to test. The name and version of the generated package were irrelevant to the investigation. It was nice to be able to create a package and not have to supply arbitrary and irrelevant concerns, especially when those concerns are dealt with by higher-level abstractions.
Maybe you're right that Setuptools should have a "strict" mode or something similar, perhaps even enabled by default, where it doesn't supply default values.
Still, like I said in #2329 (and thanks for linking that discussion), I'd like to defer this work until after the distutils merge is complete, because the factors are complicated by backward compatibility that distutils supplies.
I'll reiterate that, while being able to omit these fields is convenient for you, a Python Packaging expert who is deeply familiar with setuptools and its failure modes, it's absolutely not clear to anyone who is not deeply familiar with setuptools, and a clear error message would be vastly more obvious and is a matter of two lines of text.
those concerns are dealt with by higher-level abstractions
They aren't, and I want to push back against this rationale, which you also used in the previous issue.
You need to either have a setup.py file or a pyproject.toml file. If either exists (or doesn't exist), pip will delegate to setuptools, which will happily build the project. Heck, if you don't have setuptools installed in an environment which has pip installed, it will delegate to setuptools, which will build an empty directory. To that end, setuptools will package up any directory. That's... bad, and not worth the convenience of allowing experts to omit two lines while writing quick-and-dirty tests.
Is there any chance that setuptools agrees to change this behaviour?
I would appreciate if I could get a clear answer to the above. It'll help me decide whether or not it's worth my time to spend energy on this papercut in Python.
Bumping this again -- is there any chance that setuptools would change this behaviour or is @jaraco's rationale [^1] something that all setuptools maintainers agree with?
[^1]: Which is, to be reductive, that omitting a couple of lines in some testing code is worth treating literally any directory as a valid Python project once it reaches setuptools' build logic, regardless of the user confusion this causes and the subtle failure modes it imposes on users.
Hi @pradyunsg, could you please expand a little bit more on which kind of user confusion this behaviour is causing?
There used to be a big source of confusion when older versions of setuptools (with no support for either setup.cfg or pyproject.toml) accidentally got used. Not because of a lack of name or version, but because the specific version of setuptools was incapable of reading it. Changing the behaviour of newer versions of setuptools will not help with that. (All the examples in #3765 seem to fall into this category).
My understanding is that the behaviour reported here is only triggered when a user or tool is explicitly trying to install/build an arbitrary directory. (I might be wrong here, and it would be nice to know if there are other situations).
With that in mind, my view is that if the user or tool is actively trying to install/build a given path, it is fair to assume the process is intended. Moreover, having an empty pyproject.toml, empty setup.cfg or a setup.py with a bare setup() call is even more evidence to support this assumption.
When combined with Setuptools' name autodetection feature, this behaviour is quite handy for incrementally adding packaging and tests to pre-existing code (which is a nice flow to teach).
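As a rough sketch of the flow described above (assuming pre-existing code in a flat layout with a single importable package, e.g. foo/__init__.py), the only file one needs to add is a minimal pyproject.toml; auto-discovery then picks up the package and infers the distribution name from it, with the version falling back to 0.0.0:
[build-system]
# Minimal configuration; packages and the name are auto-discovered.
requires = ["setuptools"]
build-backend = "setuptools.build_meta"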
Setuptools should refuse to build a project that does not have a name and version.
https://github.com/pypa/pip/issues/12333 is another example of what the failure modes look like to users when this isn't done.
Hi @pradyunsg, again this looks like an XY problem to me...
It would seem that the build environment in https://github.com/pypa/pip/issues/12333 is missing setuptools-scm. I don't understand how it relates with this issue.
I don't understand how it relates with this issue.
It's the same failure mode -- setuptools is proceeding without configuration for what the version should be. If it had a clean error about a missing version, it would've prompted the user to look into where the version comes from rather than show up on the pip issue tracker confused about why there's a version mismatch. The version mismatch exists because setuptools places an implicit 0.0.0 value into the built package.
To be honest with you, all the problems reported in https://github.com/pypa/setuptools/issues/3765#issuecomment-1379429275 and in https://github.com/pypa/pip/issues/12333 seem to be primarily related to the fact that the build environment is not set up correctly (or contains leaks). That is why my view is that this discussion has been an XY problem for a while.
We can debate whether, if setuptools had raised an error for the last example, it would have been easier for users to identify the problem[^1], but it is equally true that the same effect would be achieved if the frontend had identified that the build environment does not contain the correct build dependencies...
Most of the time the users do provide the correct configuration (in https://github.com/pypa/pip/issues/12333, the configuration files are impeccable); the problem is that the build environment does not contain the correct dependencies (or the correct versions of the dependencies) to process it[^2].
[^1]: My opinion is that it would not make it easier to understand, but actually worse: imagine setuptools raises a hypothetical ConfigError("Please provide a 'version' configuration"); the user would be more puzzled because they actually want setuptools-scm to find it out automatically.
[^2]: Or contain isolation leaks...
My opinion is that it would not make it easier to understand, but actually worse: imagine setuptools raises a hypothetical ConfigError("Please provide a 'version' configuration"); the user would be more puzzled because they actually want setuptools-scm to find it out automatically.
Yes precisely. That would be a good error message to push the user in the direction of asking why setuptools-scm is not doing its job. Much better than silent apparent success that’s actually a failure.
Yes precisely. That would be a good error message to push the user in the direction of asking why setuptools-scm is not doing its job. Much better than silent apparent success that’s actually a failure.
Sorry @merwok, my comment was actually in the opposite direction. I disagree that this would be a good error message.
I think it is not a practical approach for setuptools to go out of its way implementing defensive programming against improperly configured build environments. I also don't like to blame it on an empty pyproject.toml, because the reported issues are so far related to XY problems. An empty pyproject.toml seems to be working as expected, to be honest.
Maybe in the future we can implement a different way of using plugins, other than entry-points. That would be more explicit and facilitate raising errors without much need of defensive programming.
(I did understand, my comment was going in the opposite direction to your conclusion!)
Urged by pradyunsg at PyCon sprints, I'm revisiting this issue. I've gone back and re-read #2329 and re-reviewed #2817 (which originally implemented stricter validation but which I disabled). I'm tempted to just reverse my prior decision and accept whatever change was proposed. I believe reverting 650ff7f37915bdbcc50585c87c6eeb5626cb7595 would restore pganssle's original design (except that 18 tests fail due to the name being missing).
Incidentally, just before looking at this issue, I was taking another look at another issue and in that issue, I was relying on the default name/version to build a minimal package, where name and version were irrelevant to the concern at hand and a failure would only have distracted from the investigation.
After reverting 650ff7f37915, I tried building a project that relies on auto-discovery and resolves the package name from the Python package name, but it fails the validation:
draft @ tree
.
├── foo
│ └── __init__.py
└── pyproject.toml
2 directories, 2 files
draft @ cat pyproject.toml
[build-system]
requires=['setuptools']
build-backend='setuptools.build_meta'
draft @ py -m pip-run ~/code/pypa/setuptools build -- -m build --no-isolation -s
* Getting build dependencies for sdist...
error: Required package metadata is missing: {'name'}
ERROR Backend subprocess exited when trying to invoke get_requires_for_build_sdist
That means the hook for validation is firing too early (before the name can be inferred from the sources), justifying concerns about the placement of the hook point.
Moreover, even if we were able to apply this change in some form, it wouldn't address the failure mode reported in https://github.com/pypa/setuptools/issues/3511#issuecomment-1753225171, because that one was about an unspecified version, for which I feel even more strongly that 0 is a reasonable degenerate value.
I'm once again feeling reluctant to special-case name and version as a proxy for a misconfigured project. What about other mis-configuration? Should Setuptools also fail in these cases:
- Classifiers are present but their indicators mis-match other metadata (like license or supported Python versions).
- Classifier doesn't include Python as the programming language or no classifiers are present.
- Package data is present but the files aren't included in any package data declaration.
- Declared dependencies are never resolvable (disjoint specifications like x > y and x < y) or aren't currently resolvable given the packages available in PyPI.
- Requires-Python resolves to versions of Python that don't include the current Python.
- No readme file is provided (currently a warning).
- Project summary looks like a filename (or is a filename that exists), or there is no summary at all.
- "dependencies" is misspelled as "dependences".
Going back to https://github.com/pypa/setuptools/issues/3511#issuecomment-1753225171, I don't think it's the same failure mode as originally reported. This bug is about a "directory with an empty pyproject.toml". If we think narrowly about just that case, I think we agreed that installers should reject that case (they should limit falling back to the legacy backend to legacy cases, i.e. when only a setup.py is present). If users are adding a pyproject.toml, they shouldn't be relying on fallback to __legacy__. Or maybe setuptools should deprecate and enforce such usage.
I think I'd like to focus our efforts first on limiting where setuptools is inferred as the default backend. If we could remove support for setuptools as a default backend, that would fix the problem of an empty pyproject.toml (or one with no backend declared). That still wouldn't address the broader issue of inferring a broken build based on config values. I have some ideas about how we might make it easier to deprecate the default backend that I'll bring up in a separate discussion (aside: does uv also fall back to setuptools legacy?).
Here's what I propose (plan):
- Limit where setuptools is the default backend.
- Identify the correct hook for validating metadata (late, after plugins and other dynamic behavior).
- Make the validation opt-out (through PEP 517 hook config or environment variables) so that tools and developers can easily get degenerate values (e.g. SETUPTOOLS_CONFIG={strict,degenerate}).
- Opt out in the test suite.
- Implement validation of metadata to require a name and version (no more degenerate values for name or version).
@pradyunsg What do you think of the plan?
@jaraco I am not so sure about this proposal...
What I would like to understand better before we proceed is: what are the real issues with the approach that setuptools currently implements and how specifically are they problematic?
As far as I understand, some problems that have been attributed to the current behaviour are typically unrelated and caused by other reasons. Often the real issue is that an old version of setuptools or another build dependency is being used (either because it is what is currently installed in the environment, or because there are some sys.path leaks, e.g. due to OS-level patches)[^1] and this old version cannot read setup.cfg, pyproject.toml or any other source of metadata. This is the case for the problems reported in https://github.com/pypa/setuptools/issues/3765#issuecomment-1379429275 and https://github.com/pypa/pip/issues/12333. We should be careful not to fall into an XY problem situation.
My thoughts are the following:
- Given that there is a directory dir with a pyproject.toml file and the users are calling pip install dir (or any other installer), why should dir not be treated like it is meant to generate a Python distribution? It seems to me that together these 2 factors (the existence of pyproject.toml and calling an installer on the directory) are evidence enough that the user intentionally wants to build a package. Why should we treat it differently than that?
- Can we have some other reproducers? I don't think that simply creating a new folder with an empty pyproject.toml and explicitly installing it works as a reproducer; after all, the user is telling the installer "please install this directory as if it was a package for me" -- so all the tools involved are simply giving the user exactly what the user wants... That is not a bug, the tools are doing what they have been told. In which circumstances would that not be true? In which unacceptable circumstances[^2] is UNKNOWN not caused by the wrong version of build dependencies being used[^1]?
[^1]: Or the build dependency is missing completely.
[^2]: I consider that if the user creates an empty folder, then adds an empty pyproject.toml, and then explicitly tells the installer to install that folder, having UNKNOWN as a name is perfectly acceptable. If the user doesn't want that to happen, an acceptable answer would be "so please don't try to install it".
I also believe that once the following 2 improvements are widely adopted in the ecosystem, we are going to start seeing fewer and fewer failures in the style of the ones listed in https://github.com/pypa/setuptools/issues/3765#issuecomment-1379429275:
- pip fully embracing the setuptools.build_meta:__legacy__ fallback defined in PEP 517 and no longer calling python setup.py .... This would help because a new build environment would be created by default, and therefore the chances of outdated or missing build dependencies being used would decrease.[^1]
- pypa/pyproject-hooks#199 is merged and subsequently adopted by pip. pypa/pyproject-hooks#199 per se is less relevant, but pypa/pyproject-hooks#165 should help a lot to deal with environment leaks in sys.path...
[^1]: But that also opens a different can of worms...
It seems to me that together these 2 factors (the existence of pyproject.toml and calling an installer on the directory) are evidence enough that the user intentionally wants to build a package. Why should we treat it differently than that?
My main concern is that I always considered the use of setuptools.build_meta.__legacy__ to be a fallback, meant to support projects with a setup.py and no pyproject.toml. Unfortunately, what's happened is that __legacy__ has come to be used for a whole host of new use-cases, increasing the reliance on this legacy invocation. I'd feel less uneasy about a user asking to build a directory with an empty pyproject.toml if the default backend were something other than a fallback.
Do we really want pip install <dir> or pyproject-build <dir>, where <dir> contains an empty pyproject.toml (or one lacking build-system), to be invoking Setuptools? My instinct is no. I'd really like to move away from cases where setuptools is an implicit build backend, toward an even playing field with every other build backend. If that also addresses the issue of implicitly building a directory with no declared build backend, that addresses two concerns.
Can we have some other reproducers? I don't think that simply creating a new folder with an empty pyproject.toml and explicitly installing it works as a reproducer; after all, the user is telling the installer "please install this directory as if it was a package for me" -- so all the tools involved are simply giving the user exactly what the user wants... That is not a bug, the tools are doing what they have been told. In which circumstances would that not be true?
The reproducers Pradyun mentioned were situations where a user had a pyproject.toml for some other purpose (linter config, etc.) and was then surprised when it built a distribution. I don't recall the specifics. I agree, it does seem like a pretty unlikely scenario.
What I'm aiming to do is narrow the scope of the problem to a single cause (setuptools asked to build something the user didn't intend), versus relying on the insufficient specificity of certain config fields as a proxy within the setuptools build.
I think I agree that the scope of this issue is still ambiguous. The title describes "empty pyproject.toml", but other complaints include "improperly configured environment". Pradyun has grouped these concerns as he feels strongly that a project should fail to build if it can't resolve a name and version from somewhere.
I do think it would be preferable for setuptools to fail if the user has declared that "version comes from VCS" or "manifest comes from VCS" but setuptools_scm (or an equivalent) isn't available. Unfortunately, the plugin model and setuptools_scm's configuration model don't make that easy.
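For context, the "version comes from VCS" configuration being described looks roughly like this (a sketch assuming the standard setuptools-scm pattern; the project name and the bare requirement specifiers are placeholders):
[build-system]
requires = ["setuptools", "setuptools_scm"]
build-backend = "setuptools.build_meta"

[project]
name = "example"        # placeholder
dynamic = ["version"]   # version is expected to come from the VCS

[tool.setuptools_scm]   # enables setuptools-scm's version inference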
Now that I think about it, maybe the issue with the VCS version inference should have been caught by pip (or pyproject-hooks) - if the user is using --no-build-isolation but the declared dependencies aren't satisfied, it should probably fail to build that (or at least warn). I've followed up in that issue.
It sounds like your proposal is to continue to refine some of the rough edges around interactions between tools and see if that largely mitigates the concerns. I'm open to that, and I agree we want to address the problems at the right scope.
This is still valid in setuptools 80.6.0:
$ rm -rf /tmp/st-bug-3511
$ mkdir /tmp/st-bug-3511
$ cd /tmp/st-bug-3511
$ python3.10 -c 'from setuptools import * ; setup()' sdist
running sdist
running egg_info
…
running check
warning: check: missing required meta-data: name
…
Writing unknown-0.0.0/setup.cfg
creating dist
Creating tar archive
removing 'unknown-0.0.0' (and everything under it)
I'll reiterate that I regularly see this in my interactions with Python developers "in the wild". uv prints a warning in these cases, because setuptools will happily package a random directory into a wheel regardless of whether there's any metadata available indicating that it should do so.
This also came up again on discuss.python.org today, with a different angle: https://discuss.python.org/t/do-we-want-to-keep-the-build-system-default-for-pyproject-toml/104759
This also came up again on discuss.python.org today, with a different angle: https://discuss.python.org/t/do-we-want-to-keep-the-build-system-default-for-pyproject-toml/104759
In my view, this is another contrived example. In this specific case, setuptools will explicitly refuse to build the project, which seems to be the OP’s intended outcome. The error message also provides details about which metadata is missing and how to supply it.
I want to reiterate my position: when someone runs pip install ., the only reasonable assumption for setuptools is that the user intends to install the contents of that directory (and that applies to both human users and automated systems). The build system should respond accordingly and aim to produce clear, actionable error or warning messages aligned with that assumption.
Arguments that imply that setuptools should fail in this scenario because the user "did not actually want to install the project" feel very weak to me.
This also came up again on discuss.python.org today, with a different angle: https://discuss.python.org/t/do-we-want-to-keep-the-build-system-default-for-pyproject-toml/104759
That sounds like a bad idea -- the default backend exists for very good reasons; it needs to stay in order to make sure that the large amount of existing published software continues to build in the manner in which it was originally published.
A better idea would be to lean fully into the fact that it's a legacy compatibility mode, and pin the setuptools version used by the default fallback to setuptools==65.*
Projects without a build-backend specified would probably do better with older setuptools versions anyway -- rather than newer versions that deprecate and remove functionality those old projects used and never migrated off of.
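To make that concrete: the effect would be roughly what a project gets today by spelling out the legacy backend explicitly, just with the proposed pin applied. A hedged sketch (the ==65.* specifier comes from the comment above, not from anything pip or setuptools currently does; the requirements list mirrors the legacy default shown in the build output earlier in this thread):
[build-system]
# Explicit spelling of the legacy fallback, pinned as suggested above.
requires = ["setuptools==65.*", "wheel"]
build-backend = "setuptools.build_meta:__legacy__"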
Arguments that imply that setuptools should fail in this scenario because the user "did not actually want to install the project" feel very weak to me.
That's the exact case that's the problem here. There is no other case. IMO, this is assuming the user has a level of understanding of the tooling that they do not always have.
To err on the side of over-communicating: what I've been arguing is that, in general, it is a bad experience to have "there is no build-related metadata" -> "a build happens anyway". Responses here have been to the tune of "why? You asked for a build", to which the response of "the user doesn't know enough" wasn't deemed sufficient, and every case that I've brought up where I've personally seen this confuse users and make them frustrated at the Python packaging ecosystem writ large was deemed contrived in this thread.
I'm actually gonna go ahead and close this out because, as far as I can tell, setuptools' maintainers are not open to changing their opinions/positions on this. It seems like you'd like to reach/leave a different end-state than I'm proposing here, so please file a separate issue for whatever that is supposed to be. If that's not the case, please feel welcome to reopen and let me know what exactly it would take to convince you to change things - even if it's only in the legacy backend.
FWIW, I've filed https://github.com/pypa/packaging.python.org/pull/1944 which would be compatible with the "don't trigger a build if you don't want one" approach for setuptools while allowing the frontend to fix the UX issue here around unintended package builds.