twine
Add a --dry-run mode
When building the Debian package version of twine, I'd love to have an end-to-end test that I can automate which actually hits PyPI, but I don't want to do an actual upload and I don't want to have to put real credentials in the package.
One idea would be to add a --dry-run mode which would allow me to pretend to upload a fake package with fake credentials, and would report back if the upload would have been successful if both of those were real data.
Any other suggestions for allowing me to automate such a test would be welcome!
Maybe something like #19 (which I haven't looked at yet, but has conflicts).
One idea would be to add a --dry-run mode which would allow me to pretend to upload a fake package with fake credentials, and would report back if the upload would have been successful if both of those were real data.
I'm confused. There are very few failure modes of twine. I'm not sure something that doesn't talk to PyPI (or something like it) could be useful, but I'm not sure what else to offer you beyond a test server.
Frankly, what I'm imagining would be more useful is something that provides a local URL and credentials and allows you to actually use Twine as an E2E test: a test fixture along the lines of pytest-httpbin.
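For illustration, here is a minimal sketch of that kind of fixture: a throwaway local endpoint that accepts the upload POST and returns 200, so twine can be exercised end-to-end with fake credentials. The names and the /legacy/ path are assumptions made for the sketch; this is not pytest-httpbin, and it performs none of PyPI's validation.

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer


class _FakeUploadHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Drain the multipart body and pretend the upload succeeded.
        length = int(self.headers.get("Content-Length", 0))
        self.rfile.read(length)
        self.send_response(200)
        self.end_headers()

    def log_message(self, *_):
        # Keep test output quiet.
        pass


def start_fake_index():
    """Start a throwaway upload endpoint; returns (server, repository URL)."""
    server = HTTPServer(("127.0.0.1", 0), _FakeUploadHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server, "http://127.0.0.1:%d/legacy/" % server.server_port


# A test could then do something like (twine may warn about plain HTTP):
#   server, url = start_fake_index()
#   subprocess.run(["twine", "upload", "--repository-url", url,
#                   "-u", "fake", "-p", "fake"] + glob.glob("dist/*"),
#                  check=True)
#   server.shutdown()
```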
Yep, this would be a sanity check to make sure the built Debian package didn't break something. E.g. maybe we left off a new dependency, or broke a delta patch we might need against upstream. In Debian and Ubuntu we have nice infrastructure to automatically run tests, not just at build time (which usually runs the package's own test suite), but also after build, where the .debs are installed in a chroot and tested. FWIW, build-time tests can't touch the internet, but after-build tests can.
A test/staging server which just threw away the data could be useful, as would be a fixture as you describe. The latter would only test localhost, but that might be enough. And I suppose that fixture wouldn't necessarily have to go in twine, although it would be nice to have.
(FWIW, we do something similar in the Mailman test suite, where we create a fake local HTTP server to talk to during various tests.)
Maybe @dstufft has other thoughts, e.g. perhaps there's something in Warehouse we can fake-talk to?
Nah, there's nothing in Warehouse to fake-talk to. You'd have to create your own fake server, but the upload API isn't very complicated if you don't care about the "make sure the permissions and the data contained within the package are valid" stuff.
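For a rough sense of what such a fake server would need to accept: the legacy upload is a single multipart/form-data POST carrying the metadata fields plus the archive. The field names below reflect one reading of the Warehouse legacy API and are illustrative, not exhaustive or authoritative.

```python
# Approximate shape of the form fields twine sends to the legacy upload
# endpoint; a throwaway server mostly just needs to accept a POST like
# this and return 200.
EXAMPLE_UPLOAD_FIELDS = {
    ":action": "file_upload",
    "protocol_version": "1",
    "name": "example",
    "version": "1.0",
    "filetype": "sdist",          # or "bdist_wheel"
    "metadata_version": "2.1",
    # ...plus the usual core metadata (summary, author, classifiers, ...)
    # and the archive itself as the "content" file field.
}
```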
I think all I'd want is to verify that there are some credentials (e.g. the package has a valid signature, but that signature needn't be associated with any particular user), and that the package looks sane-ish, meaning the file has the right format (e.g. no .tbz extension).
I think all I'd want is to verify that there are some credentials (e.g. the package has a valid signature but that signature needn't be associated with any particular user)
So, twine only verifies credentials against PyPI (if you consider using them for authentication to be verification). We do not verify signatures, and PyPI/Warehouse do not associate signatures with particular users.
and that the package looks sane-ish, meaning the file has the right format (e.g. no .tbz extension )
Twine doesn't currently look to enforce prohibited file types (especially since the PEP that adds those restrictions has yet to be accepted). Once we do, I would expect that simply providing such a file would fail before even talking to the internet (thereby not requiring a server of any sort).
When I want a dry run of a PyPI upload, here's what I want to check (a rough sketch of doing some of these checks locally follows the list):

- What versions of what things am I uploading? Have I accidentally said dist/* when that directory has a test1.0 in it that I do not want to upload, and I should be specifying 1 or 2 particular files instead? Have I made a wheel or just an sdist? Is dist/* completely empty?
- Where will twine attempt to get credentials from? Env variables, .pypirc, keyring, somewhere else?
- What's in the key setup.py fields? Package name, version number, short description.
- Is the repo/repo URL complete nonsense? Is there some semblance of hope here that it's a reachable URL?
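For illustration, here is a rough sketch of doing the first and third checks locally with the pkginfo library (which twine already depends on). The function name is made up, and this says nothing about credentials or the repository URL.

```python
import glob
import os

from pkginfo import SDist, Wheel  # metadata readers for sdists and wheels


def describe_dists(pattern="dist/*"):
    """Print name/version/summary for everything a glob would upload."""
    paths = sorted(glob.glob(pattern))
    if not paths:
        print("Nothing matches %r -- is dist/ empty?" % pattern)
    for path in paths:
        reader = Wheel if path.endswith(".whl") else SDist
        meta = reader(path)
        print(os.path.basename(path), meta.name, meta.version, meta.summary)
```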
I'd welcome @michaeljoseph's point of view here, since changes implements a dry-run for uploads.
I'd like to better understand what happened with #19 - was it more of a WONTFIX, or was it just closed because it was stale, or something else? cc @sigmavirus24
It was closed 4 years later and was no longer relevant.
Listing the items for upload seems reasonable. I think everything else that you're talking about is actually a totally separate command, something like twine package:info and twine verify:repository. I think when we're thinking about printing where we get credentials, we might want to just make that part of the output printed by a flag like --debug or --verbose.
The draft guide for gracefully dropping support for older Python versions includes a step where you check that your metadata is as you intended before you actually hit publish: https://github.com/pypa/python-packaging-user-guide/pull/459/files#diff-844d448f2675b425642dad8328eaff9eR60
The current draft of that section is setuptools-specific, since it relies on checking PKG-INFO in the tarball. It would be much nicer if it could instead just say to run twine upload --dry-run dist/* and look for the Requires-Python header.
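For illustration, a minimal sketch of that setuptools-specific check: pull PKG-INFO out of the sdist and read the Requires-Python header. The file path in the example call is hypothetical.

```python
import tarfile
from email.parser import HeaderParser


def requires_python_from_sdist(sdist_path):
    """Return the Requires-Python header from an sdist's PKG-INFO, if any."""
    with tarfile.open(sdist_path) as tar:
        # Take the first PKG-INFO found in the archive.
        member = next(m for m in tar.getmembers() if m.name.endswith("PKG-INFO"))
        raw = tar.extractfile(member).read().decode("utf-8")
    return HeaderParser().parsestr(raw).get("Requires-Python")


# e.g. requires_python_from_sdist("dist/example-1.0.tar.gz") -> ">=3.5"
```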
While this technically could be a separate command, I think a potential side benefit of incorporating it into upload is to help make it clearer that it's the client that extracts the metadata and sends it to the server along with the artifacts; the server doesn't extract it from the artifacts.
I really don't like printing out random things in --dry-run. I understand your point, @ncoghlan, but I doubt most folks will understand that nuance, frankly. While Python packaging is improving, most people still don't understand it. Printing that out at that point in time will only serve to further confuse people.
Yeah, that's a fair criticism, and I'm not especially hung up on the twine upload --dry-run spelling. I'd also be entirely happy with a spelling like twine check-release. By default, that could check both the configured credentials (checking the ability to upload a release) and any packages given on the command line (checking the metadata for a release, including whether or not the release had already been published).
Thanks @niothiel for adding some more thoughts in #331 about what they would like in a similar option.
Yes please, a --dry-run would be extremely useful.
I think this would be very useful when working on CI/CD.
There doesn't seem to be a single agreed-upon set of constraints here. To be totally honest, I'm not sure there is value in the various proposals. @bhrutledge do you agree?
I agree. From a quick scan, it seems like uploading to https://test.pypi.org/ first might cover some of the asks. Also, it seems like this would require support from Warehouse.