Upload blocked by a pre-existing file that does not exist.
Describe the bug
While uploading wheels for the new 11.8.3 release, the upload of the Python 3.9 linux-x64 wheel was blocked, reporting:
400 File already exists. See https://pypi.org/help/#file-name-reuse for more information.
The security log confirms that the version was created today and the linux-x64 file has not been uploaded today. The rest of the wheel uploads were completed successfully.
Expected behavior
Wheel should upload.
To Reproduce
Upload the wheel for cuda-python 11.8.3 for Python 3.9 on x86_64, i.e. cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
My Platform
Upload was attempted using twine with the following command:
python -m twine upload .\cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl --verbose
Additional context
@di is there an easy way to clear the error status on PyPI?
As far as we can tell, PyPI shows no record of this file being uploaded, but the CLI still gives us this error. It's unclear why that is happening.
Can you provide the SHA-256 hash of the file you're trying to upload? E.g.:
$ sha256sum cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
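(If sha256sum isn't available on the machine the upload was run from, an equivalent using only the Python standard library would be, for example:)

import hashlib
import sys

# Print the SHA-256 digest of the file given on the command line.
with open(sys.argv[1], "rb") as f:
    print(hashlib.sha256(f.read()).hexdigest())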
Hi @di I missed your response, sorry about that:
$ sha256sum cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
04465e1179213e14a316f0eedbb50dc416f701f3f135d4ba16ce4b2892ec4286 cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Friendly nudge @di 😅
Oddly I see no record of this file already existing:
warehouse=> select id from release_files where filename = 'cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl';
id
----
(0 rows)
warehouse=> select id from release_files where sha256_digest = '04465e1179213e14a316f0eedbb50dc416f701f3f135d4ba16ce4b2892ec4286';
id
----
(0 rows)
Here's the function that checks for duplicate files: https://github.com/pypi/warehouse/blob/65033eb8ba8aaa3652e7af0b4128c60213b508c2/warehouse/forklift/legacy.py#L334-L362
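In rough terms, that check looks for an existing file with the same filename or with matching sha256/blake2_256 digests and rejects the upload if one is found. A simplified, illustrative sketch of that logic in plain Python (not the actual warehouse code, which queries the release_files table shown above):

def file_already_exists(filename, sha256_digest, blake2_256_digest, release_files):
    # release_files: iterable of (filename, sha256, blake2_256) tuples,
    # standing in for rows of the release_files table queried above.
    for name, sha256, blake2 in release_files:
        if name == filename:
            return True  # same filename already on record
        if sha256 == sha256_digest or blake2 == blake2_256_digest:
            return True  # same content already on record under another name
    return False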
One last thing to check would be the blake2_256_digest, although I don't see how that would be present if the SHA-256 is not. Could you run:
$ b2sum -a blake2bp -l 256 cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
@di I don't have a -a option on b2sum. Here's my output
$ b2sum -l 256 cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
4642743c6a44b63cb592786a80a0aac21e8eeed8004dafc8b4919284a4a30295 cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Looks like building the latest b2sum from the BLAKE2 repository adds that option. Here's what I got:
$ ./b2sum -a blake2bp -l 256 cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
f74f53dca84c8122f138662787008602d20c2da109749e1746581bd1151874c0 cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
I don't see anything with that filename or any of those hashes. Are you sure this is the file that is giving this error? Can you share the output of any logs, the output of twine with --verbose set, or the actual artifact in question here? Thanks!
I can't upload the .whl here directly, so I'm attaching a .zip of the .whl: cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.zip
@di Checking back, is there anything else you need here?
@m3vaz Is that actually the same file? I get different hashes for it:
$ wget https://github.com/user-attachments/files/16353395/cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.zip
...
$ sha256sum cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.zip
2a2b5ab9632a80eadea717c28e6e39d2cb25d8aad7c63ad838cec3c5cf62c39c cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.zip
$ b2sum -l 256 cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.zip
42311abde9310ebe88fa1c47d4f1c1a3facb5cf133591efcc2b9900205618729 cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.zip
$ cat hash.py
import hashlib
import sys

filename = sys.argv[1]

file_hashes = {
    "md5": hashlib.md5(usedforsecurity=False),
    "sha256": hashlib.sha256(),
    "blake2_256": hashlib.blake2b(digest_size=256 // 8),
}

for k, v in file_hashes.items():
    with open(filename, "rb") as f:
        digest = hashlib.file_digest(f, lambda: v)
        print(k, digest.hexdigest())
$ python hash.py cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.zip
md5 90085289c3cb48b1eae77e81b17f3c7b
sha256 2a2b5ab9632a80eadea717c28e6e39d2cb25d8aad7c63ad838cec3c5cf62c39c
blake2_256 42311abde9310ebe88fa1c47d4f1c1a3facb5cf133591efcc2b9900205618729
Neither the filename cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl nor any of those hashes has been seen by PyPI before, as far as I can tell.
To take a step back: this usually happens when a user is attempting to upload the same file with two different filenames, usually by renaming the distribution without rebuilding it. Is it possible that's what's happening here?
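(Renaming a built artifact leaves its content, and therefore its digests, unchanged, which is why a renamed-but-not-rebuilt file trips the duplicate check. A small self-contained illustration, using stand-in files rather than real wheels:)

import hashlib
import shutil
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    # Build once, then "rename" (copy) the artifact: the bytes are identical.
    original = Path(tmp) / "pkg-1.0-cp39-cp39-linux_x86_64.whl"
    original.write_bytes(b"built exactly once")
    renamed = Path(tmp) / "pkg-1.0-cp39-cp39-manylinux_2_17_x86_64.whl"
    shutil.copy(original, renamed)
    same = (hashlib.sha256(original.read_bytes()).hexdigest()
            == hashlib.sha256(renamed.read_bytes()).hexdigest())
    print(same)  # True: PyPI would treat these as the same file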
@di the zip contains the wheel (it is not merely renaming the extension)
I should add that I had the same thought as you until I downloaded and decompressed it.
Aha, that'll do it:
$ python hash.py cuda_python-11.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
md5 29dbc0595075a5ad04ec842682774280
sha256 04465e1179213e14a316f0eedbb50dc416f701f3f135d4ba16ce4b2892ec4286
blake2_256 4642743c6a44b63cb592786a80a0aac21e8eeed8004dafc8b4919284a4a30295
Still no record of this file, though. Do you mind if I use my administrator privileges to try uploading it to the project to reproduce? Is there any reason you would not want to publish it?
I think if the end goal is to publish this, it should be ok for you to try to upload it.
@di Please go ahead.
Strange, I had no issues uploading this: https://pypi.org/project/cuda-python/11.8.3/#files.
Thanks Dustin! 🙏
It's confusing why we ran into issues then.
Regardless, I'm happy we were able to resolve this one 🥳 Thanks again! 🙏
Attaching the screenshot of the uploaded package with the matching hashes
Thanks Dustin!
I think we can close the issue.
Closing.