Mamba tries to download non-existent .conda URLs from conda-meta
Mamba is trying to download non-existent URLs:
$ docker run --pull always -it --rm condaforge/mambaforge:latest
(base) root@12a9a8180829:/# mamba install --yes python==3.8.* conda-build
....
CondaHTTPError: HTTP 404 NOT FOUND for url <https://conda.anaconda.org/conda-forge/noarch/urllib3-1.26.8-pyhd8ed1ab_1.conda>
Elapsed: 00:00.253499
CF-RAY: 6ddf15f44a4a01db-ZRH
An HTTP error occurred when trying to retrieve this URL.
HTTP errors are often intermittent, and a simple retry will get you on your way.
CondaHTTPError: HTTP 404 NOT FOUND for url <https://conda.anaconda.org/conda-forge/noarch/requests-2.27.1-pyhd8ed1ab_0.conda>
Elapsed: 00:00.276918
CF-RAY: 6ddf15f60e97021d-ZRH
I suspect this is caused by the change to use transmutation to .conda in miniforge, though I'm not sure why mamba is trying these URLs. conda install works without issue.
When the installer completes it deletes the pkgs/ directory by default so the only place these URLs are mentioned are in conda-meta/:
$ grep -r pip-22.0.3-pyhd8ed1ab_0.conda /opt/conda/
/opt/conda/conda-meta/pip-22.0.3-pyhd8ed1ab_0.json: "fn": "pip-22.0.3-pyhd8ed1ab_0.conda",
/opt/conda/conda-meta/pip-22.0.3-pyhd8ed1ab_0.json: "url": "https://conda.anaconda.org/conda-forge/noarch/pip-22.0.3-pyhd8ed1ab_0.conda",
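To make the mechanism concrete: each installed package has a record in conda-meta/ whose "fn" and "url" fields point at the artifact it was installed from. A minimal sketch (a hypothetical helper, not part of mamba) that collects those recorded URLs from a prefix:

```python
import json
from pathlib import Path

def recorded_urls(prefix="/opt/conda"):
    """Map each installed package's recorded filename to its recorded URL.

    conda/mamba write one JSON record per installed package into
    conda-meta/; the "url" field is where the solver believes the
    artifact can be re-fetched from.
    """
    urls = {}
    for record in Path(prefix, "conda-meta").glob("*.json"):
        meta = json.loads(record.read_text())
        if "url" in meta:
            urls[meta.get("fn", record.stem)] = meta["url"]
    return urls
```

Running this on the image above would show the .conda URLs that later 404, since transmutation rewrote the extensions locally but the channel only serves .tar.bz2.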
Actually I was slightly mistaken: the installer doesn't clean $PREFIX/pkgs/ automatically, however the miniforge docker images do.
hmmm, that might be an interesting failure mode. I think when we don't find the selected package in the repodata.json with the same hash, we might consider it wrong / outdated and thus try to re-download it.
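The suspected failure mode can be sketched like this (an illustration, not mamba's actual code): the installed record is looked up in repodata under its recorded filename, and since the channel only ever published the .tar.bz2 form, the transmuted .conda name is missing and the package looks stale.

```python
def needs_redownload(installed_fn, repodata_packages):
    """Sketch: an installed filename absent from repodata is treated as
    outdated, triggering a re-download attempt (which then 404s)."""
    return installed_fn not in repodata_packages

# Channel repodata only knows the .tar.bz2 artifact...
repodata = {"pip-22.0.3-pyhd8ed1ab_0.tar.bz2": {"sha256": "..."}}

# ...but conda-meta records the transmuted .conda name.
needs_redownload("pip-22.0.3-pyhd8ed1ab_0.conda", repodata)   # True
needs_redownload("pip-22.0.3-pyhd8ed1ab_0.tar.bz2", repodata)  # False
```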
I guess we need to think about how to handle this situation.
Argh, ok, I looked into it with the docker image you posted, and I think I am beginning to understand... mamba uses the installed packages for solving and assumes that it can either find them in the pkgs cache or otherwise that the URL is valid.
Mamba prefers the installed packages and then conda attempts to download them since it can't find them in the package cache.
I need to check -- we might have to fix the interaction in utils.py a bit, where the txn is created for conda. Somewhere there I think the package (e.g. requests) is re-added to the linking ...
Ah, now I get it. Conda adds the noarch packages for re-linking since they need to be re-compiled for the different python version :/
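The re-link trigger can be sketched as follows (an assumption about the behaviour, not conda's implementation): a noarch python package's files live under a version-specific lib/pythonX.Y/site-packages and its .pyc files are byte-compiled at link time, so changing the python minor version forces a re-link, which in turn requires the artifact from the cache or the recorded URL.

```python
def needs_relink(noarch_type, installed_py, target_py):
    """Sketch: a noarch python package linked against one python minor
    version must be re-linked when the environment moves to another,
    because its install paths and .pyc files are version-specific."""
    return noarch_type == "python" and installed_py != target_py

# Installing python==3.8.* into a 3.9-based env re-links every
# noarch python package -- hence the flood of download attempts.
needs_relink("python", "3.9", "3.8")  # True
```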
I guess we gotta figure out how to work around this but it's slightly tricky! Maybe we can make some adjustments when we find a package that only exists in one of the two forms in repodata.
Otherwise the two clean solutions I see:
- keep all noarch packages in pkgs folder
- upload the .conda packages to the actual channel (I guess we need to wait until anaconda.org is ready for that)
keep all noarch packages in pkgs folder
We still hit the issue if you do mamba clean. Hypothetically we can also hit the issue if packages are marked as broken. As far as I can tell there is no good way of getting out of this state. (Actually, maybe --force-reinstall works? If so, maybe there can be an improved error message so people at least know how to rescue their state.)
upload the .conda packages to the actual channel (I guess we need to wait until anaconda.org is ready for that)
Given this is hopefully soon, maybe this is an option and we abandon transmutation for miniforge for now.
Maybe we can make some adjustments when we find a package that only exists in one of the two forms in repodata.
I think this is my favourite option if it's practical.
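That adjustment could look something like this sketch (hypothetical logic, not an actual mamba patch): when the recorded filename is missing from repodata, try the sibling format before attempting a download that is bound to 404.

```python
def resolve_fn(fn, repodata_packages):
    """Sketch: resolve an installed filename against repodata, falling
    back to the other package format (.conda <-> .tar.bz2) when only
    one of the two forms exists in the channel."""
    if fn in repodata_packages:
        return fn
    if fn.endswith(".conda"):
        alt = fn[: -len(".conda")] + ".tar.bz2"
    elif fn.endswith(".tar.bz2"):
        alt = fn[: -len(".tar.bz2")] + ".conda"
    else:
        return None
    return alt if alt in repodata_packages else None
```

With this, the transmuted .conda record from conda-meta would resolve to the channel's .tar.bz2 artifact instead of producing a 404.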