
Getting HTTP 404, "Manifest Unknown" when pushing an image to Harbor

ludz-lim opened this issue 3 years ago • 16 comments

I'm getting an HTTP 404 "Manifest Unknown" error when pushing an image to a Harbor 2.0 registry.

To reproduce, pull docker.elastic.co/elasticsearch/elasticsearch:7.9.3

$ docker pull docker.elastic.co/elasticsearch/elasticsearch:7.9.3
7.9.3: Pulling from elasticsearch/elasticsearch
f1feca467797: Pull complete
39d9d8875a6d: Pull complete
e7a6547bd7a1: Pull complete
a9cfa40c5418: Pull complete
0daf89d8c887: Pull complete
c84049f0f38a: Pull complete
9068e5a839f5: Pull complete
0d768056b087: Pull complete
0d190ac61a88: Pull complete
Digest: sha256:9116cf5563a6360ed204cd59eb89049d7e2ac9171645dccdb1421b55dbae888b
Status: Downloaded newer image for docker.elastic.co/elasticsearch/elasticsearch:7.9.3
docker.elastic.co/elasticsearch/elasticsearch:7.9.3

Tag it as a Harbor image:

$ docker tag docker.elastic.co/elasticsearch/elasticsearch:7.9.3 harbor.xxx.yyy.com/money-burner/elasticsearch:1.2

Push it to the Harbor registry:

$ docker push  harbor.xxx.yyy.com/money-burner/elasticsearch:1.2
The push refers to repository [harbor.xxx.yyy.com/money-burner/elasticsearch]
b67bd8865612: Pushed
1bbae5fb228e: Pushed
ffdc16fbe640: Pushed
a61b95f80b33: Pushed
748edfc60fb4: Pushed
29d63f7b4dd3: Pushed
f9fc8c4ea4ea: Pushed
c16e52a61918: Pushed
613be09ab3c0: Pushed
unknown: http status code: 404, body: {"errors":[{"code":"MANIFEST_UNKNOWN","message":"manifest unknown","detail":{"Name":"money-burner/elasticsearch","Revision":"sha256:b153829871d83cea307fcfdfb245c9981d8036847064ce89df645a49bcf8c1d9"}}]}

The image does not seem to be corrupt, as I was able to docker run it:

$ docker run -it  docker.elastic.co/elasticsearch/elasticsearch:7.9.3 /bin/bash
[root@7c8c9b30f266 elasticsearch]# exit
exit

I was also able to do the following, which also suggests that the image is not corrupted:

$ docker save docker.elastic.co/elasticsearch/elasticsearch:7.9.3 > /dev/null && echo 'OK' || echo 'Corrupted'
OK

The docker manifest inspect command gives me:

$  docker manifest inspect  docker.elastic.co/elasticsearch/elasticsearch:7.9.3
{
   "schemaVersion": 2,
   "mediaType": "application/vnd.docker.distribution.manifest.list.v2+json",
   "manifests": [
      {
         "mediaType": "application/vnd.docker.distribution.manifest.v2+json",
         "size": 1727,
         "digest": "sha256:9ddcb04b7b2e6a18b6c57b055b9af44daeb841efa014c0215beb37ed5d270556",
         "platform": {
            "architecture": "amd64",
            "os": "linux"
         }
      },
      {
         "mediaType": "application/vnd.docker.distribution.manifest.v2+json",
         "size": 1727,
         "digest": "sha256:9f5648fb9a0c8f2672a3c0feb654927edd119dd81ccbf1b8bdcb614547d7fe9c",
         "platform": {
            "architecture": "arm64",
            "os": "linux"
         }
      }
   ]
}

The Harbor core log has the following entries:

2021-05-17T15:07:32Z [DEBUG] [/server/middleware/log/log.go:30]: attach request id c87a8e8b-577d-44ef-8994-2fac76dea9b9 to the logger for the request HEAD /v2/money-burner/elasticsearch/blobs/sha256:8f58f7e426faeec7f759789d0841d26ff296fcfda8d64a72e51fc4af8f3e1aea
2021-05-17T15:07:32Z [DEBUG] [/server/middleware/log/log.go:30]: attach request id 21e656cb-8788-4432-95a8-02b948d78742 to the logger for the request HEAD /v2/money-burner/elasticsearch/blobs/sha256:70f1e3a1960ad99468199edd036fe3b559f10fe76fea90b83e0d853bad203eb0
2021-05-17T15:07:32Z [DEBUG] [/server/middleware/artifactinfo/artifact_info.go:52]: In artifact info middleware, url: /v2/money-burner/elasticsearch/blobs/sha256:8f58f7e426faeec7f759789d0841d26ff296fcfda8d64a72e51fc4af8f3e1aea
2021-05-17T15:07:32Z [DEBUG] [/server/middleware/artifactinfo/artifact_info.go:52]: In artifact info middleware, url: /v2/money-burner/elasticsearch/blobs/sha256:70f1e3a1960ad99468199edd036fe3b559f10fe76fea90b83e0d853bad203eb0
2021-05-17T15:07:32Z [DEBUG] [/server/middleware/log/log.go:30]: attach request id f32e3b06-ea1b-41d2-978a-ddfd72885e4c to the logger for the request HEAD /v2/money-burner/elasticsearch/blobs/sha256:817feb91b55c590cde52765064649bf5b6188541e13c499d977fe1a472893272
2021-05-17T15:07:32Z [DEBUG] [/server/middleware/artifactinfo/artifact_info.go:52]: In artifact info middleware, url: /v2/money-burner/elasticsearch/blobs/sha256:817feb91b55c590cde52765064649bf5b6188541e13c499d977fe1a472893272
2021-05-17T15:07:32Z [DEBUG] [/server/middleware/log/log.go:30]: attach request id 6ff587cb-01bf-4cd3-a34a-64bbcfa79e5a to the logger for the request HEAD /v2/money-burner/elasticsearch/blobs/sha256:a19cf707b4fd6d2709e0690ad6d3f9cdfdc92c3f97ca4f193c934d76db03f99e
2021-05-17T15:07:32Z [DEBUG] [/server/middleware/artifactinfo/artifact_info.go:52]: In artifact info middleware, url: /v2/money-burner/elasticsearch/blobs/sha256:a19cf707b4fd6d2709e0690ad6d3f9cdfdc92c3f97ca4f193c934d76db03f99e
2021-05-17T15:07:33Z [DEBUG] [/server/middleware/log/log.go:30]: attach request id bd65cb9b-e441-4964-9393-3b5643d2bd55 to the logger for the request HEAD /v2/money-burner/elasticsearch/blobs/sha256:1ab13f928dc8aa958574d060fde595bd7715c1c2d3260446356a6a02d231e168
2021-05-17T15:07:33Z [DEBUG] [/server/middleware/artifactinfo/artifact_info.go:52]: In artifact info middleware, url: /v2/money-burner/elasticsearch/blobs/sha256:1ab13f928dc8aa958574d060fde595bd7715c1c2d3260446356a6a02d231e168
2021-05-17T15:07:33Z [DEBUG] [/server/middleware/log/log.go:30]: attach request id f25ea0bf-c3df-46af-936d-7d45f3cadcf9 to the logger for the request PUT /v2/money-burner/elasticsearch/manifests/1.2
2021-05-17T15:07:33Z [DEBUG] [/server/middleware/artifactinfo/artifact_info.go:52]: In artifact info middleware, url: /v2/money-burner/elasticsearch/manifests/1.2
2021-05-17T15:07:33Z [DEBUG] [/server/middleware/immutable/pushmf.go:51]: failed to list artifact, repository money-burner/elasticsearch not found
2021-05-17T15:07:33Z [DEBUG] [/lib/http/error.go:59]: {\"errors\":[{\"code\":\"NOT_FOUND\",\"message\":\"http status code: 404, body: {\\\"errors\\\":[{\\\"code\\\":\\\"MANIFEST_UNKNOWN\\\",\\\"message\\\":\\\"manifest unknown\\\",\\\"detail\\\":{\\\"Name\\\":\\\"money-burner/elasticsearch\\\",\\\"Revision\\\":\\\"sha256:b153829871d83cea307fcfdfb245c9981d8036847064ce89df645a49bcf8c1d9\\\"}}]}\\n\"}]
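
For anyone debugging the same thing, the failing manifest can also be requested directly from the Docker Registry v2 endpoint to confirm where the 404 originates. A minimal check, assuming basic auth is accepted (host and credentials are placeholders; depending on the setup a bearer-token flow may be required instead):

$ curl -sk -u 'user:password' \
    -H 'Accept: application/vnd.docker.distribution.manifest.v2+json' \
    https://harbor.xxx.yyy.com/v2/money-burner/elasticsearch/manifests/1.2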

ludz-lim avatar May 17 '21 15:05 ludz-lim

Will running GC fix this issue if the image was deleted before?

ludz-lim avatar May 17 '21 16:05 ludz-lim

No. GC only deletes unused blobs that were uploaded more than 2 hours ago.
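
It also only runs when triggered, either manually or on a configured schedule. A manual run can be started from the UI, or via the API with something like this (request shape per the Harbor 2.x swagger; host and credentials are placeholders, so verify against your version):

$ curl -sk -u 'admin:password' -X POST \
    -H 'Content-Type: application/json' \
    -d '{"schedule":{"type":"Manual"}}' \
    https://harbor.xxx.yyy.com/api/v2.0/system/gc/schedule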

stonezdj avatar May 20 '21 14:05 stonezdj

@stonezdj Sorry if I sound ignorant, but the GC will not run unless it is triggered, right? I mean, the GC does not run in the background automatically.

ludz-lim avatar May 24 '21 09:05 ludz-lim

We are also getting this error if we do the following:

docker pull harbor-2.0.somewhere.com/my-image:my-tag
docker login harbor-2.2.somewhere.com -u my_user
docker tag  harbor-2.0.somewhere.com/my-image:my-tag harbor-2.2.somewhere.com/my-image:my-tag
docker push harbor-2.2.somewhere.com/my-image:my-tag

ludz-lim avatar May 24 '21 09:05 ludz-lim

@wy65701436 I believe I found the cause of the issue. Both instances of Harbor are using the same Redis cache. After creating a separate Redis instance for each Harbor instance, the issue disappeared. I'll do more tests in the coming days to verify it.

ludz-lim avatar Jun 01 '21 01:06 ludz-lim

Thanks @ludz-lim, what's the Harbor version? I'll set up an environment as you mentioned and try to reproduce it.

wy65701436 avatar Aug 10 '21 14:08 wy65701436

A GC run fixed the issue. But I'm curious to know what the actual reason is.

geowalrus4gh avatar Oct 26 '21 17:10 geowalrus4gh

> @wy65701436 I believe I found the cause of the issue. Both instances of Harbor are using the same Redis cache. After creating a separate Redis instance for each Harbor instance, the issue disappeared. I'll do more tests in the coming days to verify it.

Same problem. @wy65701436 v2.4.0-d4affc2e

withlin avatar Mar 28 '22 04:03 withlin

I ran into the same issue where imgpkg copy to Harbor was failing with a MANIFEST_UNKNOWN error, but the artifacts were actually copied. After running GC it worked fine.

dhadukk avatar Jun 14 '22 04:06 dhadukk

Same here

Antiarchitect avatar Jun 30 '22 08:06 Antiarchitect

Just an idea: I use KeyDB in master-master replication mode as a Redis alternative. Could it be that the preceding layer's cache entry is written to one Redis/KeyDB instance while the succeeding one is written to another, and replication is slower than the write? Then the succeeding layer's cache write would not see the preceding layer. If so, we need some stable stickiness.

Antiarchitect avatar Jul 21 '22 09:07 Antiarchitect

Is there any progress on this? Going down to a single Redis/KeyDB instance changes nothing; the problem still appears.

Antiarchitect avatar Aug 30 '22 11:08 Antiarchitect

We experienced the same error. We could pull an image, but when trying to push that exact same image it failed with: unknown: http status code: 404, body: {"errors":[{"code":"MANIFEST_UNKNOWN","message":"manifest unknown","detail":{"Name":"repo/image","Revision":"sha256:c508d750fa8f43581a532a73a750fae148fb765cc8384914a1eb74c88c90bebf"}}]}

I also noticed that the nightly scheduled garbage collection failed as well, failing to delete another image.

I was able to resolve the 404 by logging into Redis database 1 of Harbor with redis-cli -n 1 and searching all Redis keys for the hash above. Found: "blobs::sha256:c508d750fa8f43581a532a73a750fae148fb765cc8384914a1eb74c88c90bebf"

I then deleted that key with: del "blobs::sha256:c508d750fa8f43581a532a73a750fae148fb765cc8384914a1eb74c88c90bebf"
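
Put together, the session looked roughly like this (the digest comes from the error message above; connection details will differ per deployment):

$ redis-cli -n 1
127.0.0.1:6379[1]> KEYS "blobs::sha256:c508d750*"
1) "blobs::sha256:c508d750fa8f43581a532a73a750fae148fb765cc8384914a1eb74c88c90bebf"
127.0.0.1:6379[1]> DEL "blobs::sha256:c508d750fa8f43581a532a73a750fae148fb765cc8384914a1eb74c88c90bebf"
(integer) 1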

After that we could push our image again. Then we ran a manual GC, which succeeded again. Still a bit unclear what caused this.

trancilo avatar Sep 12 '22 13:09 trancilo

@trancilo I may have stabilized the situation in my case:

  1. Schedule a GC run for every Friday evening
  2. Make sure "Allow garbage collection on untagged artifacts" is turned off

I don't think it is a good solution, but the problem seems gone.
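
For reference, the same schedule and flag can also be set through the API, roughly like this (request shape per the Harbor 2.x swagger, with placeholder host, credentials, and cron; treat it as a sketch, not a verified call):

$ curl -sk -u 'admin:password' -X POST \
    -H 'Content-Type: application/json' \
    -d '{"schedule":{"type":"Custom","cron":"0 0 20 * * 5"},"parameters":{"delete_untagged":false}}' \
    https://harbor.example.com/api/v2.0/system/gc/schedule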

UPD: doesn't work :( Still getting this sporadically.

Antiarchitect avatar Sep 12 '22 13:09 Antiarchitect

This issue is being marked stale due to a period of inactivity. If this issue is still relevant, please comment or remove the stale label. Otherwise, this issue will close in 30 days.

github-actions[bot] avatar Nov 13 '22 09:11 github-actions[bot]

@github-actions still relevant

fgierlinger avatar Nov 13 '22 14:11 fgierlinger

Same problem here - anything new on this?

0Styless avatar Dec 14 '22 14:12 0Styless

Same problem here

whoo3474 avatar Feb 02 '23 07:02 whoo3474

Same problem, what next?

stop-coding avatar Mar 10 '23 11:03 stop-coding

> We experienced the same error. We could pull an image, but when trying to push that exact same image it failed with: unknown: http status code: 404, body: {"errors":[{"code":"MANIFEST_UNKNOWN","message":"manifest unknown","detail":{"Name":"repo/image","Revision":"sha256:c508d750fa8f43581a532a73a750fae148fb765cc8384914a1eb74c88c90bebf"}}]}
>
> I also noticed that the nightly scheduled garbage collection failed as well, failing to delete another image.
>
> I was able to resolve the 404 by logging into Redis database 1 of Harbor with redis-cli -n 1 and searching all Redis keys for the hash above. Found: "blobs::sha256:c508d750fa8f43581a532a73a750fae148fb765cc8384914a1eb74c88c90bebf"
>
> I then deleted that key with: del "blobs::sha256:c508d750fa8f43581a532a73a750fae148fb765cc8384914a1eb74c88c90bebf"
>
> After that we could push our image again. Then we ran a manual GC, which succeeded again. Still a bit unclear what caused this.

This worked for me. I did have to do one more thing: I scheduled the GC to run as often as possible. It would fail on images that were not found, but it looks like the next GC run skips those images. This way, eventually you get a GC run that finishes without errors.

trancilo avatar Apr 03 '23 15:04 trancilo

Harbor GC fixed the issue for me.

SVronskiy avatar Apr 23 '23 14:04 SVronskiy

This issue is being marked stale due to a period of inactivity. If this issue is still relevant, please comment or remove the stale label. Otherwise, this issue will close in 30 days.

github-actions[bot] avatar Jun 23 '23 09:06 github-actions[bot]

This issue was closed because it has been stalled for 30 days with no activity. If this issue is still relevant, please re-open a new issue.

github-actions[bot] avatar Jul 24 '23 09:07 github-actions[bot]