
Docker BuildKit caching w/ --cache-from fails (roughly 50% of the time), even when using docker-container

Open bozaro opened this issue 2 years ago • 25 comments

Version information:

  • OS: Ubuntu 20.04.2 LTS, x86_64
  • Docker Server: 20.10.7
  • Docker Client: 20.10.7
  • docker buildx version: v0.5.1-docker
  • BuildKit: moby/buildkit:v0.9.0

Steps to reproduce

Create Dockerfile:

FROM busybox AS stage-1
RUN echo "Hello, world!!!"
COPY changed.txt /opt/changed.txt

FROM busybox
COPY --from=stage-1 /opt/changed.txt /opt/changed.txt

Run a script like the following (REGISTRY should be replaced with an actual value):

#!/bin/bash
# Recreate builder for clear local cache
docker buildx rm cachebug || true
docker buildx create --name cachebug --driver docker-container
docker buildx inspect cachebug --bootstrap

# Create some changed file
date > changed.txt

# Run 
REGISTRY=registry.example.net/test-docker/example
docker buildx build \
    --builder cachebug \
    --push \
    --tag $REGISTRY:latest \
    --cache-from type=registry,ref=$REGISTRY:buildcache \
    --cache-to type=registry,ref=$REGISTRY:buildcache,mode=max \
    --platform linux/amd64 \
    --platform linux/arm64 \
    .

What I see: when I run the above script multiple times, the step RUN echo "Hello, world!!!" misses the cache roughly every second run for one of the platforms (I have never seen the cache problem hit all platforms at the same time).

For example:

 => CACHED [linux/arm64 stage-1 2/3] RUN echo "Hello, world!!!"                                      0.3s
 => => sha256:e2f4ee50b555089a69b84af6621283565af19e3bcf0596b36ba5feec7b96d1d7 116B / 116B           0.2s
 => => sha256:38cc3b49dbab817c9404b9a301d1f673d4b0c2e3497dbcfbea2be77516679682 820.69kB / 820.69kB   0.6s
 => => extracting sha256:38cc3b49dbab817c9404b9a301d1f673d4b0c2e3497dbcfbea2be77516679682            0.1s
 => => extracting sha256:e2f4ee50b555089a69b84af6621283565af19e3bcf0596b36ba5feec7b96d1d7            0.1s
 => [linux/amd64 stage-1 2/3] RUN echo "Hello, world!!!"                                             0.3s
 => [linux/amd64 stage-1 3/3] COPY changed.txt /opt/changed.txt                                      0.2s
 => [linux/arm64 stage-1 3/3] COPY changed.txt /opt/changed.txt                                      0.2s
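A quick way to detect the miss automatically (a sketch, not from the original report) is to build with plain progress output and count the steps reported as cached; with two platforms and a warm cache, the RUN step should be reported as CACHED twice:

# Count CACHED steps in plain progress output (buildx logs go to stderr)
docker buildx build \
    --builder cachebug \
    --progress=plain \
    --cache-from type=registry,ref=$REGISTRY:buildcache \
    --platform linux/amd64 \
    --platform linux/arm64 \
    . 2>&1 | grep -c ' CACHED'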

Update (2021-08-18)

Repository to reproduce the issue: https://github.com/bozaro/buildkit-2279 (simply check out and run ./test.sh).

bozaro avatar Jul 26 '21 08:07 bozaro

I did some more investigation: it looks like the cache is corrupted before the upload stage. I compared the bad and good cache data (https://drive.google.com/drive/folders/1hzMWF_qBANvFmf3BeuQKe7KToGUOJp72?usp=sharing) and found a small difference.

Difference found

Good cache

cache_good contains two layers for the RUN step:

  • sha256:48ffe3fe97d4a7f3ad3e09f613161e6f1a4f6b836751f0f0c93c8fd5ea92064a (linux/arm64)
  • sha256:52af553f4ee5a60ea433453c95375e457f139988034d49244afcc64b08e3331e (linux/amd64)

Bad cache

cache_bad contains only one layer for the RUN step:

  • sha256:52af553f4ee5a60ea433453c95375e457f139988034d49244afcc64b08e3331e (linux/amd64)
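For reference, the layers an exported cache manifest actually contains can be listed straight from the registry (a sketch; it assumes jq is installed and that docker buildx imagetools inspect --raw prints the raw manifest of the buildcache tag):

# List the layer digests recorded in the exported cache manifest,
# so the good and bad exports can be diffed
docker buildx imagetools inspect --raw $REGISTRY:buildcache \
    | jq -r '.layers[].digest' | sort

Running this against both exports makes the missing arm64 RUN layer in cache_bad easy to spot.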

Layer graph

graph
good --> g_38cc3[38cc3: FROM arm64]
good --> g_b71f9[b71f9: FROM amd64]
g_38cc3 --> g_48ffe[48ffe: RUN]
g_b71f9 --> g_52af5[52af5: RUN]
g_b71f9 --> g_7c10d[7c10d: COPY]
g_52af5 --> g_7c10d
g_38cc3 --> g_7c10d
g_48ffe --> g_7c10d

bad --> b_38cc3[38cc3: FROM arm64]
bad --> b_b71f9[b71f9: FROM amd64]
b_b71f9 --> b_2b352[2b352: COPY]
b_38cc3 --> b_2b352
b_b71f9 --> b_52af5[52af5: RUN]
b_52af5 --> b_ca41e[ca41e: COPY]


bozaro avatar Jul 28 '21 07:07 bozaro

I modified script to check clean cache generation (without --cache-from):

#!/bin/bash
REGISTRY=registry.example.net/test-docker/example

# Recreate builder for clear local cache
docker buildx rm cachebug || true
docker buildx create --name cachebug --driver docker-container
docker buildx inspect cachebug --bootstrap

# Create some changed file
date > changed.txt

# Run with cache-to only
docker buildx build \
    --builder cachebug \
    --cache-to type=registry,ref=$REGISTRY:buildcache,mode=max \
    --platform linux/amd64 \
    --platform linux/arm64 \
    .

# Recreate builder for clear local cache
docker buildx rm cachebug || true
docker buildx create --name cachebug --driver docker-container
docker buildx inspect cachebug --bootstrap

# Create some changed file
date > changed.txt

# Run with cache-from only
docker buildx build \
    --builder cachebug \
    --cache-from type=registry,ref=$REGISTRY:buildcache \
    --platform linux/amd64 \
    --platform linux/arm64 \
    .

And this issue still reproduces randomly.

bozaro avatar Jul 28 '21 07:07 bozaro

The problem seems to occur when the RUN step produces the same blob for different platforms (the builds started, and /proc was created, within the same second) and the COPY step produces different blobs (the builds started, and /opt was created, in different seconds).
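To illustrate the timing claim (a sketch, not part of the original report): image layers are content-addressed tarballs, so two platforms produce the same RUN blob exactly when the file contents and timestamps inside match. This is easy to see locally with GNU tar:

# Two directories with identical content and a forced mtime yield
# byte-identical tarballs, hence identical layer digests
mkdir -p run1 run2
echo "Hello, world!!!" > run1/foo.txt
echo "Hello, world!!!" > run2/foo.txt
tar --mtime='2021-01-01 00:00:00' --owner=0 --group=0 -C run1 -cf run1.tar foo.txt
tar --mtime='2021-01-01 00:00:00' --owner=0 --group=0 -C run2 -cf run2.tar foo.txt
sha256sum run1.tar run2.tar   # same digest; change one file's mtime and they diverge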

So I modified the scripts to dramatically increase the rate at which the problem reproduces.

Dockerfile:

FROM busybox AS stage-0

FROM stage-0 AS stage-1
RUN echo "Hello, world!!!"; if [ "$(arch)" = "x86_64" ]; then sleep 2; fi
COPY changed.txt /opt/changed.txt

FROM busybox
COPY --from=stage-1 /opt/changed.txt /opt/copy.txt

Test script:

#!/bin/bash
REGISTRY=registry.example.net/test-docker/example

# Recreate builder for clear local cache
docker buildx rm cachebug || true
docker buildx create --name cachebug --driver docker-container
docker buildx inspect cachebug --bootstrap

# Create some changed file
date > changed.txt

# Build stage-0 to pull busybox image
docker buildx build \
    --builder cachebug \
    --platform linux/amd64 \
    --platform linux/arm64 \
    --target stage-0 \
    .

# Run with cache-to only
docker buildx build \
    --builder cachebug \
    --cache-to type=registry,ref=$REGISTRY:buildcache,mode=max \
    --platform linux/amd64 \
    --platform linux/arm64 \
    .

# Recreate builder for clear local cache
docker buildx rm cachebug || true
docker buildx create --name cachebug --driver docker-container
docker buildx inspect cachebug --bootstrap

# Create some changed file
date > changed.txt

# Run with cache-from only
docker buildx build \
    --builder cachebug \
    --cache-from type=registry,ref=$REGISTRY:buildcache \
    --platform linux/amd64 \
    --platform linux/arm64 \
    .

bozaro avatar Jul 28 '21 09:07 bozaro

I tried saving the stage-1, latest, and buildcache images and recovering the lost cache paths (https://drive.google.com/drive/u/0/folders/1pgE0BIQo6PlldnWSDUBtP99xcSvtQAlN).

graph
style copy_stage_amd64 stroke-dasharray: 5 5

root --> amd64[b71f9: FROM amd64]
root --> arm64[38cc3: FROM arm64]
amd64 --> copy_latest_amd64[62dd4: COPY latest, amd64]
amd64 -.-> run[1bb1a: RUN]
arm64 --> run
arm64 --> copy_latest_arm64[f6510: COPY latest, arm64]
run -.-> copy_stage_amd64[01cda: COPY stage-1, amd64]
run --> copy_stage_arm64[f253d: COPY stage-1, arm64]


In this case I got a single layer for the RUN command and different layers for the COPY command. The dashed links and layers were lost from the cache.
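To see which links and records actually survived an export, the cache config blob can be dumped from the registry (a sketch; it assumes jq plus crane from go-containerregistry, and that the manifest's config blob holds BuildKit's cacheconfig JSON with its layers/records graph):

REGISTRY=registry.example.net/test-docker/example
# Digest of the config blob referenced by the cache manifest
CONFIG=$(docker buildx imagetools inspect --raw $REGISTRY:buildcache | jq -r '.config.digest')
# Fetch the cache config blob and print the layer/record graph
crane blob $REGISTRY@$CONFIG | jq '{layers: .layers, records: .records}'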

bozaro avatar Jul 28 '21 11:07 bozaro

I independently encountered some bugs around the parallel creation of blobs for different platforms and have a fix out for PR here: https://github.com/moby/buildkit/pull/2273

Have not tried the reproducer here to see if it fixes the problem, but may be worth a try if anyone has time.

sipsma avatar Jul 28 '21 18:07 sipsma

I refactored the Dockerfile to produce a stable layer at the RUN step:

FROM alpine AS base
# The RUN command always produces a layer with digest `sha256:bcca0bb7...`
RUN echo "Hello, world!!!"; if [ "$(arch)" = "x86_64" ]; then sleep 2; fi; touch -d 2021-01-01 -t 00:00:00 /foo.txt
# The COPY command produces a different layer per platform because /opt has a different change time
COPY changed.txt /opt/changed.txt

FROM alpine
COPY --from=base /opt/changed.txt /opt/copy.txt

I also removed the stage-0 build from the script. But docker buildx build still randomly produces a cache with an incomplete layer graph (I got 53 failures in 100 runs).
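A loop like the following is enough to gather that statistic (a sketch; it assumes the test script exits non-zero when the cached RUN layer is lost):

fail=0
for i in $(seq 1 100); do
    ./test.sh >/dev/null 2>&1 || fail=$((fail + 1))
done
echo "FAILED $fail OF 100"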

bozaro avatar Jul 29 '21 06:07 bozaro

I independently encountered some bugs around the parallel creation of blobs for different platforms and have a fix out for PR here: #2273

Have not tried the reproducer here to see if it fixes the problem, but may be worth a try if anyone has time.

I built #2273 locally, and it looks like it does not fix this issue.

bozaro avatar Jul 29 '21 06:07 bozaro

I built #2273 locally, and it looks like it does not fix this issue.

Ah okay, good to know, thanks for giving it a try

sipsma avatar Jul 29 '21 06:07 sipsma

I created a script to reproduce the issue in one click: https://github.com/bozaro/buildkit-2279

bozaro avatar Aug 18 '21 08:08 bozaro

I rechecked the state of this issue with the https://github.com/bozaro/buildkit-2279 script, and it is no longer flaky. It now fails consistently:

e134e8a1d9522d24917077a359c77e7428586adb7be11d16d4c886902c2ecb3b
Wait for service on endpoint: http://localhost:5001
bd194edf668ff7df00eedecb5f9b346af23405cc4e7bf0ce12f91b6aefc59da9
Wait for service on endpoint: http://localhost:5002
Check is build for multiple platform supported...
Run tests...
[1] FAIL
...
[100] FAIL
Cleanup docker images...
cachebug_registry
cachebug_proxy
Show statistics...
FAILED 100 OF 100

bozaro avatar Oct 26 '21 07:10 bozaro

Given the resolution of #1981, could you try upgrading your buildx plugin to version 0.8.2?

n1ngu avatar Jul 21 '22 14:07 n1ngu

Has this issue progressed any further, or is there still no known fix? How are people working around this?

benlangfeld avatar Aug 18 '22 01:08 benlangfeld

My fix was simply not to use multi-stage builds, but that is of course not a useful solution for everybody.

ghost-of-montevideo avatar Aug 31 '22 10:08 ghost-of-montevideo

@n1ngu I tried upgrading and the issue persists; multi-stage is broken.

ghost-of-montevideo avatar Aug 31 '22 10:08 ghost-of-montevideo

OK @ghost-of-montevideo, just check that you are not actually hitting the confusingly similar #2274.

I was under the impression that the present issue is fixed and only affects the outdated buildkit vendored out of the box within docker, as I was unable to reproduce it when using an updated buildkit plugin and the docker-container build driver. Unlike the present issue, the reproduction of #2274 is consistent for me.

n1ngu avatar Sep 02 '22 16:09 n1ngu

@n1ngu The reproduction in https://github.com/bozaro/buildkit-2279 shows that with buildx version 0.9.1 and moby/buildkit:0.10.4 the failure rate is 100% on this issue. Can you show how you were able to validate that this is working, as stated in your last comment?

benlangfeld avatar Oct 11 '22 13:10 benlangfeld

@n1ngu The reproduction in https://github.com/bozaro/buildkit-2279 shows that with buildx version 0.9.1 and moby/buildkit:0.10.4 the failure rate is 100% on this issue. Can you show how you were able to validate that this is working, as stated in your last comment?

@benlangfeld I can't access the build environment I used at the time, but I now think the culprit is multi-stage builds, as @ghost-of-montevideo pointed out. All my builds were single-stage, so they worked fine. I just feel this isn't highlighted enough in the issue description.

n1ngu avatar Oct 13 '22 11:10 n1ngu

I just feel this isn't highlighted enough in the issue description.

The initial issue message has a "Steps to reproduce" section. What additional highlighting do you expect?

slonopotamus avatar Oct 13 '22 14:10 slonopotamus

The initial issue message has a "Steps to reproduce" section. What additional highlighting do you expect?

Given the context of #1981 and #2274, I'd expect the "multi-stage build triggers the problem, single-stage build works fine" bit of information (not literally) to be both in the title and the description, not only in the reproduction source code.

n1ngu avatar Oct 14 '22 09:10 n1ngu

Is there a solution to this yet? I am also using multistage builds and encountering this issue.

kkmoslehpour avatar Mar 17 '23 23:03 kkmoslehpour

Is there anybody working on this issue, and is there any way others can help? Either this or the related https://github.com/moby/buildkit/issues/2274

neilsh avatar Jul 12 '23 16:07 neilsh

I'm also happy to help a bit; if there is a way we can contribute, let us know.

ghost-of-montevideo avatar Jul 13 '23 09:07 ghost-of-montevideo

if there is a way we can contribute, let us know

You can debug the code, find the root of the issue, fix it, and submit a PR.

slonopotamus avatar Jul 13 '23 11:07 slonopotamus

I spent a while on this issue and could not find a great resolution using buildkit, and there are lots of unsolved issues about this across the web. After seeing https://github.com/moby/buildkit/issues/1981#issuecomment-1516704608 and another report on SO, I switched our GitLab Cloud CI/CD pipelines to use buildah instead of buildkit, and caching is working well now.

It's (almost) a drop-in replacement, and with overlay2 as the storage driver, build performance (with caching disabled) seems to be the same as with buildkit.

Here's a sample of what that could look like in a .gitlab-ci.yml file:

.build_template: &build_template
  image: quay.io/buildah/stable
  before_script:
    - buildah login [Login Params] [Registry URL]
  script:
    - buildah build
      --cache-from registry.example.com/myproject/myapp
      --tag registry.example.com/myproject/myapp:${CI_COMMIT_SHA}
      --cache-to registry.example.com/myproject/myapp
      --layers
      --storage-driver overlay2
      -f $DOCKERFILE_PATH $BUILD_CONTEXT_PATH
    - buildah push registry.example.com/myproject/myapp:${CI_COMMIT_SHA}

alexwilson1 avatar Dec 18 '23 16:12 alexwilson1

I rechecked the state of this issue with the https://github.com/bozaro/buildkit-2279 script, and it is no longer flaky. It now fails consistently:

e134e8a1d9522d24917077a359c77e7428586adb7be11d16d4c886902c2ecb3b
Wait for service on endpoint: http://localhost:5001
bd194edf668ff7df00eedecb5f9b346af23405cc4e7bf0ce12f91b6aefc59da9
Wait for service on endpoint: http://localhost:5002
Check is build for multiple platform supported...
Run tests...
[1] FAIL
...
[100] FAIL
Cleanup docker images...
cachebug_registry
cachebug_proxy
Show statistics...
FAILED 100 OF 100

I tried your script, and I only got 100 of 100 failures once I had exhausted my request limit to Docker Hub. Were you logged in when you tried?

fermixbook avatar Mar 13 '24 00:03 fermixbook