[Bug]: /artifacts/build.sh times out attempting to load .dockerignore & Dockerfile
Description
Using Coolify Cloud & v4.0.0-beta.323
All deployments are suddenly stuck and eventually time out after a long while:
[2024-Aug-20 10:05:11.823831] ----------------------------------------
[2024-Aug-20 10:05:11.833426] Building docker image started.
[2024-Aug-20 10:05:11.839621] To check the current progress, click on Show Debug Logs.
[2024-Aug-20 10:05:17.033470]
[COMMAND] docker exec jw4k848wo8s0og0k8kgo08s8 bash -c 'bash /artifacts/build.sh'
[OUTPUT] #0 building with "default" instance using docker driver #1 [internal] load .dockerignore
[2024-Aug-20 10:05:26.929672]
[COMMAND] docker exec jw4k848wo8s0og0k8kgo08s8 bash -c 'bash /artifacts/build.sh'
[OUTPUT] #1 ... #2 [internal] load build definition from Dockerfile
Revalidating the server works fine but doesn't fix the issue. I've raised this issue in the Discord, and one more user has reported the same sudden issue so far.
Minimal Reproduction (if possible, example repository)
Deploying any Dockerfile on Coolify Cloud (tried Nixpacks as well)
Exception or Error
No response
Version
v4.0.0-beta.323
Cloud?
- [X] Yes
- [ ] No
Your disk is probably full. Can you please check?
The next version of Coolify will have a fix for the automated docker cleanup job.
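For anyone else checking this, a minimal sketch of that comparison between the host's view of the disk and Docker's own accounting (the docker call is guarded so the snippet still works where the CLI is absent):

```shell
# Host view of disk usage on the root filesystem.
df -h /

# Docker's own accounting of images, containers, volumes and the build
# cache. Skipped when the docker CLI is not installed; `|| true` keeps the
# script from failing if the daemon is down.
if command -v docker >/dev/null 2>&1; then
  docker system df || true
fi
```

Note that `df -h` alone can look healthy while the builder cache (which `docker system df` reports separately) is the thing that has ballooned.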
Thank you for the reply.
Looks like disk still has space available:
Filesystem Size Used Avail Use% Mounted on
udev 7.8G 0 7.8G 0% /dev
tmpfs 1.6G 2.8M 1.6G 1% /run
/dev/vda1 310G 118G 193G 39% /
tmpfs 7.9G 0 7.9G 0% /dev/shm
tmpfs 5.0M 0 5.0M 0% /run/lock
tmpfs 7.9G 0 7.9G 0% /sys/fs/cgroup
/dev/loop0 64M 64M 0 100% /snap/core20/2182
/dev/loop1 64M 64M 0 100% /snap/core20/2318
/dev/loop2 92M 92M 0 100% /snap/lxd/24061
/dev/vda15 105M 6.1M 99M 6% /boot/efi
/dev/loop3 40M 40M 0 100% /snap/snapd/21184
/dev/loop4 39M 39M 0 100% /snap/snapd/21759
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/7ba1faa39a6ac5dabf2f2c2fac5677a0ebe580ecb654a252d929be72203c6643/merged
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/01a7d46c016c38c01670bb722f48cd33574267b2390d24a12de9b133cf97e951/merged
/dev/loop5 92M 92M 0 100% /snap/lxd/29619
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/3d772f546dd4eb603c40a7e4fbbecc21b289d33eb30ba3b354f18243fa8e37d3/merged
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/0f3721b4f55feb89340f39c8ebf615803dd408f5154bd56ce0e070400191c76e/merged
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/eb872e21f48069e8b92c28b8993f366603e47221b06f4a09a3ea399c1e8117db/merged
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/f7a3ddbe0791bede45fd61ebd53434bd27309b530c46264030e103e2abd9edb7/merged
tmpfs 1.6G 0 1.6G 0% /run/user/0
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/8ad36a85765469dff0fd03aa03d2ed0cf734d29988b4bd70f3dda74c306484cc/merged
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/d99ef68cebf6c624e124cdbf9a46cd57c7f3d31594549b311a22c10e54719067/merged
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/3ac46da71a71e6853321ea6855ef95968d4e4d3f0f74a7cd2421c871fb8cce65/merged
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/b3ffcfcb7e9e3a3cecfc5033d2a1ca8074417c3fe715b7d0e00791175f26d986/merged
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/89ee4cada862b4dafd1d0660ba9b9961ccac150f83158de03012e6a3bab40bee/merged
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/70f89420c1febca13586b70389f71e9763b49ea46e81774c63638ffda4ad4cbd/merged
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/d179198c9ebb4ebd0518a2f34aa748ccb5482e03a0f67362de55bbda10a89b3b/merged
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/7a3fae7d7e71a81514b8d0b41143f5ebcc07720a793a3ed127596aaa8d158aa7/merged
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/dde7752463e0e6c5e7ccf54e45986731bcca45e154331d834bd7ce4589d9897e/merged
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/bd069e4a582f5ca84e1f80a78c1d618f94d89c4f0c9d71e26c7157da393c1fd3/merged
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/bec2253093f5a06312c8178af76f91dfbc06ddc818f7724c8a1470877d78048c/merged
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/a8d339756f30d67dea2c803f23184ec49a7e443d0cad0e5a8308aa590ea69a36/merged
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/7ad524f43d8f5a70ac531f58cfcba3193a54a5a6ef10a56be3164b0d32f562c0/merged
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/483bd5941fbd5cd5ee1bbd9ef8d1b43911681ca2067ed3b771534db678c45a23/merged
overlay 310G 118G 193G 39% /var/lib/docker/overlay2/8138a2f20297451449cd5526650092f62a57a3a86fa66a1c1d16f43d2f6ce5d7/merged
I had a similar issue, and resolved it by adding swap space to my server. Turns out I was running out of memory.
Thank you @CodeCooler. I've just tried adding swap space following this guide: https://www.digitalocean.com/community/tutorials/how-to-add-swap-space-on-ubuntu-20-04
But no luck fixing this issue. I will restart the docker engine now, as Andreas suggested to me.
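For reference, the steps from that guide boil down to roughly the following (the 2G size is an assumption; size it to your RAM, and run as a sudo-capable user):

```shell
# Allocate and secure a 2 GB swap file.
sudo fallocate -l 2G /swapfile
sudo chmod 600 /swapfile

# Format it as swap and enable it immediately.
sudo mkswap /swapfile
sudo swapon /swapfile

# Persist across reboots.
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab

# Verify it is active.
swapon --show
free -h
```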
I ended up having to completely remove docker & reinstall via coolify. Docker ended up being unresponsive and failing to start again.
Maybe this setting should be enabled by default, @andrasbacsai, to prevent this issue from happening? I had about 100G of old docker files.
Hey, I am facing the same issue. Stuck at "load .dockerignore"
Using docker compose and beta 323
My disk is not full, and I have enabled force cleanup.
root@svr-muc1-doc-kubminion-001:~/.ssh# df -h
Filesystem Size Used Avail Use% Mounted on
udev 32G 0 32G 0% /dev
tmpfs 6.3G 4.6M 6.3G 1% /run
/dev/md0p1 900G 585G 270G 69% /
tmpfs 32G 0 32G 0% /dev/shm
tmpfs 5.0M 0 5.0M 0% /run/lock
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/008ce4afaa5e6539f7c318a160dae45f5531b41f43701b1bf50b89c1adde25f2/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/b71997adc631846ae5bb7dc4fb2f65aecf67c32bdc5e2bb988030dfc49476866/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/51cb076558dd6607ecd97843f1911484fa02d4f07a8cc6e904f8ce0df2ccfb01/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/4c775e2278b10048c17384989d91630ed3ca71848a0c906c46b87f8b3b8a8445/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/dc30412d311cdbe3c3f3328006d59759c96cc44bd7526731a0395814ed4be25f/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/a385b62e03a5ce784861e155eef6d0cb7aca1c73543d81601e0623d3ee07db82/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/f892e172a39197c16eb3b427ac134f4e7fd13545ecd080d497a2031d70195881/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/2dbfce0bafa80ad8b353f871fe675916c1b3afae3d2adcd3f6c1e943a697cbc6/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/89aa12c9c28f6c973cc5e25f8e9ccb5b9202e20533316eb45b4beac68a6d2855/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/10ebd3502d50beb5fc1276fa8e188f56e900fe9997433bfd35bd9e8dca3d1ddc/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/8e9c28f5cec932c8d905bfae829e77bc21e0f06051eb036f7bc9de9c2ceaf8c9/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/d46402ad3e37f94a82b03f5d8ffd30689a147729ff3afcc92c9a5f38dd574257/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/de3f82bd2eca7c465dafdd1be39d8d5e39c5d68e4ef0271d9c1ec25bfb1a2a46/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/9bd11899c6b3233cfd5ca1402d19eab81a4387dd99f355d80b1b202fcb563d76/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/af9afbc98b184757fd56523344a70870c6956873d91d36b02c1e017790b55bd2/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/e0c7dab4a1582977f4bd09e16f0bd25aeecc4e7018d31eaae269f09ae7c87f5d/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/d5fbf1ff4d4aafbaecdedc66d7412b942328d37741d754005d930f9f0ecaa059/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/a84b4ad8904d6fa547a454af7ada2798a04675213fe0317b2552037726c9a5e0/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/fb8ff8151d9ed0b341676c359f524e85a2ff8858a8cc3297550920921636c3b7/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/380e1c3376e671247489b1e41e4ddca2365ec411953e6e0b743f781457dcd211/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/7e06cca3f4ccf5a60e0f1f0c9a7d973fbdb85751d9a59eaf0cd7924c01160bdf/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/81964f5437672273409fe4a6b0d5c171ef6d4a43710237996683a32241d6637f/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/2d39d823e4ef0cd4dd909a9e5980ecc46e3df971da606acec303ef91d7e4e59d/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/f39df1611e8f5dc7c03805f4df43193415fdad78265d5bc98758af1fbb5d0b50/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/3464180aff1eeb24d4347dc498b9b8b4df12e08d09973232237430c16bc685d0/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/985bbc7386b19ca7dc710744a8a427ae6690be5b15da1f33d80d974a75ee43c3/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/cc419de097ed663687879370308fb02bde4801fc4f001b9038f80ce69606013d/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/80219c028a070688684fd08f3bed444ca3c8f44b42101f7394460bc270ff3f57/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/76088cec4fbee0678de0db2022383ae106339b6e98d6f8fc5aa26db2c5bc7f4b/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/fba9b26204dfc4789e5fe42253025da84d60629d2c48f9b4d8fd4898865460bb/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/3ba9e2a9ef3111253afe28a7bec9ee8552bdce002680376d80c3ef55e3182bb5/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/835f595b5592119ab5d9df1af2d7e5bf02ab304d0b231fde03a109fb851a0ffd/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/2e107347fac73dce535e16fd766f8895b12ee12c6fe03d0571e9a2151b5976b4/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/74b101e4b26b1e5ab4f5e85c58ef5f9f9a2124e6ab116f941263e530d5e3435b/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/6695d5b8a6d4f7d51834d5af56c192b9944cd1fce8f4dd75f5280b3dba574f17/merged
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/606acf0c18a4598a4bcb01a49a3d02aaca6a4aa06be280661c7f69e1c25c5f7a/merged
tmpfs 6.3G 4.0K 6.3G 1% /run/user/1002
tmpfs 6.3G 4.0K 6.3G 1% /run/user/1000
overlay 900G 585G 270G 69% /var/lib/docker/overlay2/95139e10f88bae42f3f3a79cf67d1c059de80b10753a9a765eeac6f0086b928a/merged
@adaichendt-tv1 I was talking to Andras about this issue on Stream some days ago and he mentioned that the only way to fix this right now is to restart docker.
oof, thanks for the tip!
This is a production environment, so restarting is not something I can do. It is our current docker machine, which I added to Coolify to orchestrate our containers from there in the future.
Is restarting a one-time fix or is there a possibility that I need to restart more frequently?
Do we have an understanding of the root cause? Could you perhaps link the timecode of the stream here for future documentation?
I am facing the same issue as @adaichendt-tv1. Is there a less radical way than @tomcru’s?
When I log on to the server and try docker builder prune -af the process seems to be running, but nothing happens. I stopped it after 30 minutes. Same with docker system df.
Is there a way to restart Docker without longer down-time and/or destroying anything?
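One way to at least tell whether the daemon still answers before attempting anything heavier is a guarded probe; a sketch using coreutils timeout(1) (the 10-second limit is an arbitrary choice):

```shell
# Probe whether the Docker daemon responds at all. If even `docker version`
# cannot complete within 10 seconds, heavier commands like
# `docker builder prune` will almost certainly hang too.
probe_docker() {
  if timeout 10 docker version >/dev/null 2>&1; then
    echo "daemon responsive"
  else
    echo "daemon hung or unreachable"
  fi
}

probe_docker
```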
Same here @AndreasFaust. If your commands take forever, it's likely that your docker will also not restart - this happened to me and led to having to reinstall everything, with downtime :(
Very worried it will happen again. @andrasbacsai did you possibly have a chance to look into this yet?
This is indeed worrying. I have multiple servers with multiple projects. To reinstall everything would be painful.
I was able to restart the docker service recently, which resolved the problem. A simple systemctl restart docker.service. It causes downtime tho as all the containers will be restarted.
Thank you, that sounds promising! So we assume Coolify has some kind of memory leak?
I'm not convinced this is a bug of coolify. It might just be something docker related. I don't have any insight into coolify tho, hence I was wondering about a timestamp of that stream where it was discussed to form a more complete picture of the root cause of this issue.
As I checked, the builder cache of Docker builds up to a point where even Docker cannot handle it. So you need to restart the Docker Engine, and then clean up (docker builder prune) manually.
Probably Nixpacks fills up the cache this much. ☹️
Thank you, @andrasbacsai ! Fortunately this is my only project using Nixpacks, if that’s the source of the problem. Is there a way to prevent this? Like Coolify providing a build-cache-monitor and warning before things get out of control?
… or a routine that deletes the build cache when it reaches a certain threshold?
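Such a routine could be sketched roughly like this. The 20 GB threshold and the reliance on `docker system df`'s table layout are assumptions on my part, not something Coolify ships:

```shell
#!/bin/sh
# Cron-able guard: prune the Docker build cache once it crosses a threshold.

THRESHOLD_GB=20

# Read `docker system df` output on stdin and print the build-cache size
# as whole gigabytes (sizes reported in MB/kB/B are treated as 0).
build_cache_gb() {
  awk '/^Build Cache/ {
    size = $5
    if (size ~ /GB$/) { sub(/GB$/, "", size); printf "%d\n", size }
    else print 0
  }'
}

# Only act when the docker CLI is present and the daemon answers.
if command -v docker >/dev/null 2>&1; then
  cache_gb=$(docker system df | build_cache_gb)
  if [ "${cache_gb:-0}" -ge "$THRESHOLD_GB" ]; then
    docker builder prune -af
  fi
fi
```

The caveat discussed above still applies: once the cache is already huge, `docker system df` itself may hang, so a guard like this only helps if it runs before the cache gets out of control.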
I don't think the automated docker cleanup job is working properly. I just manually ran docker image prune -a and removed 40 gigs of images that were piling up again.
Before this I also manually killed 20 containers that were just running endlessly - they weren't properly terminated by coolify (I believe they were automated preview deployments)
In the latest version, we changed the force cleanup to be automatic on every server by default.
Cloud will get the latest version soon.
Hi. I'm on the latest beta (335) and 2 apps are stuck at "load build definition from Dockerfile". I have enough disk space. This is a production server, so I can't restart docker. @andrasbacsai Any suggestions? Thanks.
@GautierT There will be a button in the next version so you can run the Cleanup with one click https://github.com/coollabsio/coolify/pull/3545
Isn't it a cache size problem? I have the same problem: I only use 60% of my SSD (320GB available). Commands like prune and docker system df hang. I restarted docker and then ran docker builder prune -a.
@biboc Yes, the Coolify docker cleanup process also cleans up the builder cache (docker builder prune -af).
The problem is that this command also hangs, though. Running it manually did not work in the cases that I and others were reporting, leaving a restart or reinstall as the only option.
I think the issue is that the automatically running cleanup job from coolify was not working properly, leading to the huge cache, which in turn meant no commands worked.
Yes, I heard there is a docker bug where only restarting helps; there's not much we can do on our end.
But shouldn't the docker cache be cleared automatically by the job that coolify is running already? Or what could cause the cache to suddenly accumulate?
Andras mentioned a fix to the cleanup job in one of his first replies, so this is possibly already fixed. I just haven't gotten an update on the cloud version since.
Yes, it is fixed, and the cache will be cleaned by the coolify job if force cleanup is enabled, or soon via a button click to run it manually. The problem is that if the build cache exceeds a certain amount, docker cannot handle it anymore, so the only fix is to restart docker. See the comment here https://github.com/coollabsio/coolify/issues/3150#issuecomment-2326573453
@peaklabs-dev : this is still happening on v4.0.0-beta.345...
It was okay for a few days, and now it's stuck on #2 [internal] load build definition from Dockerfile
Thanks.
I am starting to have those issues too, with v4.0.0-beta.345. I haven't encountered this issue before.
2024-Sep-26 18:07:41.995927
#1 [internal] load build definition from Dockerfile
2024-Sep-26 18:07:51.894936
#1 ...
2024-Sep-26 18:07:51.894936
2024-Sep-26 18:07:51.894936
#2 [internal] load .dockerignore