cloud-builders-community
save_cache slow
Running save_cache takes ~40 seconds to produce a final archive of ~175 MB. That seems quite slow to me.
Here's my output:
Pulling image: gcr.io/XXX/save_cache
Using default tag: latest
latest: Pulling from XXX/save_cache
75f546e73d8b: Already exists
0f3bb76fc390: Already exists
3c2cba919283: Already exists
5a992b2091ae: Already exists
a03c90d5aa70: Already exists
1344b72882b8: Already exists
02f5215030de: Already exists
3f56bf5d454b: Already exists
Digest: sha256:b19c124096ecc3ae3aab533a329503489c46e7bfe6a2a87d805c1a12868b053f
Status: Downloaded newer image for gcr.io/XXX/save_cache:latest
gcr.io/upheld-hope-240615/save_cache:latest
Compressing cache to ./node_modules.tgz...
Uploading cache to Google Cloud Storage...
Copying file://./node_modules.tgz [Content-Type=application/x-tar]...
/ [0 files][ 0.0 B/175.5 MiB]
Operation completed over 1 objects/175.5 MiB.
Do you have any ideas how I can speed that up?
I'd recommend trying without gzip and seeing if you get a speedup. In my tests, the time to compress and upload a .tgz was actually longer than the total time to simply tar and upload the larger uncompressed file. The tradeoff is that you use more space in the bucket, but... 🤷‍♂️
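A quick local sketch of that comparison, in case it helps: it builds a dummy `node_modules` directory, then times a gzipped archive against a plain one. The directory contents, file names, and the bucket path in the comment are placeholders, not anything from save_cache itself.

```shell
set -e

# Create a dummy cache directory with some incompressible content,
# so both archives take measurable work to build.
mkdir -p node_modules
head -c 1048576 /dev/urandom > node_modules/blob.bin

# Gzipped archive: what the cache step does today (CPU-bound compression).
time tar -czf node_modules.tgz node_modules

# Plain tar: larger file on disk, but no CPU spent compressing.
time tar -cf node_modules.tar node_modules

ls -l node_modules.tgz node_modules.tar

# Then upload the uncompressed archive instead, e.g.:
#   gsutil cp node_modules.tar gs://YOUR_BUCKET/cache/node_modules.tar
# (gs://YOUR_BUCKET is a placeholder for your own cache bucket.)
```

For large uploads, `gsutil -o GSUtil:parallel_composite_upload_threshold=150M cp ...` may also help, since it splits the object into parallel chunks, though whether it wins depends on the builder's network.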