You may need to run `helm dependency build` to fetch missing dependencies:
Checklist:
- [x] I've searched in the docs and FAQ for my answer: https://bit.ly/argocd-faq.
- [x] I've included steps to reproduce the bug.
- [x] I've pasted the output of `argocd version`.
Describe the bug

A Helm chart deployed via kustomize fails with a `you may need to run helm dependency build` error.

Running `kustomize build` locally gave the same error until I ran `helm dep build`; once the Chart.lock and .tgz files were added, `kustomize build` no longer failed locally.

I've tried adding Chart.lock to the repo, but that did not change the outcome. Adding the .tgz file to the repo did solve the issue, but I wonder whether that is actually the intended behavior. Is it mandatory to commit the dependency .tgz files for this to work?
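The workaround described above can be sketched as a couple of commands (a minimal sketch; paths assume the layout shown in the reproduction steps, and it requires `helm` and network access to the chart repo):

```shell
# Run from base/charts/external-secrets, where Chart.yaml lives.
# helm resolves the dependency from charts.external-secrets.io and writes
# Chart.lock plus charts/external-secrets-0.6.1.tgz.
helm dependency build

# Committing the .tgz (not just Chart.lock) is what made
# `kustomize build --enable-helm` succeed in this report.
git add Chart.lock charts/*.tgz
git commit -m "Vendor external-secrets chart dependency"
```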
To Reproduce

base/kustomization.yaml
```yaml
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - ./namespace.yaml
helmGlobals:
  chartHome: charts
helmCharts:
  - name: external-secrets
    releaseName: external-secrets
    namespace: external-secrets
    valuesFile: charts/external-secrets/values.yaml
```
base/namespace.yaml
```yaml
apiVersion: v1
kind: Namespace
metadata:
  name: external-secrets
```
overlays/dev/kustomization.yaml
```yaml
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - ../../base
```
base/charts/external-secrets/Chart.yaml
```yaml
apiVersion: v2
name: external-secrets
version: 0.6.1
dependencies:
  - name: external-secrets
    version: 0.6.1
    repository: https://charts.external-secrets.io
```
base/charts/external-secrets/values.yaml
```yaml
external-secrets:
  serviceMonitor:
    enabled: true
```
Added `kustomize.buildOptions: --enable-helm` to the `argocd-cm` ConfigMap.
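For reference, the equivalent entry in the `argocd-cm` ConfigMap looks roughly like this (the `argocd` namespace is an assumption):

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: argocd-cm
  namespace: argocd
data:
  kustomize.buildOptions: --enable-helm
```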
Added the external-secrets repo config to the Argo CD Helm values.yaml:

```yaml
configs:
  repositories:
    external-secrets:
      url: https://charts.external-secrets.io
      name: external-secrets
      type: helm
```
Expected behavior

external-secrets gets deployed to the cluster.
Version
```
argocd: v2.5.3+0c7de21
  BuildDate: 2022-11-28T17:11:59Z
  GitCommit: 0c7de210ae66bf631cc4f27ee1b5cdc0d04c1c96
  GitTreeState: clean
  GoVersion: go1.18.8
  Compiler: gc
  Platform: linux/amd64
argocd-server: v2.5.3+0c7de21
  BuildDate: 2022-11-28T16:51:33Z
  GitCommit: 0c7de210ae66bf631cc4f27ee1b5cdc0d04c1c96
  GitTreeState: clean
  GoVersion: go1.18.8
  Compiler: gc
  Platform: linux/amd64
  Kustomize Version: v4.5.7 2022-08-02T16:35:54Z
  Helm Version: v3.10.1+g9f88ccb
  Kubectl Version: v0.24.2
  Jsonnet Version: v0.18.0
```
Logs
```
ComparisonError: rpc error: code = Unknown desc = `kustomize build .overlays/dev --enable-helm` failed exit status 1: Error: accumulating resources: accumulation err='accumulating resources from '../../base': '.base' must resolve to a file': recursed accumulation of path '.base': Error: An error occurred while checking for chart dependencies. You may need to run `helm dependency build` to fetch missing dependencies: found in Chart.yaml, but missing in charts/ directory: external-secrets : unable to run: 'helm template external-secrets --namespace external-secrets .base/charts/external-secrets --values /tmp/kustomize-helm-2338096147/external-secrets-kustomize-values.yaml' with env=[HELM_CONFIG_HOME=/tmp/kustomize-helm-2338096147/helm HELM_CACHE_HOME=/tmp/kustomize-helm-2338096147/helm/.cache HELM_DATA_HOME=/tmp/kustomize-helm-2338096147/helm/.data] (is 'helm' installed?)
```
I'm facing the same issue, even with purely Helm applications.
I'm seeing a lot of these errors in the repo-server. I use Helm charts that have some dependencies in Chart.yaml. The errors seem to happen only during the day, when our dev team deploys apps via CI/CD; however, the error shows up for Applications that are not even being deployed. So we run CI/CD for App1 and see the error for App2, which hasn't been deployed for days. The commands we use in CI/CD are:
- argocd app diff
- argocd app set
- argocd app sync
- argocd app wait

The Argo UI shows all the apps in "Synced" state anyway, but from time to time this error happens during CI/CD on `argocd app sync` and then the whole pipeline crashes. Argo CD 2.4.18. Worth mentioning: we use "sops" as a wrapper for helm.
The repo-server logs show that it happens on the `helm template` command.
I'm seeing that same error message for a chart that is configured with a local dependency like this:
```yaml
apiVersion: v2
name: myApp
description: myApp
type: application
version: 1.0.0
appVersion: 1.0.0
dependencies:
  - name: service
    version: 0.1.0
    repository: file://../../helm-charts/service
```
My folder structure is:

```
project
-- apps
---- myApp
------ Chart.yaml
-- helm-charts
---- service
------ Chart.yaml
```
I'm getting:
```
rpc error: code = Unknown desc = Manifest generation error (cached): `helm template . --name-template ...
[...]
An error occurred while checking for chart dependencies.
You may need to run `helm dependency build` to fetch missing dependencies: found in Chart.yaml, but missing in charts/ directory: service
```
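For comparison, a minimal local sketch of the command the error message asks for (paths taken from the folder structure above; requires `helm`):

```shell
# Running this locally resolves the file:// dependency: helm copies
# ../../helm-charts/service into apps/myApp/charts/ and writes Chart.lock.
cd project/apps/myApp
helm dependency build

ls charts/   # now contains the vendored service chart
```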
My versions:
```json
{
  "Version": "v2.5.7+e0ee345",
  "BuildDate": "2023-01-18T02:23:39Z",
  "GitCommit": "e0ee3458d0921ad636c5977d96873d18590ecf1a",
  "GitTreeState": "clean",
  "GoVersion": "go1.18.10",
  "Compiler": "gc",
  "Platform": "linux/amd64",
  "KustomizeVersion": "v4.5.7 2022-08-02T16:35:54Z",
  "HelmVersion": "v3.10.3+g835b733",
  "KubectlVersion": "v0.24.2",
  "JsonnetVersion": "v0.18.0"
}
```
I have the same thing working with:
```json
{
  "Version": "v2.5.4+86b2dde",
  "BuildDate": "2022-12-06T19:46:25Z",
  "GitCommit": "86b2dde8e4bf1187acd2b4294e94451cd104dad8",
  "GitTreeState": "clean",
  "GoVersion": "go1.18.8",
  "Compiler": "gc",
  "Platform": "linux/amd64",
  "KustomizeVersion": "v4.5.7 2022-08-02T16:35:54Z",
  "HelmVersion": "v3.10.1+g9f88ccb",
  "KubectlVersion": "v0.24.2",
  "JsonnetVersion": "v0.18.0"
}
```
Update: I downgraded Argo CD on my cluster to version v2.5.4+86b2dde, the one I have on the other cluster where the same configuration works fine, and after deploying my app I'm still getting the same error 🤷🏻‍♂️ FFS!

@crenshaw-dev I'd appreciate it if you could at least have a look and maybe suggest something 🙏🏻
Just guessing that in my case it happens because Argo CD is simply executing kustomize, and kustomize does not work without `helm dependency build` having been run first.
I found this gist which shows how to write a custom plugin. Can't get it to work, but thought it might help others: https://github.com/argoproj/argocd-example-apps/blob/master/plugins/kustomized-helm/README.md
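Based on that linked kustomized-helm example, a Config Management Plugin sketch would look roughly like this (the old `configManagementPlugins` format in `argocd-cm` is assumed; the name and args are illustrative, not the exact contents of the linked README):

```yaml
data:
  configManagementPlugins: |
    - name: kustomized-helm
      init:
        command: ["sh", "-c"]
        args: ["helm dependency build"]
      generate:
        command: ["sh", "-c"]
        args: ["helm template --release-name $ARGOCD_APP_NAME . > all.yaml && kustomize build"]
```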
@AurimasNav
Yes, I think Argo CD simply executes `kustomize build --enable-helm` internally.

Is the following manifest not enough for your use case? Otherwise, I'm afraid we'd have to use a Config Management Plugin, as @audleman suggested.
```yaml
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - ./namespace.yaml
helmCharts:
  - name: external-secrets
    repo: https://charts.external-secrets.io
    releaseName: external-secrets
    namespace: external-secrets
    valuesFile: external-secrets-values.yaml
```
It seems that `kustomize build --enable-helm` runs `helm pull --repo ${helmCharts.repo}` internally to pull the chart locally, and then templates it. If there is a local chart directory, kustomize skips `helm pull`, so you got the error you mentioned.
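Roughly, the behavior described above (a sketch of what kustomize does internally, not the exact commands it runs; requires `helm` and network access):

```shell
# For a remote `repo:` entry, kustomize pulls the chart first...
helm pull --untar --untardir charts \
  --repo https://charts.external-secrets.io external-secrets

# ...then templates the now-local copy.
helm template external-secrets charts/external-secrets \
  --namespace external-secrets --values external-secrets-values.yaml

# With a pre-existing local chart directory, the pull step is skipped,
# and any unresolved Chart.yaml dependencies surface as the
# `helm dependency build` error.
```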
Any updates? Do we really need a plugin? I'm getting the same errors with local Helm charts, without kustomize.
Facing the same issue while using app of apps pattern in pure helm.
> I'm seeing that same error message for a chart that is configured with a local dependency like that:
>
> ```yaml
> apiVersion: v2
> name: myApp
> description: myApp
> type: application
> version: 1.0.0
> appVersion: 1.0.0
> dependencies:
>   - name: service
>     version: 0.1.0
>     repository: file://../../helm-charts/service
> ```
Mine is similar and I face the same issue. Everything deploys just fine, but the logs are flooded with this error.
Same, using argocd 2.10, just with helm.
Same here when I tried to run the GitLab Helm chart on v2.11.3+3f344d5.
Same here! Upvote.
I solved it by increasing the Argo CD exec timeout (`ARGOCD_EXEC_TIMEOUT`).
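For anyone trying this: `ARGOCD_EXEC_TIMEOUT` is an environment variable on the repo-server. A sketch of where it goes (the `3m` value is purely illustrative; the commenter didn't say which value they used):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: argocd-repo-server
spec:
  template:
    spec:
      containers:
        - name: argocd-repo-server
          env:
            - name: ARGOCD_EXEC_TIMEOUT
              value: "3m"  # illustrative value only
```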
did it help? what value did you set?
Please be aware of https://github.com/argoproj/argo-cd/issues/18122#issuecomment-2200231002, as the log message doesn't necessarily mean a real error; right now it seems to be reported "by design" once when you do a hard refresh. That doesn't mean there isn't another bug, but if your applications are actually in-sync and you just see the error message in the logs, then this is probably just "bad noise".