Watcher failed for apps/v1/deployments
I recently updated to v0.25.18 and started to receive the following error when trying to switch to the deployments view:
ERR component init failed for "Deployment" error="`list access denied for user on \"\":apps/v1/deployments"
ERR Watcher failed for apps/v1/deployments -- [list watch] access denied on resource "":"apps/v1/deployments"
I also got a similar message when trying to access pods and other resources. I verified my general k8s access with kubectl, and it worked fine.
I have now downgraded to v0.25.8 and no longer experience the issue.
macOS 12.1 (21C52), go version go1.17.5 darwin/amd64
This seems to be Kubernetes version related: k9s v0.25.18 still works fine on a v1.19.14 cluster but fails with the watcher error on a v1.20.11 cluster.
I went backwards version by version: v0.25.13 is the last version that works on the v1.20 cluster.
Hi,
Same issue on a v1.21.1 K8s cluster. I no longer think this is cluster related: deleting .config/k9s/config.yml fixes it.
This was tested with a gcloud access config on GKE.
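The deletion workaround can be sketched as follows. The path is the one mentioned in this thread; older k9s releases may keep the file at ~/.k9s/config.yml instead (assumption), and k9s regenerates the file on the next launch.

```shell
# Remove the generated k9s config so it is recreated on the next launch.
# Path taken from this thread; adjust if your k9s version stores it elsewhere.
K9S_CONFIG="$HOME/.config/k9s/config.yml"
if [ -f "$K9S_CONFIG" ]; then
  rm "$K9S_CONFIG"
fi
```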
Hi,
I can confirm @adrobisch's observation. Changing the KUBECONFIG environment variable triggers the problem; deleting the .config/k9s/config.yml file then temporarily fixes it until the next KUBECONFIG change.
Same issue with k8s v1.20.11.
In my case, k9s generates .config/k9s/config.yml and sets default as the active namespace, which my config does not have permission to access. Just fixing the active namespace in .config/k9s/config.yml makes everything work fine.
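For reference, a minimal sketch of the relevant section of .config/k9s/config.yml under the v0.25-era layout; the cluster name and namespace below are placeholders, and the exact keys may differ between k9s versions:

```yaml
k9s:
  clusters:
    my-cluster:              # placeholder cluster name
      namespace:
        active: team-ns      # replace "default" with a namespace you can list
        favorites:
        - team-ns
```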
I found that you can launch k9s with a specific context that you have created with kubectl; it should not only work, but also update the active namespace in your .config/k9s/config.yml:
k9s --context $context
Deleting the config didn't help in my case. However, setting the namespace explicitly in my context (as suggested in a separate issue) did:
kubectl config set-context --current --namespace=default
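What that command effectively does is write a namespace field into the current context entry of your kubeconfig. A self-contained sketch of the result (context, cluster, and user names are placeholders):

```shell
# Sketch of the kubeconfig fragment that
# `kubectl config set-context --current --namespace=default` produces.
cat > /tmp/kubeconfig-sketch.yaml <<'EOF'
contexts:
- name: my-context          # placeholder context name
  context:
    cluster: my-cluster
    user: my-user
    namespace: default      # the field the command sets
EOF
grep 'namespace:' /tmp/kubeconfig-sketch.yaml
```

With the namespace pinned in the context, k9s picks up a namespace your credentials can actually list instead of falling back to an inaccessible one.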
Hi @bpoetzschke! Can I close this issue? You can create a new one with more details if you are still facing this (or another) problem!
@slimus it seems the issue no longer persists for me with an upgraded k8s version and the latest version of k9s, so I will close it.