application-gateway-kubernetes-ingress
application-gateway-kubernetes-ingress closing WebSocket connection from cluster
**Describe the bug**
I have Jenkins running in an AKS cluster with [application-gateway-kubernetes-ingress] as the ingress controller. Jenkins holds a WebSocket connection to a Windows VM outside the cluster. Every time the ingress controller updates the Application Gateway configuration, this connection is reset. Is this working as intended, or is it a bug?
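For context, below is a minimal sketch of the kind of AGIC-managed Ingress that would sit in front of the Jenkins WebSocket endpoint. The names, host, port, and annotation values are hypothetical, not the actual configuration from this cluster; the connection-draining and timeout annotations are documented AGIC annotations and are listed only for reference.

```yaml
# Hypothetical sketch: Ingress fronting a Jenkins service that carries
# WebSocket traffic through Application Gateway via AGIC.
# Names and values are examples, not the reporter's actual configuration.
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: jenkins
  namespace: jenkins
  annotations:
    kubernetes.io/ingress.class: azure/application-gateway
    # Documented AGIC annotations (example values):
    appgw.ingress.kubernetes.io/connection-draining: "true"
    appgw.ingress.kubernetes.io/connection-draining-timeout: "30"
    appgw.ingress.kubernetes.io/request-timeout: "300"
spec:
  rules:
    - host: jenkins.example.com
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: jenkins
                port:
                  number: 8080
```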
**To Reproduce**
Steps to reproduce the behavior:
1. Open a WebSocket connection through an app deployed in the cluster to an externally connected client.
2. Force an update of the Application Gateway through the controller (creating a new Ingress, scaling a pod whose Ingress is controlled by it, etc.); see the example commands below.
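As a concrete illustration of step 2, either of the following should be enough to make AGIC recompute and push a new Application Gateway configuration. The deployment name, namespace, and file name are placeholders, not taken from the affected cluster.

```shell
# Scale a Deployment whose Service sits behind an AGIC-managed Ingress
# (hypothetical names), which forces a backend pool update.
kubectl scale deployment jenkins -n jenkins --replicas=2

# Or create/update any Ingress watched by the controller (hypothetical file).
kubectl apply -f new-ingress.yaml
```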
**Ingress Controller details**

- Output of `kubectl describe pod <ingress controller>`. The pod name can be obtained by running `helm list`.
```
Name:                 application-gateway-kubernetes-ingress-azure-7587cb9c76-p5zxm
Namespace:            waf
Priority:             0
Service Account:      application-gateway-kubernetes-sa-ingress-azure
Node:                 aks-npappspot-22452776-vmss000015/10.80.244.13
Start Time:           Fri, 17 May 2024 11:00:17 +0200
Labels:               app=ingress-azure
                      azure.workload.identity/use=true
                      pod-template-hash=7587cb9c76
                      release=application-gateway-kubernetes
Annotations:          checksum/config: a55e73766739d4f0b63fa6ccac7e8225874e27027784618c1b71e1bde902061c
                      prometheus.io/port: 8123
                      prometheus.io/scrape: true
Status:               Running
IP:                   10.244.123.184
IPs:
  IP:  10.244.123.184
Controlled By:        ReplicaSet/application-gateway-kubernetes-ingress-azure-7587cb9c76
Containers:
  ingress-azure:
    Container ID:   containerd://09fe5735c8f0d1fbedec7bd4e8875bcc86fb860be39f58d8e4768499e37d3d98
    Image:          mcr.microsoft.com/azure-application-gateway/kubernetes-ingress:1.7.1
    Image ID:       mcr.microsoft.com/azure-application-gateway/kubernetes-ingress@sha256:91a6648b78c65f3b6858441589daabd72146d9a53e896c0e6abf501e870f9d9b
    Port:
    Host Port:
    State:          Running
      Started:      Fri, 17 May 2024 11:00:46 +0200
    Ready:          True
    Restart Count:  0
    Limits:
      cpu:     300m
      memory:  400Mi
    Requests:
      cpu:     5m
      memory:  250Mi
    Liveness:   http-get http://:8123/health/alive delay=15s timeout=1s period=20s #success=1 #failure=3
    Readiness:  http-get http://:8123/health/ready delay=5s timeout=1s period=10s #success=1 #failure=3
    Environment Variables from:
      application-gateway-kubernetes-cm-ingress-azure  ConfigMap  Optional: false
    Environment:
      AZURE_CLOUD_PROVIDER_LOCATION:  /etc/appgw/azure.json
      AGIC_POD_NAME:                  application-gateway-kubernetes-ingress-azure-7587cb9c76-p5zxm (v1:metadata.name)
      AGIC_POD_NAMESPACE:             waf (v1:metadata.namespace)
      AZURE_CLIENT_ID:                88fce944-e9be-44ca-89f1-2a0ed1dacb10
      AZURE_TENANT_ID:                5cc6c66d-ffb2-469f-9385-cda840e57836
      AZURE_FEDERATED_TOKEN_FILE:     /var/run/secrets/azure/tokens/azure-identity-token
      AZURE_AUTHORITY_HOST:           https://login.microsoftonline.com/
    Mounts:
      /etc/appgw/ from azure (ro)
      /var/run/secrets/azure/tokens from azure-identity-token (ro)
      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-9wbc4 (ro)
Conditions:
  Type              Status
  Initialized       True
  Ready             True
  ContainersReady   True
  PodScheduled      True
Volumes:
  azure:
    Type:          HostPath (bare host directory volume)
    Path:          /etc/kubernetes/
    HostPathType:  Directory
  kube-api-access-9wbc4:
    Type:                    Projected (a volume that contains injected data from multiple sources)
    TokenExpirationSeconds:  3607
    ConfigMapName:           kube-root-ca.crt
    ConfigMapOptional:
    DownwardAPI:             true
  azure-identity-token:
    Type:                    Projected (a volume that contains injected data from multiple sources)
    TokenExpirationSeconds:  3600
QoS Class:                   Burstable
Node-Selectors:              agentpool=npappspot
Tolerations:                 app=app:NoSchedule
                             kubernetes.azure.com/scalesetpriority=spot:NoSchedule
                             node.kubernetes.io/memory-pressure:NoSchedule op=Exists
                             node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
```
- Output of `kubectl logs <ingress controller>`.
```
I0520 01:54:30.183193       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:54:51.448200       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:54:54.260111       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:54:54.672570       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:54:55.659396       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:54:56.173071       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:55:00.184058       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:55:14.045324       1 reflector.go:255] Listing and watching *v1beta1.AzureApplicationGatewayRewrite from pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167
E0520 01:55:14.050055       1 reflector.go:138] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: Failed to watch *v1beta1.AzureApplicationGatewayRewrite: failed to list *v1beta1.AzureApplicationGatewayRewrite: the server could not find the requested resource (get azureapplicationgatewayrewrites.appgw.ingress.azure.io)
I0520 01:55:21.450033       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:55:24.260218       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:55:24.673427       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:55:25.660096       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:55:26.173708       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:55:30.185434       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:55:44.417189       1 reflector.go:255] Listing and watching *v1beta1.AzureApplicationGatewayRewrite from pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167
E0520 01:55:44.422715       1 reflector.go:138] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: Failed to watch *v1beta1.AzureApplicationGatewayRewrite: failed to list *v1beta1.AzureApplicationGatewayRewrite: the server could not find the requested resource (get azureapplicationgatewayrewrites.appgw.ingress.azure.io)
I0520 01:55:51.452209       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:55:54.260897       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:55:54.674444       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:55:55.661491       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:55:56.175157       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:56:00.186612       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:56:21.454010       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:56:24.261155       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:56:24.675272       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:56:25.662966       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
I0520 01:56:26.176297       1 reflector.go:381] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167: forcing resync
```
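Separately from the WebSocket resets, the repeated "Failed to watch *v1beta1.AzureApplicationGatewayRewrite" errors above suggest the rewrite CRD is not installed in this cluster. That may well be unrelated to the connection drops, but it can be checked with the command below (the resource name is taken from the error message itself).

```shell
# Check whether the CRD referenced in the error above exists in the cluster.
kubectl get crd azureapplicationgatewayrewrites.appgw.ingress.azure.io
```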
- Any Azure support tickets associated with this issue.