application-gateway-kubernetes-ingress

AGIC not finding prohibited targets

Open pbloigu opened this issue 3 years ago • 5 comments

Describe the bug: AGIC brownfield deployment, version 1.2.0, one Application Gateway shared by 4 clusters. On one occasion, AGIC in one of the clusters didn't find the deployed prohibited targets and therefore erased the App Gateway configuration, which of course left the other 3 clusters inaccessible.

To Reproduce: apparently a random occurrence; I have no reliable steps to reproduce it.

Additional details: Unfortunately the pod instance is gone already, so I don't have direct access to the logs anymore, but I can dig something up from Azure Log Analytics. As far as I can tell, it reads the app gw config alright, but then there is this:

  1 mutate_app_gateway.go:85] Brownfield Deployment is enabled, but AGIC did not find any AzureProhibitedTarget CRDs; Disabling brownfield deployment feature.

The only real error I see in the logs is this, not sure if relevant:

1 client.go:227] Error getting route table '/subscriptions/<redacted>/resourceGroups/<redacted>/providers/Microsoft.Network/routeTables/<redacted>'

I can try to dig up more of the logs from Log Analytics; just let me know what to look for. The 4-cluster setup has been running fine for a few months already, and some of the clusters get deployments almost daily, yet this is the first occurrence of this kind. The prohibited targets have not been changed at all during this time.

pbloigu avatar Feb 22 '21 10:02 pbloigu

Okay, now this is starting to unfold. The AGIC controller had crashed just prior to the above issue, apparently due to not being able to connect to the K8s API:

2021-02-20T16:03:10.51033398Z stderr F W0220 16:03:10.510177       1 reflector.go:402] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: watch of *v1.Pod ended with: very short watch: pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Unexpected watch close - watch lasted less than a second and no items received
2021-02-20T16:03:10.510846577Z stderr F W0220 16:03:10.510741       1 reflector.go:402] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: watch of *v1beta1.Ingress ended with: very short watch: pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Unexpected watch close - watch lasted less than a second and no items received
2021-02-20T16:03:10.511435674Z stderr F W0220 16:03:10.511352       1 reflector.go:402] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: watch of *v1.AzureIngressProhibitedTarget ended with: very short watch: pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Unexpected watch close - watch lasted less than a second and no items received
2021-02-20T16:03:10.511474573Z stderr F W0220 16:03:10.511412       1 reflector.go:402] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: watch of *v1.Endpoints ended with: very short watch: pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Unexpected watch close - watch lasted less than a second and no items received
2021-02-20T16:03:10.523912299Z stderr F W0220 16:03:10.523609       1 reflector.go:402] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: watch of *v1.Secret ended with: very short watch: pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Unexpected watch close - watch lasted less than a second and no items received
2021-02-20T16:03:10.526017586Z stderr F W0220 16:03:10.525811       1 reflector.go:402] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: watch of *v1.Service ended with: very short watch: pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Unexpected watch close - watch lasted less than a second and no items received
2021-02-20T16:03:11.527451486Z stderr F E0220 16:03:11.527093       1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1.AzureIngressProhibitedTarget: Get "https://10.0.0.1:443/apis/appgw.ingress.k8s.io/v1/azureingressprohibitedtargets?resourceVersion=50084732": dial tcp 10.0.0.1:443: connect: connection refused
2021-02-20T16:03:11.527489086Z stderr F E0220 16:03:11.527207       1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1.Endpoints: Get "https://10.0.0.1:443/api/v1/endpoints?resourceVersion=50345711": dial tcp 10.0.0.1:443: connect: connection refused
2021-02-20T16:03:11.527892384Z stderr F E0220 16:03:11.527525       1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1.Pod: Get "https://10.0.0.1:443/api/v1/pods?resourceVersion=50345714": dial tcp 10.0.0.1:443: connect: connection refused
2021-02-20T16:03:11.528172782Z stderr F E0220 16:03:11.528042       1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1beta1.Ingress: Get "https://10.0.0.1:443/apis/extensions/v1beta1/ingresses?resourceVersion=49836304": dial tcp 10.0.0.1:443: connect: connection refused
2021-02-20T16:03:11.52850768Z stderr F E0220 16:03:11.528347       1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1.AzureIngressProhibitedTarget: Get "https://10.0.0.1:443/apis/appgw.ingress.k8s.io/v1/azureingressprohibitedtargets?resourceVersion=50084732": dial tcp 10.0.0.1:443: connect: connection refused
2021-02-20T16:03:11.564027467Z stderr F E0220 16:03:11.563805       1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1.Service: Get "https://10.0.0.1:443/api/v1/services?resourceVersion=50084732": dial tcp 10.0.0.1:443: connect: connection refused
2021-02-20T16:03:11.564069467Z stderr F E0220 16:03:11.564002       1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1.Secret: Get "https://10.0.0.1:443/api/v1/secrets?resourceVersion=50084732": dial tcp 10.0.0.1:443: connect: connection refused
2021-02-20T16:03:17.416138705Z stderr F E0220 16:03:17.415985       1 runtime.go:78] Observed a panic: &reflect.ValueError{Method:"reflect.Value.Elem", Kind:0x19} (reflect: call of reflect.Value.Elem on struct Value)
2021-02-20T16:03:17.416223405Z stderr F goroutine 112 [running]:
2021-02-20T16:03:17.416231504Z stderr F k8s.io/apimachinery/pkg/util/runtime.logPanic(0x151a580, 0xc000ab6140)
2021-02-20T16:03:17.416236404Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/pkg/util/runtime/runtime.go:74 +0xa3
2021-02-20T16:03:17.416241704Z stderr F k8s.io/apimachinery/pkg/util/runtime.HandleCrash(0x0, 0x0, 0x0)
2021-02-20T16:03:17.416245904Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/pkg/util/runtime/runtime.go:48 +0x82
2021-02-20T16:03:17.416250504Z stderr F panic(0x151a580, 0xc000ab6140)
2021-02-20T16:03:17.416254604Z stderr F 	/opt/hostedtoolcache/go/1.14.4/x64/src/runtime/panic.go:969 +0x166
2021-02-20T16:03:17.416275704Z stderr F reflect.Value.Elem(0x15b7ba0, 0xc00046f260, 0x99, 0xc00066c180, 0x4, 0xc0006a9de0)
2021-02-20T16:03:17.416280704Z stderr F 	/opt/hostedtoolcache/go/1.14.4/x64/src/reflect/value.go:820 +0x1a4
2021-02-20T16:03:17.416285304Z stderr F github.com/Azure/application-gateway-kubernetes-ingress/pkg/k8scontext.getNamespace(0x15b7ba0, 0xc00046f260, 0x3, 0x0)
2021-02-20T16:03:17.416289904Z stderr F 	/home/vsts/work/1/s/gopath/src/github.com/Azure/application-gateway-kubernetes-ingress/pkg/k8scontext/handlers.go:66 +0xa2
2021-02-20T16:03:17.416294504Z stderr F github.com/Azure/application-gateway-kubernetes-ingress/pkg/k8scontext.handlers.deleteFunc(0xc000396870, 0x15b7ba0, 0xc00046f260)
2021-02-20T16:03:17.416298804Z stderr F 	/home/vsts/work/1/s/gopath/src/github.com/Azure/application-gateway-kubernetes-ingress/pkg/k8scontext/handlers.go:50 +0x39
2021-02-20T16:03:17.416303604Z stderr F k8s.io/client-go/tools/cache.ResourceEventHandlerFuncs.OnDelete(...)
2021-02-20T16:03:17.416307504Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/tools/cache/controller.go:235
2021-02-20T16:03:17.416311504Z stderr F k8s.io/client-go/tools/cache.(*processorListener).run.func1()
2021-02-20T16:03:17.416315704Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/tools/cache/shared_informer.go:748 +0x187
2021-02-20T16:03:17.416320104Z stderr F k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc000650760)
2021-02-20T16:03:17.416323904Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:155 +0x5f
2021-02-20T16:03:17.416328104Z stderr F k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0006a9f60, 0x1943840, 0xc0002a1a70, 0xc000010301, 0xc000649b00)
2021-02-20T16:03:17.416332104Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:156 +0xa3
2021-02-20T16:03:17.416353404Z stderr F k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc000650760, 0x3b9aca00, 0x0, 0x1, 0xc000649b00)
2021-02-20T16:03:17.416358504Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:133 +0xe2
2021-02-20T16:03:17.416362504Z stderr F k8s.io/apimachinery/pkg/util/wait.Until(...)
2021-02-20T16:03:17.416366304Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:90
2021-02-20T16:03:17.416370104Z stderr F k8s.io/client-go/tools/cache.(*processorListener).run(0xc000381900)
2021-02-20T16:03:17.416374004Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/tools/cache/shared_informer.go:740 +0x95
2021-02-20T16:03:17.416377504Z stderr F k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1(0xc00030ddb0, 0xc00038ccc0)
2021-02-20T16:03:17.416410803Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:73 +0x51
2021-02-20T16:03:17.416414503Z stderr F created by k8s.io/apimachinery/pkg/util/wait.(*Group).Start
2021-02-20T16:03:17.416418003Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:71 +0x62
2021-02-20T16:03:17.418525591Z stderr F panic: reflect: call of reflect.Value.Elem on struct Value [recovered]
2021-02-20T16:03:17.418549491Z stderr F 	panic: reflect: call of reflect.Value.Elem on struct Value
2021-02-20T16:03:17.418554391Z stderr F 
2021-02-20T16:03:17.41857889Z stderr F goroutine 112 [running]:
2021-02-20T16:03:17.41858629Z stderr F k8s.io/apimachinery/pkg/util/runtime.HandleCrash(0x0, 0x0, 0x0)
2021-02-20T16:03:17.41859009Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/pkg/util/runtime/runtime.go:55 +0x105
2021-02-20T16:03:17.41859439Z stderr F panic(0x151a580, 0xc000ab6140)
2021-02-20T16:03:17.41859829Z stderr F 	/opt/hostedtoolcache/go/1.14.4/x64/src/runtime/panic.go:969 +0x166
2021-02-20T16:03:17.41860179Z stderr F reflect.Value.Elem(0x15b7ba0, 0xc00046f260, 0x99, 0xc00066c180, 0x4, 0xc0006a9de0)
2021-02-20T16:03:17.41860529Z stderr F 	/opt/hostedtoolcache/go/1.14.4/x64/src/reflect/value.go:820 +0x1a4
2021-02-20T16:03:17.41860939Z stderr F github.com/Azure/application-gateway-kubernetes-ingress/pkg/k8scontext.getNamespace(0x15b7ba0, 0xc00046f260, 0x3, 0x0)
2021-02-20T16:03:17.41861389Z stderr F 	/home/vsts/work/1/s/gopath/src/github.com/Azure/application-gateway-kubernetes-ingress/pkg/k8scontext/handlers.go:66 +0xa2
2021-02-20T16:03:17.41863609Z stderr F github.com/Azure/application-gateway-kubernetes-ingress/pkg/k8scontext.handlers.deleteFunc(0xc000396870, 0x15b7ba0, 0xc00046f260)
2021-02-20T16:03:17.41863999Z stderr F 	/home/vsts/work/1/s/gopath/src/github.com/Azure/application-gateway-kubernetes-ingress/pkg/k8scontext/handlers.go:50 +0x39
2021-02-20T16:03:17.41867259Z stderr F k8s.io/client-go/tools/cache.ResourceEventHandlerFuncs.OnDelete(...)
2021-02-20T16:03:17.41869449Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/tools/cache/controller.go:235
2021-02-20T16:03:17.41869849Z stderr F k8s.io/client-go/tools/cache.(*processorListener).run.func1()
2021-02-20T16:03:17.41870209Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/tools/cache/shared_informer.go:748 +0x187
2021-02-20T16:03:17.41872279Z stderr F k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc000650760)
2021-02-20T16:03:17.418730289Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:155 +0x5f
2021-02-20T16:03:17.418758289Z stderr F k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0006a9f60, 0x1943840, 0xc0002a1a70, 0xc000010301, 0xc000649b00)
2021-02-20T16:03:17.418761789Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:156 +0xa3
2021-02-20T16:03:17.418765289Z stderr F k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc000650760, 0x3b9aca00, 0x0, 0x1, 0xc000649b00)
2021-02-20T16:03:17.418769089Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:133 +0xe2
2021-02-20T16:03:17.418772589Z stderr F k8s.io/apimachinery/pkg/util/wait.Until(...)
2021-02-20T16:03:17.418776189Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:90
2021-02-20T16:03:17.418779889Z stderr F k8s.io/client-go/tools/cache.(*processorListener).run(0xc000381900)
2021-02-20T16:03:17.418783189Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/tools/cache/shared_informer.go:740 +0x95
2021-02-20T16:03:17.418786689Z stderr F k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1(0xc00030ddb0, 0xc00038ccc0)
2021-02-20T16:03:17.418797989Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:73 +0x51
2021-02-20T16:03:17.418801889Z stderr F created by k8s.io/apimachinery/pkg/util/wait.(*Group).Start
2021-02-20T16:03:17.418805489Z stderr F 	/home/vsts/work/1/s/gopath/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:71 +0x62

It then restarted and wasn't able to see the prohibited targets anymore. I have observed the same K8s API connectivity issue once before in another cluster, where Linkerd wasn't able to connect to the API, wreaking havoc in the cluster. That aside, I still don't understand why AGIC didn't find the prohibited targets after the restart. I would also appreciate it if AGIC didn't just flat-out die when the K8s API is unavailable, as that does seem to happen from time to time.
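The panic frames point at `getNamespace` in `pkg/k8scontext/handlers.go:66` calling `reflect.Value.Elem` on a non-pointer. One common way an informer delete handler receives a plain struct instead of a pointer is client-go's `cache.DeletedFinalStateUnknown` tombstone, which can be delivered after exactly the kind of watch interruption shown above. The following is a minimal, self-contained sketch of that failure mode and a defensive unwrap; the tombstone type here is a stand-in for the real client-go one, and none of this is AGIC's actual code:

```go
package main

import (
	"fmt"
	"reflect"
)

// Stand-in for client-go's cache.DeletedFinalStateUnknown, the tombstone
// informers hand to OnDelete when the object's final state was missed
// (e.g. after an API-server disconnect like the one in the logs above).
type DeletedFinalStateUnknown struct {
	Key string
	Obj interface{}
}

type Pod struct{ Namespace string }

// getNamespace mirrors the failing pattern: it assumes obj is always a
// pointer to a resource. reflect.Value.Elem panics on a struct value,
// producing the "call of reflect.Value.Elem on struct Value" panic
// seen in the crash log.
func getNamespace(obj interface{}) string {
	return reflect.ValueOf(obj).Elem().FieldByName("Namespace").String()
}

// getNamespaceSafe unwraps a tombstone first and checks the kind before
// dereferencing, so a missed-delete event cannot crash the pod.
func getNamespaceSafe(obj interface{}) string {
	if tombstone, ok := obj.(DeletedFinalStateUnknown); ok {
		obj = tombstone.Obj
	}
	v := reflect.ValueOf(obj)
	if v.Kind() == reflect.Ptr {
		v = v.Elem()
	}
	if v.Kind() == reflect.Struct {
		if f := v.FieldByName("Namespace"); f.IsValid() {
			return f.String()
		}
	}
	return ""
}

func main() {
	tombstone := DeletedFinalStateUnknown{Key: "default/web", Obj: &Pod{Namespace: "default"}}

	// The unsafe version panics on the tombstone, like AGIC did.
	func() {
		defer func() { fmt.Println("recovered:", recover()) }()
		getNamespace(tombstone)
	}()

	// The safe version unwraps it instead.
	fmt.Println(getNamespaceSafe(tombstone)) // default
}
```

If this is indeed the cause, it would also explain the timing: the tombstone only shows up after a watch is dropped and re-established, which is rare in a healthy cluster.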

pbloigu avatar Feb 22 '21 12:02 pbloigu

Having a similar issue. After the AKS API became unavailable (10.2.1.1:443 connection refused), AGIC seems corrupted and cannot recover. The only solution is to delete the AGIC pod; it then restarts and applies the latest configuration. Is there any idea or related ticket to follow? Running version 1.3.0.

pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: watch of *v1.AzureIngressProhibitedTarget ended with: very short watch: pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Unexpected watch close - watch lasted less than a second and no items received
W0406 09:35:08.519753       1 reflector.go:402] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: watch of *v1.Endpoints ended with: very short watch: pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Unexpected watch close - watch lasted less than a second and no items received
0.0-20200326020446-6240434e1ad6/tools/cache/reflector.go:125: Unexpected watch close - watch lasted less than a second and no items received
E0406 09:35:08.522969       1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1.AzureIngressProhibitedTarget: Get "https://10.2.1.1:443/apis/appgw.ingress.k8s.io/v1/azureingressprohibitedtargets?resourceVersion=4730": dial tcp 10.2.1.1:443: connect: connection refused

rbickel avatar Apr 06 '21 12:04 rbickel

Noticed the same issue here. The ProhibitedTarget CRD is installed but is not always detected.

aelmanaa avatar Jun 24 '21 09:06 aelmanaa

I'm having a similar issue here - AGIC seems to run OK, then after some time I notice no backends. The logs show:

I0716 15:33:44.816220 1 mutate_app_gateway.go:164] cache: Config has NOT changed! No need to connect to ARM.
I0716 15:33:44.816243 1 controller.go:151] Completed last event loop run in: 92.777106ms
I0716 15:34:07.411047 1 backendhttpsettings.go:89] Created backend http settings bp-istio-system-istio-ingressgateway-80-8080-istio-agic-ingress for ingress istio-system/istio-agic-ingress and service istio-system/istio-ingressgateway
I0716 15:34:07.417542 1 mutate_app_gateway.go:164] cache: Config has NOT changed! No need to connect to ARM.
I0716 15:34:07.417571 1 controller.go:151] Completed last event loop run in: 57.7221ms
W0716 16:07:00.526357 1 reflector.go:402] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: watch of *v1.Secret ended with: very short watch: pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Unexpected watch close - watch lasted less than a second and no items received
W0716 16:07:00.526904 1 reflector.go:402] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: watch of *v1.Endpoints ended with: very short watch: pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Unexpected watch close - watch lasted less than a second and no items received
W0716 16:07:00.527013 1 reflector.go:402] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: watch of *v1beta1.Ingress ended with: very short watch: pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Unexpected watch close - watch lasted less than a second and no items received
W0716 16:07:00.527442 1 reflector.go:402] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: watch of *v1.Pod ended with: very short watch: pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Unexpected watch close - watch lasted less than a second and no items received
W0716 16:07:00.527995 1 reflector.go:402] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: watch of *v1.Service ended with: very short watch: pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Unexpected watch close - watch lasted less than a second and no items received
E0716 16:07:01.532652 1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1beta1.Ingress: Get "https://10.0.0.1:443/apis/extensions/v1beta1/ingresses?resourceVersion=7428392": dial tcp 10.0.0.1:443: connect: connection refused
E0716 16:07:01.534851 1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1.Service: Get "https://10.0.0.1:443/api/v1/services?resourceVersion=7540291": dial tcp 10.0.0.1:443: connect: connection refused
E0716 16:07:01.586011 1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1.Endpoints: Get "https://10.0.0.1:443/api/v1/endpoints?resourceVersion=7546350": dial tcp 10.0.0.1:443: connect: connection refused
E0716 16:07:01.587289 1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1.Endpoints: Get "https://10.0.0.1:443/api/v1/endpoints?resourceVersion=7546350": dial tcp 10.0.0.1:443: connect: connection refused
E0716 16:07:01.592249 1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1.Secret: Get "https://10.0.0.1:443/api/v1/secrets?resourceVersion=7540283": dial tcp 10.0.0.1:443: connect: connection refused
E0716 16:07:01.592294 1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1.Pod: Get "https://10.0.0.1:443/api/v1/pods?resourceVersion=7546258": dial tcp 10.0.0.1:443: connect: connection refused
E0716 16:07:01.593512 1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1.Secret: Get "https://10.0.0.1:443/api/v1/secrets?resourceVersion=7540283": dial tcp 10.0.0.1:443: connect: connection refused
E0716 16:07:03.799569 1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1.Service: Get "https://10.0.0.1:443/api/v1/services?resourceVersion=7540291": dial tcp 10.0.0.1:443: connect: connection refused
E0716 16:07:04.084258 1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1beta1.Ingress: Get "https://10.0.0.1:443/apis/extensions/v1beta1/ingresses?resourceVersion=7428392": dial tcp 10.0.0.1:443: connect: connection refused
E0716 16:07:04.148251 1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1.Pod: Get "https://10.0.0.1:443/api/v1/pods?resourceVersion=7546258": dial tcp 10.0.0.1:443: connect: connection refused
E0716 16:07:05.924588 1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1.Secret: Get "https://10.0.0.1:443/api/v1/secrets?resourceVersion=7540283": dial tcp 10.0.0.1:443: connect: connection refused
E0716 16:07:06.836985 1 reflector.go:178] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:125: Failed to list *v1.Endpoints: Get "https://10.0.0.1:443/api/v1/endpoints?resourceVersion=7546350": dial tcp 10.0.0.1:443: connect: connection refused

I am using Istio Ingress Gateway for the backend, with AGIC v1.4.0. After I restart AGIC it is OK again: it immediately recreates the rules and keeps working, for some amount of time.

It seems that if AGIC can't connect once, it just gives up: it doesn't attempt to retry, and it removes all the rules. That isn't helpful for an environment we are trying to move towards production.

adamcarter81 avatar Jul 19 '21 09:07 adamcarter81

I have just been hit with this again, and it seems to echo @pbloigu. Symptom: manually created rules on the App Gateway had disappeared (they have matching ProhibitedTarget CRDs). Investigation showed AGIC had crashed and restarted; when it restarted, it did not find the ProhibitedTarget CRDs:

Crash log:

E0809 03:47:30.375902 1 runtime.go:78] Observed a panic: &reflect.ValueError{Method:"reflect.Value.Elem", Kind:0x19} (reflect: call of reflect.Value.Elem on struct Value)
goroutine 140 [running]:
k8s.io/apimachinery/pkg/util/runtime.logPanic({0x16b3140, 0xc000b1a8a0})
	/go/pkg/mod/k8s.io/[email protected]/pkg/util/runtime/runtime.go:74 +0x85
k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0x18a1660})
	/go/pkg/mod/k8s.io/[email protected]/pkg/util/runtime/runtime.go:48 +0x75
panic({0x16b3140, 0xc000b1a8a0})
	/usr/local/go/src/runtime/panic.go:1038 +0x215
reflect.Value.Elem({0x1773e40, 0xc0012e2a00, 0x4388d6})
	/usr/local/go/src/reflect/value.go:1178 +0x15a
github.com/Azure/application-gateway-kubernetes-ingress/pkg/k8scontext.getNamespace({0x1773e40, 0xc0012e2a00})
	/azure/pkg/k8scontext/handlers.go:66 +0xa5
github.com/Azure/application-gateway-kubernetes-ingress/pkg/k8scontext.handlers.deleteFunc({0xc001133ae8}, {0x1773e40, 0xc0012e2a00})
	/azure/pkg/k8scontext/handlers.go:50 +0x32
k8s.io/client-go/tools/cache.ResourceEventHandlerFuncs.OnDelete(...)
	/go/pkg/mod/k8s.io/[email protected]/tools/cache/controller.go:245
k8s.io/client-go/tools/cache.(*processorListener).run.func1()
	/go/pkg/mod/k8s.io/[email protected]/tools/cache/shared_informer.go:779 +0xdf
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x7fce287ab978)
	/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:155 +0x67
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000ae0738, {0x1afc440, 0xc0006200f0}, 0x1, 0xc000aca240)
	/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:156 +0xb6
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x0, 0x3b9aca00, 0x0, 0x0, 0xc000ae0788)
	/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:133 +0x89
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:90
k8s.io/client-go/tools/cache.(*processorListener).run(0xc0001de700)
	/go/pkg/mod/k8s.io/[email protected]/tools/cache/shared_informer.go:771 +0x6b
k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
	/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:73 +0x5a
created by k8s.io/apimachinery/pkg/util/wait.(*Group).Start
	/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:71 +0x88
panic: reflect: call of reflect.Value.Elem on struct Value [recovered]
	panic: reflect: call of reflect.Value.Elem on struct Value

goroutine 140 [running]:
k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0x18a1660})
	/go/pkg/mod/k8s.io/[email protected]/pkg/util/runtime/runtime.go:55 +0xd8
panic({0x16b3140, 0xc000b1a8a0})
	/usr/local/go/src/runtime/panic.go:1038 +0x215
reflect.Value.Elem({0x1773e40, 0xc0012e2a00, 0x4388d6})
	/usr/local/go/src/reflect/value.go:1178 +0x15a
github.com/Azure/application-gateway-kubernetes-ingress/pkg/k8scontext.getNamespace({0x1773e40, 0xc0012e2a00})
	/azure/pkg/k8scontext/handlers.go:66 +0xa5
github.com/Azure/application-gateway-kubernetes-ingress/pkg/k8scontext.handlers.deleteFunc({0xc001133ae8}, {0x1773e40, 0xc0012e2a00})
	/azure/pkg/k8scontext/handlers.go:50 +0x32
k8s.io/client-go/tools/cache.ResourceEventHandlerFuncs.OnDelete(...)
	/go/pkg/mod/k8s.io/[email protected]/tools/cache/controller.go:245
k8s.io/client-go/tools/cache.(*processorListener).run.func1()
	/go/pkg/mod/k8s.io/[email protected]/tools/cache/shared_informer.go:779 +0xdf
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x7fce287ab978)
	/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:155 +0x67
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000ae0738, {0x1afc440, 0xc0006200f0}, 0x1, 0xc000aca240)
	/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:156 +0xb6
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x0, 0x3b9aca00, 0x0, 0x0, 0xc000ae0788)
	/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:133 +0x89
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:90
k8s.io/client-go/tools/cache.(*processorListener).run(0xc0001de700)
	/go/pkg/mod/k8s.io/[email protected]/tools/cache/shared_informer.go:771 +0x6b
k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
	/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:73 +0x5a
created by k8s.io/apimachinery/pkg/util/wait.(*Group).Start
	/go/pkg/mod/k8s.io/[email protected]/pkg/util/wait/wait.go:71 +0x88

Restart Log:

I0809 03:47:56.497430 1 utils.go:114] Using verbosity level 4 from environment variable APPGW_VERBOSITY_LEVEL
I0809 03:47:56.538590 1 supported_apiversion.go:70] server version is: 1.22.6
I0809 03:47:56.561713 1 environment.go:294] KUBERNETES_WATCHNAMESPACE is not set. Watching all available namespaces.
I0809 03:47:56.561734 1 main.go:118] Using User Agent Suffix='' when communicating with ARM
I0809 03:47:56.561809 1 main.go:137] Application Gateway Details: Subscription="" Resource Group="rg-prod" Name="agw-prod"
I0809 03:47:56.561816 1 auth.go:53] Creating authorizer from Azure Managed Service Identity
I0809 03:47:56.561833 1 httpserver.go:57] Starting API Server on :8123
I0809 03:47:56.876924 1 main.go:184] Ingress Controller will observe all namespaces.
I0809 03:47:57.015720 1 context.go:167] k8s context run started
I0809 03:47:57.015748 1 context.go:230] Waiting for initial cache sync
I0809 03:47:57.015806 1 reflector.go:219] Starting reflector *v1.Secret (30s) from pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167
I0809 03:47:57.015818 1 reflector.go:255] Listing and watching *v1.Secret from pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167
I0809 03:47:57.015814 1 reflector.go:219] Starting reflector *v1.Service (30s) from pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167
I0809 03:47:57.015823 1 reflector.go:219] Starting reflector *v1.Endpoints (30s) from pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167
I0809 03:47:57.015841 1 reflector.go:255] Listing and watching *v1.Endpoints from pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167
I0809 03:47:57.015827 1 reflector.go:255] Listing and watching *v1.Service from pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167
I0809 03:47:57.015829 1 reflector.go:219] Starting reflector *v1.AzureIngressProhibitedTarget (30s) from pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167
I0809 03:47:57.015901 1 reflector.go:255] Listing and watching *v1.AzureIngressProhibitedTarget from pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167
I0809 03:47:57.015850 1 reflector.go:219] Starting reflector *v1.Pod (30s) from pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167
I0809 03:47:57.015967 1 reflector.go:255] Listing and watching *v1.Pod from pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167
I0809 03:47:57.015872 1 reflector.go:219] Starting reflector *v1.Ingress (30s) from pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167
I0809 03:47:57.016005 1 reflector.go:255] Listing and watching *v1.Ingress from pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167
I0809 03:47:57.015870 1 reflector.go:219] Starting reflector *v1.IngressClass (30s) from pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167
I0809 03:47:57.016045 1 reflector.go:255] Listing and watching *v1.IngressClass from pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:167
I0809 03:47:57.215818 1 shared_informer.go:270] caches populated
I0809 03:47:57.215843 1 context.go:243] Initial cache sync done
I0809 03:47:57.215849 1 context.go:244] k8s context run finished
I0809 03:47:57.215892 1 worker.go:39] Worker started
**W0809 03:47:57.618260 1 mutate_app_gateway.go:85] Brownfield Deployment is enabled, but AGIC did not find any AzureProhibitedTarget CRDs; Disabling brownfield deployment feature.**

It seems we could do with a softer way to recover after this crash.

adamcarter81 avatar Aug 11 '22 09:08 adamcarter81