cluster-api-provider-aws
Nil Pointer Panic in AWSManagedControlPlane Webhook
/kind bug
What steps did you take and what happened:
Got this stack trace in a recent test run; filing an issue to document it. I'll wrap the dereference in aws.StringValue to make it nil-safe.
2022/10/03 17:58:16 http: panic serving 127.0.0.1:52618: runtime error: invalid memory address or nil pointer dereference
goroutine 668 [running]:
net/http.(*conn).serve.func1()
/usr/local/go/src/net/http/server.go:1825 +0xbf
panic({0x17014a0, 0x2911fd0})
/usr/local/go/src/runtime/panic.go:844 +0x258
sigs.k8s.io/cluster-api-provider-aws/controlplane/eks/api/v1beta2.(*AWSManagedControlPlane).validateEKSAddons(0xc00152c900)
/home/prow/go/src/sigs.k8s.io/cluster-api-provider-aws/controlplane/eks/api/v1beta2/awsmanagedcontrolplane_webhook.go:240 +0x96
sigs.k8s.io/cluster-api-provider-aws/controlplane/eks/api/v1beta2.(*AWSManagedControlPlane).ValidateUpdate(0xc00152c900, {0x1c9a7a0?, 0xc00152cd80})
/home/prow/go/src/sigs.k8s.io/cluster-api-provider-aws/controlplane/eks/api/v1beta2/awsmanagedcontrolplane_webhook.go:124 +0x5f5
sigs.k8s.io/controller-runtime/pkg/webhook/admission.(*validatingHandler).Handle(_, {_, _}, {{{0xc0012ce720, 0x24}, {{0xc0009f4640, 0x1d}, {0xc001546d10, 0x7}, {0xc001212690, ...}}, ...}})
/home/prow/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/webhook/admission/validator.go:93 +0x7a9
sigs.k8s.io/controller-runtime/pkg/webhook/admission.(*Webhook).Handle(_, {_, _}, {{{0xc0012ce720, 0x24}, {{0xc0009f4640, 0x1d}, {0xc001546d10, 0x7}, {0xc001212690, ...}}, ...}})
/home/prow/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/webhook/admission/webhook.go:146 +0xa2
sigs.k8s.io/controller-runtime/pkg/webhook/admission.(*Webhook).ServeHTTP(0xc000e98880, {0x7f8cc0276a60?, 0xc0012c80f0}, 0xc001508800)
/home/prow/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/webhook/admission/http.go:98 +0xe90
github.com/prometheus/client_golang/prometheus/promhttp.InstrumentHandlerInFlight.func1({0x7f8cc0276a60, 0xc0012c80f0}, 0x1ca8500?)
/home/prow/go/pkg/mod/github.com/prometheus/[email protected]/prometheus/promhttp/instrument_server.go:56 +0xd4
net/http.HandlerFunc.ServeHTTP(0x1ca8548?, {0x7f8cc0276a60?, 0xc0012c80f0?}, 0x80d140?)
/usr/local/go/src/net/http/server.go:2084 +0x2f
github.com/prometheus/client_golang/prometheus/promhttp.InstrumentHandlerCounter.func1({0x1ca8548?, 0xc00018a540?}, 0xc001508800)
/home/prow/go/pkg/mod/github.com/prometheus/[email protected]/prometheus/promhttp/instrument_server.go:142 +0xb8
net/http.HandlerFunc.ServeHTTP(0x8049a8?, {0x1ca8548?, 0xc00018a540?}, 0xc001546cb1?)
/usr/local/go/src/net/http/server.go:2084 +0x2f
github.com/prometheus/client_golang/prometheus/promhttp.InstrumentHandlerDuration.func2({0x1ca8548, 0xc00018a540}, 0xc001508800)
/home/prow/go/pkg/mod/github.com/prometheus/[email protected]/prometheus/promhttp/instrument_server.go:104 +0xbf
net/http.HandlerFunc.ServeHTTP(0x2dca002db?, {0x1ca8548?, 0xc00018a540?}, 0xc0000c60c0?)
/usr/local/go/src/net/http/server.go:2084 +0x2f
net/http.(*ServeMux).ServeHTTP(0xc0012fc8ec?, {0x1ca8548, 0xc00018a540}, 0xc001508800)
/usr/local/go/src/net/http/server.go:2462 +0x149
net/http.serverHandler.ServeHTTP({0x1c9a070?}, {0x1ca8548, 0xc00018a540}, 0xc001508800)
/usr/local/go/src/net/http/server.go:2916 +0x43b
net/http.(*conn).serve(0xc0000c01e0, {0x1ca92e8, 0xc000a013b0})
/usr/local/go/src/net/http/server.go:1966 +0x5d7
created by net/http.(*Server).Serve
/usr/local/go/src/net/http/server.go:3071 +0x4db
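The panic comes from dereferencing an optional string pointer that can be nil during ValidateUpdate. A minimal sketch of the proposed fix, using a local stringValue helper that mirrors aws-sdk-go's aws.StringValue (the Addon type and ConflictResolution field here are stand-ins, not the exact fields from awsmanagedcontrolplane_webhook.go):

```go
package main

import "fmt"

// stringValue mirrors aws-sdk-go's aws.StringValue: it returns the
// dereferenced string, or "" when the pointer is nil, so callers never
// hit a nil pointer dereference.
func stringValue(s *string) string {
	if s == nil {
		return ""
	}
	return *s
}

// Addon is a hypothetical stand-in for the EKS addon type validated by
// the webhook; ConflictResolution is optional and may be nil.
type Addon struct {
	Name               string
	ConflictResolution *string
}

func main() {
	addon := Addon{Name: "vpc-cni"} // ConflictResolution left nil

	// Unsafe: *addon.ConflictResolution panics with the nil pointer
	// dereference seen in the stack trace above.

	// Safe: stringValue tolerates the nil pointer and yields "".
	fmt.Println(stringValue(addon.ConflictResolution))
}
```

Validation logic can then compare stringValue(...) results directly instead of dereferencing the pointers.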
What did you expect to happen:
The ValidateUpdate webhook should accept or reject the update without panicking.
Anything else you would like to add:
Environment:
- Cluster-api-provider-aws version:
- Kubernetes version (use kubectl version):
- OS (e.g. from /etc/os-release):
/triage accepted
Nice find. :)
/assign @luthermonson