[octavia-ingress-controller] Could not retrieve TLS certificate
Is this a BUG REPORT or FEATURE REQUEST?: /kind bug
What happened: The octavia-ingress-controller fails to create the listener when trying to enable TLS encryption.
What you expected to happen: A listener with TLS termination to be created.
How to reproduce it:
Follow the documentation on setting up the ingress controller. Here is my configuration for octavia-ingress-controller:
---
kind: ConfigMap
apiVersion: v1
metadata:
  name: octavia-ingress-controller-config
  namespace: kube-system
data:
  config: |
    cluster-name: yfr5nrxt47
    openstack:
      auth-url: <ID-endpoint>
      domain-name: domain
      application-credential-name: nnn
      application-credential-id: iii
      application-credential-secret: sss
      project-id: ID
      region: regionOne
    octavia:
      subnet-id: UUID
      floating-network-id: UUID
      manage-security-groups: false
Then follow the documentation section on enabling TLS encryption; the Ingress I used is sketched below.
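For reference, the Ingress looks roughly like this (hostname, backend service, and TLS secret name here are placeholders, not my exact values; the TLS secret itself was created beforehand with kubectl create secret tls):
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: test-octavia-ingress
  namespace: default
  annotations:
    kubernetes.io/ingress.class: "openstack"
spec:
  tls:
    - hosts:
        - foo.example.com
      secretName: tls-secret
  rules:
    - host: foo.example.com
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: webserver
                port:
                  number: 8080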
Anything else we need to know?: HTTP ingress in this setup worked as expected.
Logs of the octavia-ingress-controller:
INFO [2023-04-14T08:43:47Z] creating ingress ingress=default/test-octavia-ingress
INFO [2023-04-14T08:43:50Z] creating loadbalancer ingress=default/test-octavia-ingress lbID=30e6bbad-938e-4c77-9173-d62e530ee70a lbName=kube_ingress_yfr5nrxt47_default_test-octavia-ingress
INFO [2023-04-14T08:44:42Z] secret created in Barbican ingress=default/test-octavia-ingress lbID=30e6bbad-938e-4c77-9173-d62e530ee70a secretName=kube_ingress_yfr5nrxt47_default_test-octavia-ingress_tls-secret secretRef="<KM-endpoint>/v1/secrets/f7d1242f-cff0-4cf2-8978-9fa01ab64816"
INFO [2023-04-14T08:44:42Z] creating listener lbID=30e6bbad-938e-4c77-9173-d62e530ee70a listenerName=kube_ingress_yfr5nrxt47_default_test-octavia-ingress
E0414 08:44:45.239812 1 controller.go:521] failed to create openstack resources for ingress default/test-octavia-ingress: error creating listener: Bad request with: [POST <LB-endpoint>/v2.0/lbaas/listeners], error message: {"faultcode": "Client", "faultstring": "Could not retrieve certificate: ['<KM-endpoint>/v1/secrets/f7d1242f-cff0-4cf2-8978-9fa01ab64816', '<KM-endpoint>/v1/secrets/f7d1242f-cff0-4cf2-8978-9fa01ab64816']", "debuginfo": null}
I0414 08:44:45.239950 1 event.go:285] Event(v1.ObjectReference{Kind:"Ingress", Namespace:"default", Name:"test-octavia-ingress", UID:"5c479f27-3e9c-4d1b-813f-77991d91ee52", APIVersion:"networking.k8s.io/v1", ResourceVersion:"1534583", FieldPath:""}): type: 'Warning' reason: 'Failed' Failed to create openstack resources for ingress default/test-octavia-ingress: error creating listener: Bad request with: [POST <LB-endpoint>/v2.0/lbaas/listeners], error message: {"faultcode": "Client", "faultstring": "Could not retrieve certificate: ['<KM-endpoint>/v1/secrets/f7d1242f-cff0-4cf2-8978-9fa01ab64816', '<KM-endpoint>/v1/secrets/f7d1242f-cff0-4cf2-8978-9fa01ab64816']", "debuginfo": null}
Note that I can retrieve the secret using the application credential, that is with
export OS_AUTH_TYPE=v3applicationcredential
export OS_AUTH_URL=<ID-endpoint>
export OS_IDENTITY_API_VERSION=3
export OS_REGION_NAME=regionOne
export OS_INTERFACE=public
export OS_APPLICATION_CREDENTIAL_ID=iii
export OS_APPLICATION_CREDENTIAL_SECRET=sss
I can do this:
openstack secret get <KM-endpoint>/v1/secrets/f7d1242f-cff0-4cf2-8978-9fa01ab64816 \
--decrypt \
--payload_content_type 'application/octet-stream' \
-c Payload \
-f value
Environment:
- openstack-cloud-controller-manager version: 1.25.5
- barbican-kms-plugin version: 1.25.5
- octavia-ingress-controller version: 1.25.5
- OpenStack version: Rocky
- Kubernetes version: 1.25.8
I'd like to figure out where this problem is coming from, but could use some help on where to look :confused:
The Kubernetes project currently lacks enough contributors to adequately respond to all issues.
This bot triages un-triaged issues according to the following rules:
- After 90d of inactivity, lifecycle/stale is applied
- After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
- After 30d of inactivity since lifecycle/rotten was applied, the issue is closed
You can:
- Mark this issue as fresh with /remove-lifecycle stale
- Close this issue with /close
- Offer to help out with Issue Triage
Please send feedback to sig-contributor-experience at kubernetes/community.
/lifecycle stale
The Kubernetes project currently lacks enough active contributors to adequately respond to all issues.
This bot triages un-triaged issues according to the following rules:
- After 90d of inactivity, lifecycle/stale is applied
- After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
- After 30d of inactivity since lifecycle/rotten was applied, the issue is closed
You can:
- Mark this issue as fresh with /remove-lifecycle rotten
- Close this issue with /close
- Offer to help out with Issue Triage
Please send feedback to sig-contributor-experience at kubernetes/community.
/lifecycle rotten
The Kubernetes project currently lacks enough active contributors to adequately respond to all issues and PRs.
This bot triages issues according to the following rules:
- After 90d of inactivity, lifecycle/stale is applied
- After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
- After 30d of inactivity since lifecycle/rotten was applied, the issue is closed
You can:
- Reopen this issue with /reopen
- Mark this issue as fresh with /remove-lifecycle rotten
- Offer to help out with Issue Triage
Please send feedback to sig-contributor-experience at kubernetes/community.
/close not-planned
@k8s-triage-robot: Closing this issue, marking it as "Not Planned".
Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
I have the same issue, can this please be reopened?
There are similar issues that were also closed without a solution: #1995, #1250.
INFO [2024-04-10T16:06:38Z] ensuring security group ingress=REDACTED
INFO [2024-04-10T16:06:38Z] ensured security group ingress=REDACTED
INFO [2024-04-10T16:06:38Z] secret created in Barbican ingress=REDACTED secretName=REDACTED secretRef=REDACTED
INFO [2024-04-10T16:06:38Z] creating listener lbID=REDACTED listenerName=REDACTED
E0410 16:06:41.106992 12 controller.go:548] failed to create openstack resources for ingress REDACTED: error creating listener: Bad request with: [POST https://REDACTED/v2.0/lbaas/listeners], error message: {"faultcode": "Client", "faultstring": "Could not retrieve certificate: ['https://REDACTED/v1/secrets/REDACTED', 'https://REDACTED/v1/secrets/REDACTED']", "debuginfo": null}
I0410 16:06:41.107218 12 event.go:364] Event(v1.ObjectReference{Kind:"Ingress", Namespace:"REDACTED", Name:"REDACTED", UID:"REDACTED", APIVersion:"networking.k8s.io/v1", ResourceVersion:"12066708", FieldPath:""}): type: 'Warning' reason: 'Failed' Failed to create openstack resources for ingress REDACTED: error creating listener: Bad request with: [POST https://REDACTED/v2.0/lbaas/listeners], error message: {"faultcode": "Client", "faultstring": "Could not retrieve certificate: ['https://REDACTED/v1/secrets/REDACTED', 'https://REDACTED/v1/secrets/REDACTED']", "debuginfo": null}
Sure. This is an error on the Octavia side. Can you provide the octavia-api logs for this POST request so we can see the root cause?
Here are the related logs.
Listener creation is failing with a "Not Found: Secrets container not found" error on the Octavia side.
As you can see in the Octavia and Barbican logs below, there is a request for secret creation and the secret is created correctly in Barbican, but there is no request to create a secret container, and no such container exists in Barbican.
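Checking this from the Barbican side with the CLI shows the same picture (a quick sketch; the href is the secretRef from the Octavia logs below):
# the secret itself exists and is readable
openstack secret get https://key-manager.openstack.vistex.local/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5
# but no matching container exists
openstack secret container list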
octavia logs
openstack/octavia-api-788f56d755-hr9dq[octavia-api]: 2024-04-25 17:23:28.877 10 ERROR barbicanclient.client [None req-d077039c-d0e5-45c5-be3e-bd309c430cbe - 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] 4xx Client error: Not Found: Secrets container not found.
openstack/octavia-api-788f56d755-hr9dq[octavia-api]: 2024-04-25 17:23:28.878 10 INFO octavia.certificates.manager.barbican [None req-d077039c-d0e5-45c5-be3e-bd309c430cbe - 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Loading certificate secret https://key-manager.openstack.vistex.local/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5 from Barbican.
openstack/octavia-api-788f56d755-hr9dq[octavia-api]: 2024-04-25 17:23:28.879 10 INFO barbicanclient.base [None req-d077039c-d0e5-45c5-be3e-bd309c430cbe - 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Calculated Secrets uuid ref: secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5
openstack/octavia-api-788f56d755-hr9dq[octavia-api]: 2024-04-25 17:23:28.922 10 INFO barbicanclient.base [None req-d077039c-d0e5-45c5-be3e-bd309c430cbe - 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Calculated Secrets uuid ref: secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5
openstack/octavia-api-788f56d755-hr9dq[octavia-api]: 2024-04-25 17:23:28.975 10 INFO octavia.certificates.manager.barbican_legacy [None req-d077039c-d0e5-45c5-be3e-bd309c430cbe - 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Loading certificate container https://key-manager.openstack.vistex.local/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5 from Barbican.
openstack/octavia-api-788f56d755-hr9dq[octavia-api]: 2024-04-25 17:23:28.976 10 INFO barbicanclient.base [None req-d077039c-d0e5-45c5-be3e-bd309c430cbe - 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Calculated Containers uuid ref: containers/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5
openstack/octavia-api-788f56d755-hr9dq[octavia-api]: 2024-04-25 17:23:29.001 10 ERROR barbicanclient.client [None req-d077039c-d0e5-45c5-be3e-bd309c430cbe - 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] 4xx Client error: Not Found: Secrets container not found.
openstack/octavia-api-788f56d755-hr9dq[octavia-api]: 2024-04-25 17:23:29.002 10 ERROR octavia.certificates.manager.barbican_legacy [None req-d077039c-d0e5-45c5-be3e-bd309c430cbe - 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Error getting cert https://key-manager.openstack.vistex.local/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5: Not Found: Secrets container not found.: barbicanclient.exceptions.HTTPClientError: Not Found: Secrets container not found.
openstack/octavia-api-788f56d755-hr9dq[octavia-api]: 2024-04-25 17:23:29.832 10 INFO barbicanclient.base [None req-d077039c-d0e5-45c5-be3e-bd309c430cbe - 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Calculated Containers uuid ref: containers/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5
openstack/octavia-api-788f56d755-hr9dq[octavia-api]: 2024-04-25 17:23:29.853 10 ERROR barbicanclient.client [None req-d077039c-d0e5-45c5-be3e-bd309c430cbe - 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] 4xx Client error: Not Found: Secrets container not found.
openstack/octavia-api-788f56d755-hr9dq[octavia-api]: 2024-04-25 17:23:29.853 10 INFO octavia.certificates.manager.barbican [None req-d077039c-d0e5-45c5-be3e-bd309c430cbe - 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Loading certificate secret https://key-manager.openstack.vistex.local/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5 from Barbican.
openstack/octavia-api-788f56d755-hr9dq[octavia-api]: 2024-04-25 17:23:29.854 10 INFO barbicanclient.base [None req-d077039c-d0e5-45c5-be3e-bd309c430cbe - 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Calculated Secrets uuid ref: secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5
openstack/octavia-api-788f56d755-hr9dq[octavia-api]: 2024-04-25 17:23:29.896 10 INFO barbicanclient.base [None req-d077039c-d0e5-45c5-be3e-bd309c430cbe - 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Calculated Secrets uuid ref: secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5
openstack/octavia-api-788f56d755-hr9dq[octavia-api]: 2024-04-25 17:23:29.937 10 INFO octavia.certificates.manager.barbican_legacy [None req-d077039c-d0e5-45c5-be3e-bd309c430cbe - 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Loading certificate container https://key-manager.openstack.vistex.local/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5 from Barbican.
openstack/octavia-api-788f56d755-hr9dq[octavia-api]: 2024-04-25 17:23:29.938 10 INFO barbicanclient.base [None req-d077039c-d0e5-45c5-be3e-bd309c430cbe - 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Calculated Containers uuid ref: containers/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5
openstack/octavia-api-788f56d755-hr9dq[octavia-api]: 2024-04-25 17:23:29.963 10 ERROR barbicanclient.client [None req-d077039c-d0e5-45c5-be3e-bd309c430cbe - 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] 4xx Client error: Not Found: Secrets container not found.
openstack/octavia-api-788f56d755-hr9dq[octavia-api]: 2024-04-25 17:23:29.964 10 ERROR octavia.certificates.manager.barbican_legacy [None req-d077039c-d0e5-45c5-be3e-bd309c430cbe - 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Error getting cert https://key-manager.openstack.vistex.local/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5: Not Found: Secrets container not found.: barbicanclient.exceptions.HTTPClientError: Not Found: Secrets container not found.
openstack/octavia-api-788f56d755-hr9dq[octavia-api]: [pid: 10|app: 0|req: 20835/83328] 172.22.194.57 () {48 vars in 1032 bytes} [Thu Apr 25 17:23:26 2024] POST /v2.0/lbaas/listeners => generated 281 bytes in 3031 msecs (HTTP/1.1 400) 4 headers in 166 bytes (1 switches on core 0)
barbican log
openstack/barbican-api-5974877d56-krjrz[barbican-api]: 2024-04-25 17:23:26.900 9 INFO barbican.api.middleware.context [None req-b2f204a1-9e5f-48b3-bf87-027c174d6e8c 11b044fefa3521e7715702453170b35d73a20ad85a470e48f935fba5b30c4b67 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Processed request: 201 Created - POST https://key-manager.openstack.vistex.local/v1/secrets
openstack/barbican-api-5974877d56-krjrz[barbican-api]: {address space usage: 132620288 bytes/126MB} {rss usage: 108310528 bytes/103MB} [pid: 9|app: 0|req: 49655/49655] 10.0.2.54 () {48 vars in 1010 bytes} [Thu Apr 25 17:23:26 2024] POST /v1/secrets => generated 108 bytes in 83 msecs (HTTP/1.1 201) 5 headers in 264 bytes (1 switches on core 0)
openstack/barbican-api-5974877d56-sthtq[barbican-api]: {address space usage: 132628480 bytes/126MB} {rss usage: 108310528 bytes/103MB} [pid: 9|app: 0|req: 49661/49661] 10.0.1.88 () {28 vars in 434 bytes} [Thu Apr 25 17:23:27 2024] GET / => generated 334 bytes in 1 msecs (HTTP/1.1 300) 3 headers in 105 bytes (1 switches on core 0)
openstack/barbican-api-5974877d56-sthtq[barbican-api]: {address space usage: 132628480 bytes/126MB} {rss usage: 108310528 bytes/103MB} [pid: 9|app: 0|req: 49662/49662] 10.0.1.88 () {32 vars in 747 bytes} [Thu Apr 25 17:23:27 2024] GET / => generated 334 bytes in 1 msecs (HTTP/1.1 300) 3 headers in 105 bytes (1 switches on core 0)
openstack/barbican-api-5974877d56-krjrz[barbican-api]: 2024-04-25 17:23:27.798 9 INFO barbican.api.middleware.context [None req-b2f204a1-9e5f-48b3-bf87-027c174d6e8c 11b044fefa3521e7715702453170b35d73a20ad85a470e48f935fba5b30c4b67 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Begin processing request req-62f02d98-d414-4187-b219-790bc392015c
openstack/barbican-api-5974877d56-krjrz[barbican-api]: 2024-04-25 17:23:27.818 9 INFO barbican.api.middleware.context [None req-4ffab2b2-e05a-4548-9402-0c2b37c041e9 11b044fefa3521e7715702453170b35d73a20ad85a470e48f935fba5b30c4b67 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Processed request: 200 OK - GET http://barbican-api.openstack.svc.cluster.local:9311/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5/acl
openstack/barbican-api-5974877d56-krjrz[barbican-api]: {address space usage: 132620288 bytes/126MB} {rss usage: 108310528 bytes/103MB} [pid: 9|app: 0|req: 49656/49656] 10.0.1.88 () {32 vars in 849 bytes} [Thu Apr 25 17:23:27 2024] GET /v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5/acl => generated 34 bytes in 23 msecs (HTTP/1.1 200) 4 headers in 156 bytes (1 switches on core 0)
openstack/barbican-api-5974877d56-wlwvd[barbican-api]: 2024-04-25 17:23:27.833 9 INFO barbican.api.middleware.context [None req-f4c51c5c-59e8-42c3-9328-b71236f8fb0c 11b044fefa3521e7715702453170b35d73a20ad85a470e48f935fba5b30c4b67 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Begin processing request req-cdede930-d301-4c52-8a0f-e66895daf806
openstack/barbican-api-5974877d56-wlwvd[barbican-api]: 2024-04-25 17:23:27.866 9 INFO barbican.api.middleware.context [None req-c1bf13f9-1e1e-4dfe-b31e-a9f33ab7c98b 11b044fefa3521e7715702453170b35d73a20ad85a470e48f935fba5b30c4b67 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Processed request: 200 OK - PUT http://barbican-api.openstack.svc.cluster.local:9311/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5/acl
openstack/barbican-api-5974877d56-wlwvd[barbican-api]: {address space usage: 132464640 bytes/126MB} {rss usage: 110899200 bytes/105MB} [pid: 9|app: 0|req: 50107/50107] 10.0.1.88 () {36 vars in 888 bytes} [Thu Apr 25 17:23:27 2024] PUT /v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5/acl => generated 109 bytes in 38 msecs (HTTP/1.1 200) 4 headers in 157 bytes (1 switches on core 0)
openstack/barbican-api-5974877d56-krjrz[barbican-api]: {address space usage: 132620288 bytes/126MB} {rss usage: 108310528 bytes/103MB} [pid: 9|app: 0|req: 49657/49657] 10.0.1.88 () {28 vars in 434 bytes} [Thu Apr 25 17:23:28 2024] GET / => generated 334 bytes in 2 msecs (HTTP/1.1 300) 3 headers in 105 bytes (1 switches on core 0)
openstack/barbican-api-5974877d56-wlwvd[barbican-api]: 2024-04-25 17:23:28.836 9 INFO barbican.api.middleware.context [None req-c1bf13f9-1e1e-4dfe-b31e-a9f33ab7c98b 11b044fefa3521e7715702453170b35d73a20ad85a470e48f935fba5b30c4b67 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Begin processing request req-d7bb6b14-dd9d-4680-97c0-7346311fe450
openstack/barbican-api-5974877d56-wlwvd[barbican-api]: 2024-04-25 17:23:28.838 9 INFO barbican.api.middleware.context [None req-bf0ec8ff-f0cb-4c7c-994b-bf67614608ec dd5664a9739d4afba929b37d3f123f28 f5ed2d21437644adb2669f9ade9c949b - - default default] Processed request: 200 OK - GET http://barbican-api.openstack.svc.cluster.local:9311/v1/
openstack/barbican-api-5974877d56-wlwvd[barbican-api]: {address space usage: 132464640 bytes/126MB} {rss usage: 110968832 bytes/105MB} [pid: 9|app: 0|req: 50108/50108] 10.0.1.88 () {32 vars in 689 bytes} [Thu Apr 25 17:23:28 2024] GET /v1/ => generated 319 bytes in 559 msecs (HTTP/1.1 200) 4 headers in 157 bytes (1 switches on core 0)
openstack/barbican-api-5974877d56-sthtq[barbican-api]: 2024-04-25 17:23:28.856 9 INFO barbican.api.middleware.context [None req-a2aa8f8e-7c93-4147-890b-31f725715a44 11b044fefa3521e7715702453170b35d73a20ad85a470e48f935fba5b30c4b67 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Begin processing request req-01c5102f-fa07-4c71-b5a8-34988d2e415c
openstack/barbican-api-5974877d56-sthtq[barbican-api]: 2024-04-25 17:23:28.874 9 INFO barbican.api.middleware.context [None req-c2d3d5bd-6f0c-4d0e-8202-f408ee5de96c dd5664a9739d4afba929b37d3f123f28 f5ed2d21437644adb2669f9ade9c949b - - default default] Processed request: 404 Not Found - GET http://barbican-api.openstack.svc.cluster.local:9311/v1/containers/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5
openstack/barbican-api-5974877d56-sthtq[barbican-api]: {address space usage: 132628480 bytes/126MB} {rss usage: 108310528 bytes/103MB} [pid: 9|app: 0|req: 49663/49663] 10.0.1.88 () {32 vars in 783 bytes} [Thu Apr 25 17:23:28 2024] GET /v1/containers/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5 => generated 82 bytes in 27 msecs (HTTP/1.1 404) 4 headers in 163 bytes (1 switches on core 0)
openstack/barbican-api-5974877d56-sthtq[barbican-api]: 2024-04-25 17:23:28.891 9 INFO barbican.api.middleware.context [None req-c2d3d5bd-6f0c-4d0e-8202-f408ee5de96c dd5664a9739d4afba929b37d3f123f28 f5ed2d21437644adb2669f9ade9c949b - - default default] Begin processing request req-519f67ba-caa6-4031-b79d-478a4d211138
openstack/barbican-api-5974877d56-sthtq[barbican-api]: 2024-04-25 17:23:28.917 9 INFO barbican.api.controllers.secrets [None req-3e879808-7697-435e-b068-861ac491ac2e dd5664a9739d4afba929b37d3f123f28 f5ed2d21437644adb2669f9ade9c949b - - default default] Retrieved secret metadata for project: f5ed2d21437644adb2669f9ade9c949b
openstack/barbican-api-5974877d56-sthtq[barbican-api]: 2024-04-25 17:23:28.919 9 INFO barbican.api.middleware.context [None req-3e879808-7697-435e-b068-861ac491ac2e dd5664a9739d4afba929b37d3f123f28 f5ed2d21437644adb2669f9ade9c949b - - default default] Processed request: 200 OK - GET http://barbican-api.openstack.svc.cluster.local:9311/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5
openstack/barbican-api-5974877d56-sthtq[barbican-api]: {address space usage: 132628480 bytes/126MB} {rss usage: 108453888 bytes/103MB} [pid: 9|app: 0|req: 49664/49664] 10.0.1.88 () {32 vars in 777 bytes} [Thu Apr 25 17:23:28 2024] GET /v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5 => generated 516 bytes in 31 msecs (HTTP/1.1 200) 4 headers in 157 bytes (1 switches on core 0)
openstack/barbican-api-5974877d56-wlwvd[barbican-api]: 2024-04-25 17:23:28.936 9 INFO barbican.api.middleware.context [None req-bf0ec8ff-f0cb-4c7c-994b-bf67614608ec dd5664a9739d4afba929b37d3f123f28 f5ed2d21437644adb2669f9ade9c949b - - default default] Begin processing request req-a6971c07-5585-4bf5-bdb9-a9bab1a59cb1
openstack/barbican-api-5974877d56-wlwvd[barbican-api]: 2024-04-25 17:23:28.971 9 INFO barbican.api.controllers.secrets [None req-0ee4c534-1288-4186-bd25-2007364f969f dd5664a9739d4afba929b37d3f123f28 f5ed2d21437644adb2669f9ade9c949b - - default default] Retrieved secret payload for project: f5ed2d21437644adb2669f9ade9c949b
openstack/barbican-api-5974877d56-wlwvd[barbican-api]: 2024-04-25 17:23:28.974 9 INFO barbican.api.middleware.context [None req-0ee4c534-1288-4186-bd25-2007364f969f dd5664a9739d4afba929b37d3f123f28 f5ed2d21437644adb2669f9ade9c949b - - default default] Processed request: 200 OK - GET http://barbican-api.openstack.svc.cluster.local:9311/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5/payload
openstack/barbican-api-5974877d56-wlwvd[barbican-api]: {address space usage: 133513216 bytes/127MB} {rss usage: 111341568 bytes/106MB} [pid: 9|app: 0|req: 50109/50109] 10.0.1.88 () {32 vars in 801 bytes} [Thu Apr 25 17:23:28 2024] GET /v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5/payload => generated 2495 bytes in 42 msecs (HTTP/1.1 200) 4 headers in 166 bytes (1 switches on core 0)
openstack/barbican-api-5974877d56-krjrz[barbican-api]: 2024-04-25 17:23:28.991 9 INFO barbican.api.middleware.context [None req-4ffab2b2-e05a-4548-9402-0c2b37c041e9 11b044fefa3521e7715702453170b35d73a20ad85a470e48f935fba5b30c4b67 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Begin processing request req-cd482282-09b2-4062-8afa-e92ec073a9a6
openstack/barbican-api-5974877d56-krjrz[barbican-api]: 2024-04-25 17:23:28.999 9 INFO barbican.api.middleware.context [None req-39528030-4fa1-4595-ac45-bdd65ebf8738 dd5664a9739d4afba929b37d3f123f28 f5ed2d21437644adb2669f9ade9c949b - - default default] Processed request: 404 Not Found - GET http://barbican-api.openstack.svc.cluster.local:9311/v1/containers/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5
openstack/barbican-api-5974877d56-krjrz[barbican-api]: {address space usage: 132620288 bytes/126MB} {rss usage: 108310528 bytes/103MB} [pid: 9|app: 0|req: 49658/49658] 10.0.1.88 () {32 vars in 783 bytes} [Thu Apr 25 17:23:28 2024] GET /v1/containers/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5 => generated 82 bytes in 12 msecs (HTTP/1.1 404) 4 headers in 163 bytes (1 switches on core 0)
openstack/barbican-api-5974877d56-wlwvd[barbican-api]: {address space usage: 133513216 bytes/127MB} {rss usage: 111341568 bytes/106MB} [pid: 9|app: 0|req: 50110/50110] 10.0.1.88 () {28 vars in 434 bytes} [Thu Apr 25 17:23:29 2024] GET / => generated 334 bytes in 2 msecs (HTTP/1.1 300) 3 headers in 105 bytes (1 switches on core 0)
openstack/barbican-api-5974877d56-wlwvd[barbican-api]: {address space usage: 133513216 bytes/127MB} {rss usage: 111341568 bytes/106MB} [pid: 9|app: 0|req: 50111/50111] 10.0.1.88 () {32 vars in 747 bytes} [Thu Apr 25 17:23:29 2024] GET / => generated 334 bytes in 1 msecs (HTTP/1.1 300) 3 headers in 105 bytes (1 switches on core 0)
openstack/barbican-api-5974877d56-krjrz[barbican-api]: 2024-04-25 17:23:29.812 9 INFO barbican.api.middleware.context [None req-39528030-4fa1-4595-ac45-bdd65ebf8738 dd5664a9739d4afba929b37d3f123f28 f5ed2d21437644adb2669f9ade9c949b - - default default] Begin processing request req-1c0c26ef-8870-452b-a140-a5a54773662e
openstack/barbican-api-5974877d56-krjrz[barbican-api]: 2024-04-25 17:23:29.830 9 INFO barbican.api.middleware.context [None req-2c09e817-3d90-45f4-a3ce-4cdffe8bde18 11b044fefa3521e7715702453170b35d73a20ad85a470e48f935fba5b30c4b67 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Processed request: 200 OK - GET http://barbican-api.openstack.svc.cluster.local:9311/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5/acl
openstack/barbican-api-5974877d56-krjrz[barbican-api]: {address space usage: 132620288 bytes/126MB} {rss usage: 108310528 bytes/103MB} [pid: 9|app: 0|req: 49659/49659] 10.0.1.88 () {32 vars in 849 bytes} [Thu Apr 25 17:23:29 2024] GET /v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5/acl => generated 149 bytes in 21 msecs (HTTP/1.1 200) 4 headers in 157 bytes (1 switches on core 0)
openstack/barbican-api-5974877d56-krjrz[barbican-api]: 2024-04-25 17:23:29.844 9 INFO barbican.api.middleware.context [None req-2c09e817-3d90-45f4-a3ce-4cdffe8bde18 11b044fefa3521e7715702453170b35d73a20ad85a470e48f935fba5b30c4b67 972a7d5f51144b3aaed7d975aa408212 - - e14efee7c7714996884be538afa52b0c e14efee7c7714996884be538afa52b0c] Begin processing request req-478fcc6f-c02d-4299-81ec-0c48b03625a9
openstack/barbican-api-5974877d56-krjrz[barbican-api]: 2024-04-25 17:23:29.851 9 INFO barbican.api.middleware.context [None req-cefcc917-979d-41d9-bbcb-af352c60d1b5 dd5664a9739d4afba929b37d3f123f28 f5ed2d21437644adb2669f9ade9c949b - - default default] Processed request: 404 Not Found - GET http://barbican-api.openstack.svc.cluster.local:9311/v1/containers/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5
openstack/barbican-api-5974877d56-krjrz[barbican-api]: {address space usage: 132620288 bytes/126MB} {rss usage: 108310528 bytes/103MB} [pid: 9|app: 0|req: 49660/49660] 10.0.1.88 () {32 vars in 783 bytes} [Thu Apr 25 17:23:29 2024] GET /v1/containers/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5 => generated 82 bytes in 10 msecs (HTTP/1.1 404) 4 headers in 163 bytes (1 switches on core 0)
openstack/barbican-api-5974877d56-sthtq[barbican-api]: 2024-04-25 17:23:29.866 9 INFO barbican.api.middleware.context [None req-3e879808-7697-435e-b068-861ac491ac2e dd5664a9739d4afba929b37d3f123f28 f5ed2d21437644adb2669f9ade9c949b - - default default] Begin processing request req-3e5d8673-04ca-4c3a-8c9f-85de154a76f1
openstack/barbican-api-5974877d56-sthtq[barbican-api]: 2024-04-25 17:23:29.890 9 INFO barbican.api.controllers.secrets [None req-63a12552-c19f-4583-9965-6eb81d6c8d76 dd5664a9739d4afba929b37d3f123f28 f5ed2d21437644adb2669f9ade9c949b - - default default] Retrieved secret metadata for project: f5ed2d21437644adb2669f9ade9c949b
openstack/barbican-api-5974877d56-sthtq[barbican-api]: 2024-04-25 17:23:29.893 9 INFO barbican.api.middleware.context [None req-63a12552-c19f-4583-9965-6eb81d6c8d76 dd5664a9739d4afba929b37d3f123f28 f5ed2d21437644adb2669f9ade9c949b - - default default] Processed request: 200 OK - GET http://barbican-api.openstack.svc.cluster.local:9311/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5
openstack/barbican-api-5974877d56-sthtq[barbican-api]: {address space usage: 132628480 bytes/126MB} {rss usage: 108453888 bytes/103MB} [pid: 9|app: 0|req: 49665/49665] 10.0.1.88 () {32 vars in 777 bytes} [Thu Apr 25 17:23:29 2024] GET /v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5 => generated 516 bytes in 31 msecs (HTTP/1.1 200) 4 headers in 157 bytes (1 switches on core 0)
openstack/barbican-api-5974877d56-krjrz[barbican-api]: 2024-04-25 17:23:29.907 9 INFO barbican.api.middleware.context [None req-cefcc917-979d-41d9-bbcb-af352c60d1b5 dd5664a9739d4afba929b37d3f123f28 f5ed2d21437644adb2669f9ade9c949b - - default default] Begin processing request req-006bba2d-7c2a-4bfa-b709-c205752bb3cd
openstack/barbican-api-5974877d56-krjrz[barbican-api]: 2024-04-25 17:23:29.933 9 INFO barbican.api.controllers.secrets [None req-c59b8941-f303-4a5f-b3d5-ff3bd174e747 dd5664a9739d4afba929b37d3f123f28 f5ed2d21437644adb2669f9ade9c949b - - default default] Retrieved secret payload for project: f5ed2d21437644adb2669f9ade9c949b
openstack/barbican-api-5974877d56-krjrz[barbican-api]: 2024-04-25 17:23:29.935 9 INFO barbican.api.middleware.context [None req-c59b8941-f303-4a5f-b3d5-ff3bd174e747 dd5664a9739d4afba929b37d3f123f28 f5ed2d21437644adb2669f9ade9c949b - - default default] Processed request: 200 OK - GET http://barbican-api.openstack.svc.cluster.local:9311/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5/payload
openstack/barbican-api-5974877d56-krjrz[barbican-api]: {address space usage: 132620288 bytes/126MB} {rss usage: 108359680 bytes/103MB} [pid: 9|app: 0|req: 49661/49661] 10.0.1.88 () {32 vars in 801 bytes} [Thu Apr 25 17:23:29 2024] GET /v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5/payload => generated 2495 bytes in 31 msecs (HTTP/1.1 200) 4 headers in 166 bytes (1 switches on core 0)
openstack/barbican-api-5974877d56-wlwvd[barbican-api]: 2024-04-25 17:23:29.954 9 INFO barbican.api.middleware.context [None req-0ee4c534-1288-4186-bd25-2007364f969f dd5664a9739d4afba929b37d3f123f28 f5ed2d21437644adb2669f9ade9c949b - - default default] Begin processing request req-8e99582b-f0e7-4b41-aedc-c43fed310be8
openstack/barbican-api-5974877d56-wlwvd[barbican-api]: 2024-04-25 17:23:29.963 9 INFO barbican.api.middleware.context [None req-0c34e080-6465-4e7f-9de5-16545e47a724 dd5664a9739d4afba929b37d3f123f28 f5ed2d21437644adb2669f9ade9c949b - - default default] Processed request: 404 Not Found - GET http://barbican-api.openstack.svc.cluster.local:9311/v1/containers/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5
openstack/barbican-api-5974877d56-wlwvd[barbican-api]: {address space usage: 133513216 bytes/127MB} {rss usage: 111341568 bytes/106MB} [pid: 9|app: 0|req: 50112/50112] 10.0.1.88 () {32 vars in 783 bytes} [Thu Apr 25 17:23:29 2024] GET /v1/containers/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5 => generated 82 bytes in 14 msecs (HTTP/1.1 404) 4 headers in 163 bytes (1 switches on core 0)
Hi @dulek, did you have a chance to check the logs I provided?
I'm currently on holiday, will reply when I'm back.
@dulek did you have a chance to check?
@okozachenko1203
According to your logs, the secrets were created using the internal barbican endpoint and they're requested by OCCM using a public endpoint:
http://barbican-api.openstack.svc.cluster.local:9311/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5
vs
https://key-manager.openstack.vistex.local/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5
Can you try to use the keystone user configured in OCCM and check whether it can access http://barbican-api.openstack.svc.cluster.local:9311/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5 and/or https://key-manager.openstack.vistex.local/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5?
$ openstack secret get %URL%
If there is a permission issue, Barbican would return 403. In your case you clearly get 404, which may indicate the reverse proxy or Barbican web server is configured with the wrong location/endpoint.
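For example, roughly (plugging the two endpoints from your logs into the command above):
openstack secret get https://key-manager.openstack.vistex.local/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5
openstack secret get http://barbican-api.openstack.svc.cluster.local:9311/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5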
@kayrus
- The secret is not the issue. I can fetch the secret with both the public and private auth URLs, and the octavia-ingress-controller can fetch it as well, as you can see in the log. The problem is the missing secret container.
- I can fetch other pre-existing secret containers (not created by the octavia-ingress-controller) using both the public and private hrefs and auth URLs.
So the problem is that the secret container was not created by the octavia-ingress-controller.
I see, so the problem is that modern Octavia requires a container, not a secret? And the container is not created by the ingress controller, right? Looks related to #2461.
Yeah, it is right that the secret container is not created by the ingress controller. But I don't think issue #2461 is related to that.
It is failing at this line: https://github.com/kubernetes/cloud-provider-openstack/blob/a59b8a28d23b1f265eb066e760b56d72ad29e91f/pkg/ingress/controller/controller.go#L776 (see also https://github.com/kubernetes/cloud-provider-openstack/blob/64b813046f25b41aa4295a0a51726bcf25e92bc7/pkg/ingress/controller/openstack/octavia.go#L352-L364).
The secret is created by the ingress controller before EnsureListener is called: https://github.com/kubernetes/cloud-provider-openstack/blob/a59b8a28d23b1f265eb066e760b56d72ad29e91f/pkg/ingress/controller/controller.go#L751-L762
INFO [2024-04-25T17:23:26Z] secret created in Barbican ingress=elkstack/logstash-logstash lbID=d37899b3-f0df-4c80-bdf8-2386f89875fa secretName=kube_ingress_vstx1-useg-k8s-1_elkstack_logstash-logstash_logstash-tls secretRef="https://key-manager.openstack.vistex.local/v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5"
As you can see, the secret ref is /v1/secrets/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5 and it is used in the listener createOpts. On the Octavia side, it calculated the containers UUID ref as containers/bfc96ae2-2f11-4d98-9094-8a89d0fcdcf5, which does not exist, tried to fetch it, and finally failed.
So I wonder if the secret container creation should be triggered by the ingress controller explicitly.
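For illustration, a certificate container of the kind Octavia's legacy Barbican manager looks for can be created by hand roughly like this (a sketch only, assuming the certificate and private key were uploaded as separate secrets; the container name and secret hrefs are hypothetical):
openstack secret container create \
  --name my-tls-container \
  --type certificate \
  --secret "certificate=https://key-manager.example.com/v1/secrets/<cert-uuid>" \
  --secret "private_key=https://key-manager.example.com/v1/secrets/<key-uuid>"
The resulting container href would then be the kind of reference Octavia expects to find as the listener's default_tls_container_ref.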
The Kubernetes project currently lacks enough active contributors to adequately respond to all issues and PRs.
This bot triages issues according to the following rules:
- After 90d of inactivity, lifecycle/stale is applied
- After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
- After 30d of inactivity since lifecycle/rotten was applied, the issue is closed
You can:
- Reopen this issue with /reopen
- Mark this issue as fresh with /remove-lifecycle rotten
- Offer to help out with Issue Triage
Please send feedback to sig-contributor-experience at kubernetes/community.
/close not-planned
@k8s-triage-robot: Closing this issue, marking it as "Not Planned".
Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository.
/reopen /remove-lifecycle rotten
Hi folks, any update or thoughts here?