# Testing testbed on Ubuntu 24.04

Preparing and testing the testbed for Ubuntu 24.04 compatibility.
- Related issue: https://github.com/osism/issues/issues/1028 (provides context for this work)
- Dependency: https://github.com/osism/terraform-base/pull/54 (required changes for Ubuntu 24.04 support)
This task involves updating and testing the testbed to ensure compatibility with Ubuntu 24.04, verifying the deployment of the individual services and components.
## Deployment Status
- [x] Manager
## Services
- [x] Helper Services
- [x] Kubernetes
- [x] Ceph Services (Basic)
- [x] Infrastructure Services (Basic)
- [x] OpenStack Services (Basic)
- [x] Monitoring Services
```
failed: [testbed-node-0.testbed.osism.xyz] (item=grafana) => {"ansible_loop_var": "item", "attempts": 3, "changed": true, "item": {"key": "grafana", "value": {"container_name": "grafana", "dimensions": {}, "enabled": true, "group": "grafana", "haproxy": {"grafana_server": {"enabled": "yes", "external": false, "listen_port": "3000", "mode": "http", "port": "3000"}, "grafana_server_external": {"enabled": true, "external": true, "external_fqdn": "api.testbed.osism.xyz", "listen_port": "3000", "mode": "http", "port": "3000"}}, "image": "nexus.testbed.osism.xyz:8192/osism/grafana:2023.2", "volumes": ["/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro", "/etc/localtime:/etc/localtime:ro", "/etc/timezone:/etc/timezone:ro", "grafana:/var/lib/grafana/", "kolla_logs:/var/log/kolla/"]}}, "msg": "
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/docker/api/client.py", line 214, in _retrieve_server_version
    return self.version(api_version=False)["ApiVersion"]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/docker/api/daemon.py", line 181, in version
    return self._result(self._get(url), json=True)
                        ^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/docker/utils/decorators.py", line 46, in inner
    return f(self, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/docker/api/client.py", line 237, in _get
    return self.get(url, **self._set_request_timeout(kwargs))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 602, in get
    return self.request("GET", url, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 486, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 791, in urlopen
    response = self._make_request(
               ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 497, in _make_request
    conn.request(
TypeError: HTTPConnection.request() got an unexpected keyword argument 'chunked'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/tmp/ansible_kolla_container_payload_lhqwxanl/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py", line 417, in main
    cw = DockerWorker(module)
         ^^^^^^^^^^^^^^^^^^^^
  File "/tmp/ansible_kolla_container_payload_lhqwxanl/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py", line 39, in __init__
    self.dc = get_docker_client()(**options)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/docker/api/client.py", line 197, in __init__
    self._version = self._retrieve_server_version()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/docker/api/client.py", line 221, in _retrieve_server_version
    raise DockerException(
docker.errors.DockerException: Error while fetching server API version: HTTPConnection.request() got an unexpected keyword argument 'chunked'
"}
```
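The `TypeError: HTTPConnection.request() got an unexpected keyword argument 'chunked'` is a known symptom of pairing an older Docker SDK for Python with urllib3 2.x, which Ubuntu 24.04 ships by default: urllib3 2.x passes a `chunked=` keyword to the connection object, and the SDK's UNIX-socket transport (a plain `http.client.HTTPConnection` subclass in releases before 7.1.0) does not accept it. A minimal diagnostic sketch of that version check — the 7.1.0 boundary is our reading of the upstream fix, not something confirmed in this log:

```python
from importlib.metadata import version, PackageNotFoundError


def parse(v: str) -> tuple:
    """Parse the leading numeric components of a version string, padded to 3."""
    parts = []
    for p in v.split(".")[:3]:
        digits = "".join(ch for ch in p if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    return tuple(parts + [0] * (3 - len(parts)))


def chunked_bug_likely(docker_ver: str, urllib3_ver: str) -> bool:
    """Heuristic: a Docker SDK older than 7.1.0 combined with urllib3 >= 2.0
    raises "HTTPConnection.request() got an unexpected keyword argument
    'chunked'" when talking to the daemon over the UNIX socket."""
    return parse(docker_ver) < (7, 1, 0) and parse(urllib3_ver) >= (2, 0, 0)


if __name__ == "__main__":
    try:
        d, u = version("docker"), version("urllib3")
        verdict = "incompatible pairing likely" if chunked_bug_likely(d, u) else "looks OK"
        print(f"docker SDK {d}, urllib3 {u}: {verdict}")
    except PackageNotFoundError as exc:
        print(f"package not installed: {exc}")
```

Running this on an affected node should flag the pairing; e.g. Ubuntu 24.04's distro packages combine a pre-7.1.0 `python3-docker` with a 2.x `python3-urllib3`.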
```
PLAY [Apply role barbican] *****************************************************

TASK [barbican : include_tasks] ************************************************
Monday 27 May 2024  12:13:51 +0000 (0:00:05.975)       0:00:14.430 ************
included: /ansible/roles/barbican/tasks/pull.yml for testbed-node-0.testbed.osism.xyz, testbed-node-1.testbed.osism.xyz, testbed-node-2.testbed.osism.xyz

TASK [service-images-pull : barbican | Pull images] ****************************
Monday 27 May 2024  12:13:56 +0000 (0:00:04.656)       0:00:19.087 ************
FAILED - RETRYING: [testbed-node-0/1/2.testbed.osism.xyz]: barbican | Pull images (3, 2, 1 retries left)
failed: [testbed-node-0.testbed.osism.xyz] (item=barbican-api)
failed: [testbed-node-1.testbed.osism.xyz] (item=barbican-api)
failed: [testbed-node-2.testbed.osism.xyz] (item=barbican-api)
failed: [testbed-node-0.testbed.osism.xyz] (item=barbican-keystone-listener)
failed: [testbed-node-1.testbed.osism.xyz] (item=barbican-keystone-listener)
failed: [testbed-node-2.testbed.osism.xyz] (item=barbican-keystone-listener)
failed: [testbed-node-1.testbed.osism.xyz] (item=barbican-worker)
failed: [testbed-node-2.testbed.osism.xyz] (item=barbican-worker)
```

Every `barbican-api`, `barbican-keystone-listener`, and `barbican-worker` pull fails on each node after 3 attempts with the same traceback as the grafana failure, ending in:

```
docker.errors.DockerException: Error while fetching server API version: HTTPConnection.request() got an unexpected keyword argument 'chunked'
```

(The captured log is truncated before the node-0 `barbican-worker` result.)
= conn.urlopen(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/urllib3/connectionpool.py\", line 791, in urlopen\\n response = self._make_request(\\n ^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/urllib3/connectionpool.py\", line 497, in _make_request\\n conn.request(\\nTypeError: HTTPConnection.request() got an unexpected keyword argument \\'chunked\\'\\n\\nDuring handling of the above exception, another exception occurred:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_l_a72b_e/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 417, in main\\n cw = DockerWorker(module)\\n ^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_l_a72b_e/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 39, in __init__\\n self.dc = get_docker_client()(**options)\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 197, in __init__\\n self._version = self._retrieve_server_version()\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 221, in _retrieve_server_version\\n raise DockerException(\\ndocker.errors.DockerException: Error while fetching server API version: HTTPConnection.request() got an unexpected keyword argument \\'chunked\\'\\n'"}
failed: [testbed-node-0.testbed.osism.xyz] (item=barbican-worker) => {"ansible_loop_var": "item", "attempts": 3, "changed": true, "item": {"key": "barbican-worker", "value": {"container_name": "barbican_worker", "dimensions": {}, "enabled": true, "group": "barbican-worker", "healthcheck": {"interval": "30", "retries": "3", "start_period": "5", "test": ["CMD-SHELL", "healthcheck_port barbican-worker 5672"], "timeout": "30"}, "image": "nexus.testbed.osism.xyz:8192/osism/barbican-worker:2023.2", "volumes": ["/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro", "/etc/localtime:/etc/localtime:ro", "/etc/timezone:/etc/timezone:ro", "kolla_logs:/var/log/kolla/", ""]}}, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 214, in _retrieve_server_version\\n return self.version(api_version=False)[\"ApiVersion\"]\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/daemon.py\", line 181, in version\\n return self._result(self._get(url), json=True)\\n ^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/utils/decorators.py\", line 46, in inner\\n return f(self, *args, **kwargs)\\n ^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 237, in _get\\n return self.get(url, **self._set_request_timeout(kwargs))\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/requests/sessions.py\", line 602, in get\\n return self.request(\"GET\", url, **kwargs)\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/requests/sessions.py\", line 589, in request\\n resp = self.send(prep, **send_kwargs)\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/requests/sessions.py\", line 703, in send\\n r = adapter.send(request, **kwargs)\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/requests/adapters.py\", line 486, in send\\n resp 
= conn.urlopen(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/urllib3/connectionpool.py\", line 791, in urlopen\\n response = self._make_request(\\n ^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/urllib3/connectionpool.py\", line 497, in _make_request\\n conn.request(\\nTypeError: HTTPConnection.request() got an unexpected keyword argument \\'chunked\\'\\n\\nDuring handling of the above exception, another exception occurred:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_92o5egp9/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 417, in main\\n cw = DockerWorker(module)\\n ^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_92o5egp9/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 39, in __init__\\n self.dc = get_docker_client()(**options)\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 197, in __init__\\n self._version = self._retrieve_server_version()\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 221, in _retrieve_server_version\\n raise DockerException(\\ndocker.errors.DockerException: Error while fetching server API version: HTTPConnection.request() got an unexpected keyword argument \\'chunked\\'\\n'"}
PLAY RECAP *********************************************************************
2024-05-27 12:14:57 | INFO | Play has been completed. There may now be a delay until all logs have been written.
2024-05-27 12:14:57 | INFO | Please wait and do not abort execution.
testbed-node-0.testbed.osism.xyz : ok=3 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
testbed-node-1.testbed.osism.xyz : ok=3 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
testbed-node-2.testbed.osism.xyz : ok=3 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
Monday 27 May 2024 12:14:57 +0000 (0:01:01.129) 0:01:20.216 ************
===============================================================================
service-images-pull : barbican | Pull images --------------------------- 61.13s
Group hosts based on enabled services ----------------------------------- 5.98s
barbican : include_tasks ------------------------------------------------ 4.66s
Group hosts based on Kolla action --------------------------------------- 4.15s
Fixed with https://github.com/osism/testbed/pull/2285
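For context on the traceback above (my reading, not verified against the packaged sources): the distro's python3-docker builds its unix-socket connection on the stdlib `http.client.HTTPConnection`, while urllib3 2.x calls `conn.request(..., chunked=...)`, a keyword only urllib3's own connection class accepts. The stdlib signature really does lack it:

```python
import inspect
from http.client import HTTPConnection

# The stdlib connection class accepts encode_chunked, but not the
# chunked keyword that urllib3 2.x passes to its own connection type,
# which is exactly the TypeError seen in the traceback.
params = inspect.signature(HTTPConnection.request).parameters
print("chunked" in params)         # False
print("encode_chunked" in params)  # True
```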
The community.docker Ansible Galaxy packages are currently offline:
https://galaxy.ansible.com/api/v3/plugin/ansible/content/published/collections/artifacts/community-docker-3.10.4.tar.gz
I also tried it via https://galaxy.ansible.com/ui/repo/published/community/docker/ ... it appears the artifact no longer exists on AWS. Hopefully this is just temporary.
The packages appear to be available again.
Most of the Ubuntu 24.04 deployment now seems to get through, but I'm still stuck at this point:
TASK [nova-cell : Check nova keyring file] *************************************
Wednesday 10 July 2024 12:32:23 +0000 (0:00:02.045)       0:05:57.703 ********
fatal: [testbed-node-0.testbed.osism.xyz -> localhost]: FAILED! => {"msg": "No file was found when using first_found."}
fatal: [testbed-node-1.testbed.osism.xyz -> localhost]: FAILED! => {"msg": "No file was found when using first_found."}
fatal: [testbed-node-2.testbed.osism.xyz -> localhost]: FAILED! => {"msg": "No file was found when using first_found."}
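For reference, the `first_found` lookup just walks a list of candidate paths on the controller and errors out when none exists, so the message above means the expected nova keyring was never placed where the playbook looks for it. A hypothetical simplification of the lookup's behavior (not kolla-ansible's actual code):

```python
from pathlib import Path

def first_found(candidates):
    """Return the first existing file from candidates, mimicking what
    Ansible's first_found lookup does; raise when none of them exists."""
    for candidate in candidates:
        if Path(candidate).is_file():
            return candidate
    raise FileNotFoundError("No file was found when using first_found.")
```

In this run every candidate (presumably the ceph keyring that copy-ceph-keys should have fetched) was missing on localhost, hence the identical failure for all three nodes.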
Is Ceph deployed? Did `osism apply copy-ceph-keys` work without issues?
I just retried that as well, and it fails at:
TASK [Fetch ceph keys from the first monitor node]
with:
cannot access '/etc/ceph/*.keyring': No such file or directory
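That error is ordinary glob behavior: when no `*.keyring` file exists under `/etc/ceph` on the first mon, the pattern matches nothing and the command fails, which points at the keyrings never having been created rather than at the fetch step itself. Illustrated in Python (my assumption being that the task effectively does a glob-and-copy):

```python
import glob
import os
import tempfile

with tempfile.TemporaryDirectory() as etc_ceph:
    # No keyrings yet: the glob matches nothing, which is what the
    # "cannot access '/etc/ceph/*.keyring'" failure boils down to.
    print(glob.glob(os.path.join(etc_ceph, "*.keyring")))  # []

    # Once a keyring exists, the same pattern matches and the copy works.
    open(os.path.join(etc_ceph, "ceph.client.admin.keyring"), "w").close()
    print(len(glob.glob(os.path.join(etc_ceph, "*.keyring"))))  # 1
```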
See the full log below:
TASK [Fetch ceph keys from the first monitor node] *****************************
Wednesday 10 July 2024 12:49:47 +0000 (0:00:04.029) 0:00:21.084 ********
STILL ALIVE [task 'Fetch ceph keys from the first monitor node' is running] ****
STILL ALIVE [task 'Fetch ceph keys from the first monitor node' is running] ****
...
fatal: [testbed-manager.testbed.osism.xyz]: FAILED! => {"changed": true, "cmd": ["osism", "apply", "ceph-fetch-keys"], "delta": "0:03:48.446231", "end": "2024-07-10 12:53:36.615680", "msg": "non-zero return code", "rc": 2, "start": "2024-07-10 12:49:48.169449", "stderr": "\u001b[32m2024-07-10 12:49:49\u001b[0m | \u001b[1mINFO \u001b[0m | \u001b[1mTask c2f985fa-bfd0-48dc-801d-16ee43cf2750 (ceph-fetch-keys) was prepared for execution.\u001b[0m
\u001b[32m2024-07-10 12:49:49\u001b[0m | \u001b[1mINFO \u001b[0m | \u001b[1mIt takes a moment until task c2f985fa-bfd0-48dc-801d-16ee43cf2750 (ceph-fetch-keys) has been started and output is visible here.\u001b[0m
\u001b[32m2024-07-10 12:53:36\u001b[0m | \u001b[1mINFO \u001b[0m | \u001b[1mPlay has been completed. There may now be a delay until all logs have been written.\u001b[0m
\u001b[32m2024-07-10 12:53:36\u001b[0m | \u001b[1mINFO \u001b[0m | \u001b[1mPlease wait and do not abort execution.\u001b[0m", "stderr_lines": ["\u001b[32m2024-07-10 12:49:49\u001b[0m | \u001b[1mINFO \u001b[0m | \u001b[1mTask c2f985fa-bfd0-48dc-801d-16ee43cf2750 (ceph-fetch-keys) was prepared for execution.\u001b[0m", "\u001b[32m2024-07-10 12:49:49\u001b[0m | \u001b[1mINFO \u001b[0m | \u001b[1mIt takes a moment until task c2f985fa-bfd0-48dc-801d-16ee43cf2750 (ceph-fetch-keys) has been started and output is visible here.\u001b[0m", "\u001b[32m2024-07-10 12:53:36\u001b[0m | \u001b[1mINFO \u001b[0m | \u001b[1mPlay has been completed. There may now be a delay until all logs have been written.\u001b[0m", "\u001b[32m2024-07-10 12:53:36\u001b[0m | \u001b[1mINFO \u001b[0m | \u001b[1mPlease wait and do not abort execution.\u001b[0m"], "stdout": "
PLAY [Apply role fetch-keys] ***************************************************
TASK [ceph-facts : include_tasks convert_grafana_server_group_name.yml] ********
Wednesday 10 July 2024 12:49:57 +0000 (0:00:05.524) 0:00:05.524 ********
\u001b[0;36mincluded: /ansible/roles/ceph-facts/tasks/convert_grafana_server_group_name.yml for testbed-node-0.testbed.osism.xyz\u001b[0m
TASK [ceph-facts : convert grafana-server group name if exist] *****************
Wednesday 10 July 2024 12:50:03 +0000 (0:00:05.743) 0:00:11.267 ********
\u001b[0;33mchanged: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)\u001b[0m
\u001b[0;33mchanged: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz)\u001b[0m
\u001b[0;33mchanged: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz)\u001b[0m
TASK [ceph-facts : include facts.yml] ******************************************
Wednesday 10 July 2024 12:50:06 +0000 (0:00:02.959) 0:00:14.227 ********
\u001b[0;36mincluded: /ansible/roles/ceph-facts/tasks/facts.yml for testbed-node-0.testbed.osism.xyz\u001b[0m
TASK [ceph-facts : check if it is atomic host] *********************************
Wednesday 10 July 2024 12:50:09 +0000 (0:00:03.132) 0:00:17.360 ********
\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : set_fact is_atomic] *****************************************
Wednesday 10 July 2024 12:50:12 +0000 (0:00:02.881) 0:00:20.242 ********
\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : check if podman binary is present] **************************
Wednesday 10 July 2024 12:50:14 +0000 (0:00:02.430) 0:00:22.672 ********
\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : set_fact container_binary] **********************************
Wednesday 10 July 2024 12:50:16 +0000 (0:00:01.882) 0:00:24.555 ********
\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : set_fact ceph_cmd] ******************************************
Wednesday 10 July 2024 12:50:18 +0000 (0:00:01.791) 0:00:26.346 ********
\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : set_fact discovered_interpreter_python] *********************
Wednesday 10 July 2024 12:50:19 +0000 (0:00:01.325) 0:00:27.672 ********
\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : set_fact discovered_interpreter_python if not previously set] ***
Wednesday 10 July 2024 12:50:20 +0000 (0:00:01.079) 0:00:28.751 ********
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : set_fact ceph_release ceph_stable_release] ******************
Wednesday 10 July 2024 12:50:22 +0000 (0:00:01.223) 0:00:29.975 ********
\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : set_fact monitor_name ansible_facts['hostname']] ************
Wednesday 10 July 2024 12:50:23 +0000 (0:00:01.547) 0:00:31.523 ********
\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)\u001b[0m
\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz -> testbed-node-1.testbed.osism.xyz(192.168.16.11)] => (item=testbed-node-1.testbed.osism.xyz)\u001b[0m
\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz -> testbed-node-2.testbed.osism.xyz(192.168.16.12)] => (item=testbed-node-2.testbed.osism.xyz)\u001b[0m
TASK [ceph-facts : set_fact container_exec_cmd] ********************************
Wednesday 10 July 2024 12:50:28 +0000 (0:00:05.134) 0:00:36.657 ********
\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : find a running mon container] *******************************
Wednesday 10 July 2024 12:50:31 +0000 (0:00:02.633) 0:00:39.290 ********
\u001b[0;33mchanged: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)\u001b[0m
\u001b[0;33mchanged: [testbed-node-0.testbed.osism.xyz -> testbed-node-1.testbed.osism.xyz(192.168.16.11)] => (item=testbed-node-1.testbed.osism.xyz)\u001b[0m
\u001b[0;33mchanged: [testbed-node-0.testbed.osism.xyz -> testbed-node-2.testbed.osism.xyz(192.168.16.12)] => (item=testbed-node-2.testbed.osism.xyz)\u001b[0m
TASK [ceph-facts : check for a ceph mon socket] ********************************
Wednesday 10 July 2024 12:50:38 +0000 (0:00:07.209) 0:00:46.499 ********
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz) \u001b[0m
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz) \u001b[0m
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz) \u001b[0m
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : check if the ceph mon socket is in-use] *********************
Wednesday 10 July 2024 12:50:41 +0000 (0:00:02.533) 0:00:49.033 ********
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0.testbed.osism.xyz', 'ansible_loop_var': 'item'}) \u001b[0m
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1.testbed.osism.xyz', 'ansible_loop_var': 'item'}) \u001b[0m
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2.testbed.osism.xyz', 'ansible_loop_var': 'item'}) \u001b[0m
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : set_fact running_mon - non_container] ***********************
Wednesday 10 July 2024 12:50:48 +0000 (0:00:07.547) 0:00:56.581 ********
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0.testbed.osism.xyz', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'}) \u001b[0m
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1.testbed.osism.xyz', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'}) \u001b[0m
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2.testbed.osism.xyz', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'}) \u001b[0m
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : set_fact running_mon - container] ***************************
Wednesday 10 July 2024 12:50:50 +0000 (0:00:02.048) 0:00:58.630 ********
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item={'changed': True, 'stdout': '', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-0'], 'start': '2024-07-10 12:50:32.289726', 'end': '2024-07-10 12:50:32.336045', 'delta': '0:00:00.046319', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-0', '_uses_shell': False, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-0.testbed.osism.xyz', 'ansible_loop_var': 'item'}) \u001b[0m
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item={'changed': True, 'stdout': '', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-1'], 'start': '2024-07-10 12:50:33.315529', 'end': '2024-07-10 12:50:33.359787', 'delta': '0:00:00.044258', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-1', '_uses_shell': False, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-1.testbed.osism.xyz', 'ansible_loop_var': 'item'}) \u001b[0m
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item={'changed': True, 'stdout': '', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-2'], 'start': '2024-07-10 12:50:34.277435', 'end': '2024-07-10 12:50:34.318868', 'delta': '0:00:00.041433', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-2', '_uses_shell': False, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-2.testbed.osism.xyz', 'ansible_loop_var': 'item'}) \u001b[0m
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : set_fact _container_exec_cmd] *******************************
Wednesday 10 July 2024 12:50:52 +0000 (0:00:01.266) 0:00:59.897 ********
\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : get current fsid if cluster is already running] *************
Wednesday 10 July 2024 12:50:53 +0000 (0:00:01.506) 0:01:01.403 ********
\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : set_fact current_fsid rc 1] *********************************
Wednesday 10 July 2024 12:50:55 +0000 (0:00:01.783) 0:01:03.187 ********
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : get current fsid] *******************************************
Wednesday 10 July 2024 12:50:56 +0000 (0:00:01.396) 0:01:04.583 ********
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : set_fact fsid] **********************************************
Wednesday 10 July 2024 12:50:58 +0000 (0:00:01.289) 0:01:05.873 ********
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : set_fact fsid from current_fsid] ****************************
Wednesday 10 July 2024 12:50:59 +0000 (0:00:01.340) 0:01:07.213 ********
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : generate cluster fsid] **************************************
Wednesday 10 July 2024 12:51:00 +0000 (0:00:01.516) 0:01:08.730 ********
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : set_fact fsid] **********************************************
Wednesday 10 July 2024 12:51:02 +0000 (0:00:01.977) 0:01:10.707 ********
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : resolve device link(s)] *************************************
Wednesday 10 July 2024 12:51:05 +0000 (0:00:02.411) 0:01:13.119 ********
\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz] => (item=/dev/sdb)\u001b[0m
\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz] => (item=/dev/sdc)\u001b[0m
TASK [ceph-facts : set_fact build devices from resolved symlinks] **************
Wednesday 10 July 2024 12:51:07 +0000 (0:00:01.826) 0:01:14.946 ********
\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : resolve dedicated_device link(s)] ***************************
Wednesday 10 July 2024 12:51:08 +0000 (0:00:01.540) 0:01:16.486 ********
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : set_fact build dedicated_devices from resolved symlinks] ****
Wednesday 10 July 2024 12:51:09 +0000 (0:00:01.258) 0:01:17.745 ********
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : resolve bluestore_wal_device link(s)] ***********************
Wednesday 10 July 2024 12:51:10 +0000 (0:00:01.087) 0:01:18.832 ********
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : set_fact build bluestore_wal_devices from resolved symlinks] ***
Wednesday 10 July 2024 12:51:12 +0000 (0:00:01.279) 0:01:20.112 ********
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m
TASK [ceph-facts : set_fact devices generate device list when osd_auto_discovery] ***
Wednesday 10 July 2024 12:51:14 +0000 (0:00:02.238) 0:01:22.351 ********
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': '0', 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '512', 'vendor': None, 'virtual': 1}}) \u001b[0m
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': '0', 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '512', 'vendor': None, 'virtual': 1}}) \u001b[0m
\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': '0', 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '512', 'vendor': None, 'virtual': 1}}) \u001b[0m
skipping: [testbed-node-0.testbed.osism.xyz] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': '0', 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '512', 'vendor': None, 'virtual': 1}})
skipping: [testbed-node-0.testbed.osism.xyz] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': '0', 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '512', 'vendor': None, 'virtual': 1}})
skipping: [testbed-node-0.testbed.osism.xyz] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': '0', 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '512', 'vendor': None, 'virtual': 1}})
skipping: [testbed-node-0.testbed.osism.xyz] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': '0', 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '512', 'vendor': None, 'virtual': 1}})
skipping: [testbed-node-0.testbed.osism.xyz] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': '0', 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '512', 'vendor': None, 'virtual': 1}})
skipping: [testbed-node-0.testbed.osism.xyz] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-0'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-0-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['292df295-6123-4df4-a1ea-94a5144d6ca2']}, 'sectors': '207615967', 'sectorsize': 512, 'size': '99.00 GB', 'start': '2099200', 'uuid': '292df295-6123-4df4-a1ea-94a5144d6ca2'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-0-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': '8192', 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-0-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['580A-6EEA']}, 'sectors': '217088', 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '580A-6EEA'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-0-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['57220a8e-9299-4242-9b68-246ed35c3801']}, 'sectors': '1869825', 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '57220a8e-9299-4242-9b68-246ed35c3801'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': '209715200', 'sectorsize': '512', 'size': '100.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})
skipping: [testbed-node-0.testbed.osism.xyz] => (item={'key': 'sdb', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_47448fb6-d97c-489d-a357-54ffb5c84fb4', 'scsi-SQEMU_QEMU_HARDDISK_47448fb6-d97c-489d-a357-54ffb5c84fb4'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': '41943040', 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})
skipping: [testbed-node-0.testbed.osism.xyz] => (item={'key': 'sdc', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_b7901941-cad4-486e-828e-611da76af981', 'scsi-SQEMU_QEMU_HARDDISK_b7901941-cad4-486e-828e-611da76af981'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': '41943040', 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})
skipping: [testbed-node-0.testbed.osism.xyz] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_e130056b-7b91-469e-b527-5404caa57bad', 'scsi-SQEMU_QEMU_HARDDISK_e130056b-7b91-469e-b527-5404caa57bad'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': '41943040', 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})
skipping: [testbed-node-0.testbed.osism.xyz] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2024-07-08-08-40-48-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': '1012', 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '2048', 'vendor': 'QEMU', 'virtual': 1}})
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : get ceph current status] ************************************
Wednesday 10 July 2024 12:51:17 +0000 (0:00:03.395) 0:01:25.747 ********
ok: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set_fact ceph_current_status] *******************************
Wednesday 10 July 2024 12:51:22 +0000 (0:00:04.756) 0:01:30.504 ********
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set_fact rgw_hostname] **************************************
Wednesday 10 July 2024 12:51:25 +0000 (0:00:03.156) 0:01:33.661 ********
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : check if the ceph conf exists] ******************************
Wednesday 10 July 2024 12:51:29 +0000 (0:00:03.678) 0:01:37.339 ********
ok: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set default osd_pool_default_crush_rule fact] ***************
Wednesday 10 July 2024 12:51:34 +0000 (0:00:05.289) 0:01:42.629 ********
ok: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : read osd pool default crush rule] ***************************
Wednesday 10 July 2024 12:51:36 +0000 (0:00:02.121) 0:01:44.750 ********
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set osd_pool_default_crush_rule fact] ***********************
Wednesday 10 July 2024 12:51:39 +0000 (0:00:02.750) 0:01:47.500 ********
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : read osd pool default crush rule] ***************************
Wednesday 10 July 2024 12:51:41 +0000 (0:00:01.466) 0:01:48.966 ********
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set osd_pool_default_crush_rule fact] ***********************
Wednesday 10 July 2024 12:51:43 +0000 (0:00:01.954) 0:01:50.921 ********
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set_fact _monitor_addresses to monitor_address_block ipv4] ***
Wednesday 10 July 2024 12:51:44 +0000 (0:00:01.870) 0:01:52.791 ********
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set_fact _monitor_addresses to monitor_address_block ipv6] ***
Wednesday 10 July 2024 12:51:46 +0000 (0:00:01.595) 0:01:54.386 ********
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set_fact _monitor_addresses to monitor_address] *************
Wednesday 10 July 2024 12:51:48 +0000 (0:00:01.839) 0:01:56.226 ********
ok: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)
ok: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz)
ok: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz)
TASK [ceph-facts : set_fact _monitor_addresses to monitor_interface - ipv4] ****
Wednesday 10 July 2024 12:51:50 +0000 (0:00:02.441) 0:01:58.667 ********
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set_fact _monitor_addresses to monitor_interface - ipv6] ****
Wednesday 10 July 2024 12:51:51 +0000 (0:00:01.111) 0:01:59.778 ********
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set_fact _current_monitor_address] **************************
Wednesday 10 July 2024 12:51:53 +0000 (0:00:01.287) 0:02:01.066 ********
ok: [testbed-node-0.testbed.osism.xyz] => (item={'name': 'testbed-node-0.testbed.osism.xyz', 'addr': '192.168.16.10'})
skipping: [testbed-node-0.testbed.osism.xyz] => (item={'name': 'testbed-node-1.testbed.osism.xyz', 'addr': '192.168.16.11'})
skipping: [testbed-node-0.testbed.osism.xyz] => (item={'name': 'testbed-node-2.testbed.osism.xyz', 'addr': '192.168.16.12'})
TASK [ceph-facts : import_tasks set_radosgw_address.yml] ***********************
Wednesday 10 July 2024 12:51:54 +0000 (0:00:01.748) 0:02:02.815 ********
included: /ansible/roles/ceph-facts/tasks/set_radosgw_address.yml for testbed-node-0.testbed.osism.xyz
TASK [ceph-facts : set current radosgw_address_block, radosgw_address, radosgw_interface from node "{{ ceph_dashboard_call_item }}"] ***
Wednesday 10 July 2024 12:51:58 +0000 (0:00:03.282) 0:02:06.097 ********
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set_fact _radosgw_address to radosgw_address_block ipv4] ****
Wednesday 10 July 2024 12:52:02 +0000 (0:00:04.334) 0:02:10.432 ********
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set_fact _radosgw_address to radosgw_address_block ipv6] ****
Wednesday 10 July 2024 12:52:04 +0000 (0:00:01.508) 0:02:11.941 ********
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set_fact _radosgw_address to radosgw_address] ***************
Wednesday 10 July 2024 12:52:06 +0000 (0:00:02.331) 0:02:14.273 ********
ok: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set_fact _interface] ****************************************
Wednesday 10 July 2024 12:52:08 +0000 (0:00:02.562) 0:02:16.835 ********
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set_fact _radosgw_address to radosgw_interface - ipv4] ******
Wednesday 10 July 2024 12:52:11 +0000 (0:00:02.561) 0:02:19.397 ********
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set_fact _radosgw_address to radosgw_interface - ipv6] ******
Wednesday 10 July 2024 12:52:15 +0000 (0:00:04.349) 0:02:23.746 ********
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : reset rgw_instances (workaround)] ***************************
Wednesday 10 July 2024 12:52:19 +0000 (0:00:03.631) 0:02:27.377 ********
ok: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set_fact rgw_instances without rgw multisite] ***************
Wednesday 10 July 2024 12:52:23 +0000 (0:00:03.714) 0:02:31.091 ********
ok: [testbed-node-0.testbed.osism.xyz] => (item=0)
TASK [ceph-facts : set_fact is_rgw_instances_defined] **************************
Wednesday 10 July 2024 12:52:27 +0000 (0:00:03.775) 0:02:34.867 ********
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : reset rgw_instances (workaround)] ***************************
Wednesday 10 July 2024 12:52:30 +0000 (0:00:03.733) 0:02:38.600 ********
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set_fact rgw_instances with rgw multisite] ******************
Wednesday 10 July 2024 12:52:36 +0000 (0:00:05.990) 0:02:44.591 ********
skipping: [testbed-node-0.testbed.osism.xyz] => (item=0)
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set_fact rgw_instances_host] ********************************
Wednesday 10 July 2024 12:52:43 +0000 (0:00:06.489) 0:02:51.080 ********
skipping: [testbed-node-0.testbed.osism.xyz] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.10', 'radosgw_frontend_port': 8081})
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set_fact rgw_instances_all] *********************************
Wednesday 10 July 2024 12:52:47 +0000 (0:00:04.050) 0:02:55.131 ********
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz)
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set_fact use_new_ceph_iscsi package or old ceph-iscsi-config/cli] ***
Wednesday 10 July 2024 12:52:52 +0000 (0:00:05.597) 0:03:00.728 ********
skipping: [testbed-node-0.testbed.osism.xyz]
TASK [ceph-facts : set_fact ceph_run_cmd] **************************************
Wednesday 10 July 2024 12:52:59 +0000 (0:00:06.207) 0:03:06.936 ********
ok: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)
ok: [testbed-node-0.testbed.osism.xyz -> testbed-node-1.testbed.osism.xyz(192.168.16.11)] => (item=testbed-node-1.testbed.osism.xyz)
ok: [testbed-node-0.testbed.osism.xyz -> testbed-node-2.testbed.osism.xyz(192.168.16.12)] => (item=testbed-node-2.testbed.osism.xyz)
ok: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)
ok: [testbed-node-0.testbed.osism.xyz -> testbed-node-1.testbed.osism.xyz(192.168.16.11)] => (item=testbed-node-1.testbed.osism.xyz)
ok: [testbed-node-0.testbed.osism.xyz -> testbed-node-2.testbed.osism.xyz(192.168.16.12)] => (item=testbed-node-2.testbed.osism.xyz)
ok: [testbed-node-0.testbed.osism.xyz -> testbed-manager.testbed.osism.xyz(192.168.16.5)] => (item=testbed-manager.testbed.osism.xyz)
TASK [ceph-facts : set_fact ceph_admin_command] ********************************
Wednesday 10 July 2024 12:53:12 +0000 (0:00:12.919) 0:03:19.856 ********
ok: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)
ok: [testbed-node-0.testbed.osism.xyz -> testbed-node-1.testbed.osism.xyz(192.168.16.11)] => (item=testbed-node-1.testbed.osism.xyz)
ok: [testbed-node-0.testbed.osism.xyz -> testbed-node-2.testbed.osism.xyz(192.168.16.12)] => (item=testbed-node-2.testbed.osism.xyz)
ok: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)
ok: [testbed-node-0.testbed.osism.xyz -> testbed-node-1.testbed.osism.xyz(192.168.16.11)] => (item=testbed-node-1.testbed.osism.xyz)
ok: [testbed-node-0.testbed.osism.xyz -> testbed-node-2.testbed.osism.xyz(192.168.16.12)] => (item=testbed-node-2.testbed.osism.xyz)
ok: [testbed-node-0.testbed.osism.xyz -> testbed-manager.testbed.osism.xyz(192.168.16.5)] => (item=testbed-manager.testbed.osism.xyz)
TASK [ceph-fetch-keys : lookup keys in /etc/ceph] ******************************
Wednesday 10 July 2024 12:53:25 +0000 (0:00:13.651) 0:03:33.507 ********
fatal: [testbed-node-0.testbed.osism.xyz]: FAILED! => {"changed": false, "cmd": "ls -1 /etc/ceph/*.keyring", "delta": "0:00:00.011581", "end": "2024-07-10 12:53:26.057291", "msg": "non-zero return code", "rc": 2, "start": "2024-07-10 12:53:26.045710", "stderr": "ls: cannot access '/etc/ceph/*.keyring': No such file or directory", "stderr_lines": ["ls: cannot access '/etc/ceph/*.keyring': No such file or directory"], "stdout": "", "stdout_lines": []}
PLAY RECAP *********************************************************************
testbed-node-0.testbed.osism.xyz : ok=28   changed=2    unreachable=0    failed=1    skipped=38   rescued=0    ignored=0
Wednesday 10 July 2024 12:53:36 +0000 (0:00:10.537) 0:03:44.045 ********
===============================================================================
ceph-facts : set_fact ceph_admin_command ------------------------------- 13.65s
ceph-facts : set_fact ceph_run_cmd ------------------------------------- 12.92s
ceph-fetch-keys : lookup keys in /etc/ceph ----------------------------- 10.54s
ceph-facts : check if the ceph mon socket is in-use --------------------- 7.55s
ceph-facts : find a running mon container ------------------------------- 7.21s
ceph-facts : set_fact rgw_instances with rgw multisite ------------------ 6.49s
ceph-facts : set_fact use_new_ceph_iscsi package or old ceph-iscsi-config/cli --- 6.21s
ceph-facts : reset rgw_instances (workaround) --------------------------- 5.99s
ceph-facts : include_tasks convert_grafana_server_group_name.yml -------- 5.74s
ceph-facts : set_fact rgw_instances_all --------------------------------- 5.60s
ceph-facts : check if the ceph conf exists ------------------------------ 5.29s
ceph-facts : set_fact monitor_name ansible_facts['hostname'] ------------ 5.13s
ceph-facts : get ceph current status ------------------------------------ 4.76s
ceph-facts : set_fact _radosgw_address to radosgw_interface - ipv4 ------ 4.35s
ceph-facts : set current radosgw_address_block, radosgw_address, radosgw_interface from node \"{{ ceph_dashboard_call_item }}\" --- 4.33s
ceph-facts : set_fact rgw_instances_host -------------------------------- 4.05s
ceph-facts : set_fact rgw_instances without rgw multisite --------------- 3.78s
ceph-facts : set_fact is_rgw_instances_defined -------------------------- 3.73s
ceph-facts : reset rgw_instances (workaround) --------------------------- 3.71s
ceph-facts : set_fact rgw_hostname -------------------------------------- 3.68s
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-0'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-0-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['292df295-6123-4df4-a1ea-94a5144d6ca2']}, 'sectors': '207615967', 'sectorsize': 512, 'size': '99.00 GB', 'start': '2099200', 'uuid': '292df295-6123-4df4-a1ea-94a5144d6ca2'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-0-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': '8192', 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-0-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['580A-6EEA']}, 'sectors': '217088', 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '580A-6EEA'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-0-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['57220a8e-9299-4242-9b68-246ed35c3801']}, 'sectors': '1869825', 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '57220a8e-9299-4242-9b68-246ed35c3801'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': '209715200', 'sectorsize': '512', 'size': '100.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item={'key': 'sdb', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_47448fb6-d97c-489d-a357-54ffb5c84fb4', 'scsi-SQEMU_QEMU_HARDDISK_47448fb6-d97c-489d-a357-54ffb5c84fb4'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': '41943040', 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item={'key': 'sdc', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_b7901941-cad4-486e-828e-611da76af981', 'scsi-SQEMU_QEMU_HARDDISK_b7901941-cad4-486e-828e-611da76af981'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': '41943040', 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_e130056b-7b91-469e-b527-5404caa57bad', 'scsi-SQEMU_QEMU_HARDDISK_e130056b-7b91-469e-b527-5404caa57bad'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': '41943040', 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2024-07-08-08-40-48-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': '1012', 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '2048', 'vendor': 'QEMU', 'virtual': 1}}) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : get ceph current status] ************************************", "Wednesday 10 July 2024 12:51:17 +0000 (0:00:03.395) 0:01:25.747 ******** ", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set_fact ceph_current_status] *******************************", "Wednesday 10 July 2024 12:51:22 +0000 (0:00:04.756) 0:01:30.504 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set_fact rgw_hostname] **************************************", "Wednesday 10 July 2024 12:51:25 +0000 (0:00:03.156) 0:01:33.661 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : check if the ceph conf exists] ******************************", "Wednesday 10 July 2024 12:51:29 +0000 
(0:00:03.678) 0:01:37.339 ******** ", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set default osd_pool_default_crush_rule fact] ***************", "Wednesday 10 July 2024 12:51:34 +0000 (0:00:05.289) 0:01:42.629 ******** ", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : read osd pool default crush rule] ***************************", "Wednesday 10 July 2024 12:51:36 +0000 (0:00:02.121) 0:01:44.750 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set osd_pool_default_crush_rule fact] ***********************", "Wednesday 10 July 2024 12:51:39 +0000 (0:00:02.750) 0:01:47.500 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : read osd pool default crush rule] ***************************", "Wednesday 10 July 2024 12:51:41 +0000 (0:00:01.466) 0:01:48.966 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set osd_pool_default_crush_rule fact] ***********************", "Wednesday 10 July 2024 12:51:43 +0000 (0:00:01.954) 0:01:50.921 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set_fact _monitor_addresses to monitor_address_block ipv4] ***", "Wednesday 10 July 2024 12:51:44 +0000 (0:00:01.870) 0:01:52.791 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set_fact _monitor_addresses to monitor_address_block ipv6] ***", "Wednesday 10 July 2024 12:51:46 +0000 (0:00:01.595) 0:01:54.386 
******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set_fact _monitor_addresses to monitor_address] *************", "Wednesday 10 July 2024 12:51:48 +0000 (0:00:01.839) 0:01:56.226 ******** ", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)\u001b[0m", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz)\u001b[0m", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz)\u001b[0m", "", "TASK [ceph-facts : set_fact _monitor_addresses to monitor_interface - ipv4] ****", "Wednesday 10 July 2024 12:51:50 +0000 (0:00:02.441) 0:01:58.667 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set_fact _monitor_addresses to monitor_interface - ipv6] ****", "Wednesday 10 July 2024 12:51:51 +0000 (0:00:01.111) 0:01:59.778 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz) \u001b[0m", 
"\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set_fact _current_monitor_address] **************************", "Wednesday 10 July 2024 12:51:53 +0000 (0:00:01.287) 0:02:01.066 ******** ", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz] => (item={'name': 'testbed-node-0.testbed.osism.xyz', 'addr': '192.168.16.10'})\u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item={'name': 'testbed-node-1.testbed.osism.xyz', 'addr': '192.168.16.11'}) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item={'name': 'testbed-node-2.testbed.osism.xyz', 'addr': '192.168.16.12'}) \u001b[0m", "", "TASK [ceph-facts : import_tasks set_radosgw_address.yml] ***********************", "Wednesday 10 July 2024 12:51:54 +0000 (0:00:01.748) 0:02:02.815 ******** ", "\u001b[0;36mincluded: /ansible/roles/ceph-facts/tasks/set_radosgw_address.yml for testbed-node-0.testbed.osism.xyz\u001b[0m", "", "TASK [ceph-facts : set current radosgw_address_block, radosgw_address, radosgw_interface from node \"{{ ceph_dashboard_call_item }}\"] ***", "Wednesday 10 July 2024 12:51:58 +0000 (0:00:03.282) 0:02:06.097 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set_fact _radosgw_address to radosgw_address_block ipv4] ****", "Wednesday 10 July 2024 12:52:02 +0000 (0:00:04.334) 0:02:10.432 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set_fact _radosgw_address to radosgw_address_block ipv6] ****", "Wednesday 10 July 2024 12:52:04 +0000 (0:00:01.508) 0:02:11.941 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set_fact _radosgw_address to radosgw_address] ***************", "Wednesday 10 July 2024 12:52:06 +0000 (0:00:02.331) 0:02:14.273 ******** ", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set_fact 
_interface] ****************************************", "Wednesday 10 July 2024 12:52:08 +0000 (0:00:02.562) 0:02:16.835 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set_fact _radosgw_address to radosgw_interface - ipv4] ******", "Wednesday 10 July 2024 12:52:11 +0000 (0:00:02.561) 0:02:19.397 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set_fact _radosgw_address to radosgw_interface - ipv6] ******", "Wednesday 10 July 2024 12:52:15 +0000 (0:00:04.349) 0:02:23.746 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : reset rgw_instances (workaround)] ***************************", "Wednesday 10 July 2024 12:52:19 +0000 (0:00:03.631) 0:02:27.377 ******** ", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set_fact rgw_instances without rgw multisite] 
***************", "Wednesday 10 July 2024 12:52:23 +0000 (0:00:03.714) 0:02:31.091 ******** ", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz] => (item=0)\u001b[0m", "", "TASK [ceph-facts : set_fact is_rgw_instances_defined] **************************", "Wednesday 10 July 2024 12:52:27 +0000 (0:00:03.775) 0:02:34.867 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : reset rgw_instances (workaround)] ***************************", "Wednesday 10 July 2024 12:52:30 +0000 (0:00:03.733) 0:02:38.600 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set_fact rgw_instances with rgw multisite] ******************", "Wednesday 10 July 2024 12:52:36 +0000 (0:00:05.990) 0:02:44.591 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=0) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set_fact rgw_instances_host] ********************************", "Wednesday 10 July 2024 12:52:43 +0000 (0:00:06.489) 0:02:51.080 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.10', 'radosgw_frontend_port': 8081}) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set_fact rgw_instances_all] *********************************", "Wednesday 10 July 2024 12:52:47 +0000 (0:00:04.050) 0:02:55.131 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-1.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-2.testbed.osism.xyz) \u001b[0m", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set_fact use_new_ceph_iscsi package or 
old ceph-iscsi-config/cli] ***", "Wednesday 10 July 2024 12:52:52 +0000 (0:00:05.597) 0:03:00.728 ******** ", "\u001b[0;36mskipping: [testbed-node-0.testbed.osism.xyz]\u001b[0m", "", "TASK [ceph-facts : set_fact ceph_run_cmd] **************************************", "Wednesday 10 July 2024 12:52:59 +0000 (0:00:06.207) 0:03:06.936 ******** ", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)\u001b[0m", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz -> testbed-node-1.testbed.osism.xyz(192.168.16.11)] => (item=testbed-node-1.testbed.osism.xyz)\u001b[0m", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz -> testbed-node-2.testbed.osism.xyz(192.168.16.12)] => (item=testbed-node-2.testbed.osism.xyz)\u001b[0m", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)\u001b[0m", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz -> testbed-node-1.testbed.osism.xyz(192.168.16.11)] => (item=testbed-node-1.testbed.osism.xyz)\u001b[0m", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz -> testbed-node-2.testbed.osism.xyz(192.168.16.12)] => (item=testbed-node-2.testbed.osism.xyz)\u001b[0m", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz -> testbed-manager.testbed.osism.xyz(192.168.16.5)] => (item=testbed-manager.testbed.osism.xyz)\u001b[0m", "", "TASK [ceph-facts : set_fact ceph_admin_command] ********************************", "Wednesday 10 July 2024 12:53:12 +0000 (0:00:12.919) 0:03:19.856 ******** ", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz] => (item=testbed-node-0.testbed.osism.xyz)\u001b[0m", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz -> testbed-node-1.testbed.osism.xyz(192.168.16.11)] => (item=testbed-node-1.testbed.osism.xyz)\u001b[0m", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz -> testbed-node-2.testbed.osism.xyz(192.168.16.12)] => (item=testbed-node-2.testbed.osism.xyz)\u001b[0m", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz] => 
(item=testbed-node-0.testbed.osism.xyz)\u001b[0m", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz -> testbed-node-1.testbed.osism.xyz(192.168.16.11)] => (item=testbed-node-1.testbed.osism.xyz)\u001b[0m", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz -> testbed-node-2.testbed.osism.xyz(192.168.16.12)] => (item=testbed-node-2.testbed.osism.xyz)\u001b[0m", "\u001b[0;32mok: [testbed-node-0.testbed.osism.xyz -> testbed-manager.testbed.osism.xyz(192.168.16.5)] => (item=testbed-manager.testbed.osism.xyz)\u001b[0m", "", "TASK [ceph-fetch-keys : lookup keys in /etc/ceph] ******************************", "Wednesday 10 July 2024 12:53:25 +0000 (0:00:13.651) 0:03:33.507 ******** ", "\u001b[0;31mfatal: [testbed-node-0.testbed.osism.xyz]: FAILED! => {\"changed\": false, \"cmd\": \"ls -1 /etc/ceph/*.keyring\", \"delta\": \"0:00:00.011581\", \"end\": \"2024-07-10 12:53:26.057291\", \"msg\": \"non-zero return code\", \"rc\": 2, \"start\": \"2024-07-10 12:53:26.045710\", \"stderr\": \"ls: cannot access '/etc/ceph/*.keyring': No such file or directory\", \"stderr_lines\": [\"ls: cannot access '/etc/ceph/*.keyring': No such file or directory\"], \"stdout\": \"\", \"stdout_lines\": []}\u001b[0m", "", "PLAY RECAP *********************************************************************", "\u001b[0;31mtestbed-node-0.testbed.osism.xyz\u001b[0m : \u001b[0;32mok=28 \u001b[0m \u001b[0;33mchanged=2 \u001b[0m unreachable=0 \u001b[0;31mfailed=1 \u001b[0m \u001b[0;36mskipped=38 \u001b[0m rescued=0 ignored=0 ", "", "", "Wednesday 10 July 2024 12:53:36 +0000 (0:00:10.537) 0:03:44.045 ******** ", "=============================================================================== ", "ceph-facts : set_fact ceph_admin_command ------------------------------- 13.65s", "ceph-facts : set_fact ceph_run_cmd ------------------------------------- 12.92s", "ceph-fetch-keys : lookup keys in /etc/ceph ----------------------------- 10.54s", "ceph-facts : check if the ceph mon socket is in-use 
--------------------- 7.55s", "ceph-facts : find a running mon container ------------------------------- 7.21s", "ceph-facts : set_fact rgw_instances with rgw multisite ------------------ 6.49s", "ceph-facts : set_fact use_new_ceph_iscsi package or old ceph-iscsi-config/cli --- 6.21s", "ceph-facts : reset rgw_instances (workaround) --------------------------- 5.99s", "ceph-facts : include_tasks convert_grafana_server_group_name.yml -------- 5.74s", "ceph-facts : set_fact rgw_instances_all --------------------------------- 5.60s", "ceph-facts : check if the ceph conf exists ------------------------------ 5.29s", "ceph-facts : set_fact monitor_name ansible_facts['hostname'] ------------ 5.13s", "ceph-facts : get ceph current status ------------------------------------ 4.76s", "ceph-facts : set_fact _radosgw_address to radosgw_interface - ipv4 ------ 4.35s", "ceph-facts : set current radosgw_address_block, radosgw_address, radosgw_interface from node \"{{ ceph_dashboard_call_item }}\" --- 4.33s", "ceph-facts : set_fact rgw_instances_host -------------------------------- 4.05s", "ceph-facts : set_fact rgw_instances without rgw multisite --------------- 3.78s", "ceph-facts : set_fact is_rgw_instances_defined -------------------------- 3.73s", "ceph-facts : reset rgw_instances (workaround) --------------------------- 3.71s", "ceph-facts : set_fact rgw_hostname -------------------------------------- 3.68s"]}
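For what it's worth, the fatal ceph-fetch-keys task fails only because ls is asked to expand a glob that matches nothing; on a node that has no keyrings yet this is an expected state rather than a broken one. A minimal sketch of the behavior (the temporary directory is a stand-in for an empty /etc/ceph):

```shell
# Stand-in for an /etc/ceph that holds no keyrings yet (assumption: empty dir).
dir=$(mktemp -d)

# What the task runs: the glob matches nothing, stays literal, and GNU ls
# exits with rc 2 ("cannot access ... No such file or directory").
ls -1 "$dir"/*.keyring 2>/dev/null
echo "ls rc=$?"

# find exits 0 even with zero matches, so a lookup written this way would
# report an empty keyring list instead of failing the play.
find "$dir" -maxdepth 1 -name '*.keyring'
echo "find rc=$?"

rmdir "$dir"
```

So the question is whether ceph-fetch-keys should run at all at this point, or whether the lookup should tolerate an empty /etc/ceph.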
Please paste the output of docker ps on the testbed-node-0 node.
Please see below:
dragon@testbed-node-0:~$ docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
589b2d4352a2 nexus.testbed.osism.xyz:8192/osism/nova-api:2023.2 "dumb-init --single-…" About an hour ago Up About an hour (healthy) nova_api
c4f909cf5e20 nexus.testbed.osism.xyz:8192/osism/nova-scheduler:2023.2 "dumb-init --single-…" About an hour ago Up About an hour (healthy) nova_scheduler
0d2982cd7734 nexus.testbed.osism.xyz:8192/osism/ironic-neutron-agent:2023.2 "dumb-init --single-…" About an hour ago Up 3 seconds (health: starting) ironic_neutron_agent
32e117bdfc6f nexus.testbed.osism.xyz:8192/osism/neutron-metadata-agent:2023.2 "dumb-init --single-…" About an hour ago Up About an hour (healthy) neutron_ovn_metadata_agent
35b8e0551ffc nexus.testbed.osism.xyz:8192/osism/neutron-server:2023.2 "dumb-init --single-…" About an hour ago Up About an hour (healthy) neutron_server
01bf673bf8ee nexus.testbed.osism.xyz:8192/osism/placement-api:2023.2 "dumb-init --single-…" 2 hours ago Up 2 hours (healthy) placement_api
971eb0eef7e4 nexus.testbed.osism.xyz:8192/osism/keystone:2023.2 "dumb-init --single-…" 2 hours ago Up 2 hours (healthy) keystone
201cbf0f0144 nexus.testbed.osism.xyz:8192/osism/keystone-fernet:2023.2 "dumb-init --single-…" 2 hours ago Up 2 hours (healthy) keystone_fernet
a5a898b55714 nexus.testbed.osism.xyz:8192/osism/keystone-ssh:2023.2 "dumb-init --single-…" 2 hours ago Up 2 hours (healthy) keystone_ssh
855b0d2b0e58 nexus.testbed.osism.xyz:8192/osism/opensearch-dashboards:2023.2 "dumb-init --single-…" 2 hours ago Up 2 hours (healthy) opensearch_dashboards
b0318c844685 nexus.testbed.osism.xyz:8192/osism/opensearch:2023.2 "dumb-init --single-…" 2 hours ago Up 2 hours (healthy) opensearch
ba36782a9f3c nexus.testbed.osism.xyz:8192/osism/ovn-northd:2023.2 "dumb-init --single-…" 2 hours ago Up 2 hours ovn_northd
13c1cde84eb1 nexus.testbed.osism.xyz:8192/osism/ovn-sb-db-server:2023.2 "dumb-init --single-…" 2 hours ago Up 2 hours ovn_sb_db
f0ffaba1bdc3 nexus.testbed.osism.xyz:8192/osism/ovn-nb-db-server:2023.2 "dumb-init --single-…" 2 hours ago Up 2 hours ovn_nb_db
4a3d1ba88c98 nexus.testbed.osism.xyz:8192/osism/ovn-controller:2023.2 "dumb-init --single-…" 2 hours ago Up 2 hours ovn_controller
9fd5e40f76d1 nexus.testbed.osism.xyz:8192/osism/openvswitch-vswitchd:2023.2 "dumb-init --single-…" 2 hours ago Up 2 hours (healthy) openvswitch_vswitchd
ac8af9f46f56 nexus.testbed.osism.xyz:8192/osism/openvswitch-db-server:2023.2 "dumb-init --single-…" 2 hours ago Up 2 hours (healthy) openvswitch_db
e6701135fffd nexus.testbed.osism.xyz:8192/osism/rabbitmq:2023.2 "dumb-init --single-…" 2 hours ago Up 2 hours (healthy) rabbitmq
0f3f9e75a4fa nexus.testbed.osism.xyz:8192/osism/mariadb-server:2023.2 "dumb-init -- kolla_…" 2 hours ago Up 2 hours (healthy) mariadb
e83dca0e1890 nexus.testbed.osism.xyz:8192/osism/mariadb-clustercheck:2023.2 "dumb-init --single-…" 2 hours ago Up 2 hours mariadb_clustercheck
5a691d5ae8c1 nexus.testbed.osism.xyz:8192/osism/redis-sentinel:2023.2 "dumb-init --single-…" 2 hours ago Up 2 hours (healthy) redis_sentinel
50bee926818b nexus.testbed.osism.xyz:8192/osism/redis:2023.2 "dumb-init --single-…" 2 hours ago Up 2 hours (healthy) redis
768c5060a7bf nexus.testbed.osism.xyz:8192/osism/memcached:2023.2 "dumb-init --single-…" 2 hours ago Up 2 hours (healthy) memcached
b45e0d77e01c nexus.testbed.osism.xyz:8192/osism/keepalived:2023.2 "dumb-init --single-…" 2 hours ago Up 2 hours keepalived
6bb915c2b979 nexus.testbed.osism.xyz:8192/osism/haproxy:2023.2 "dumb-init --single-…" 2 hours ago Up 2 hours (healthy) haproxy
c308c9e9970a nexus.testbed.osism.xyz:8192/osism/cron:2023.2 "dumb-init --single-…" 3 hours ago Up 3 hours cron
c2ccd4a0045a nexus.testbed.osism.xyz:8192/osism/kolla-toolbox:2023.2 "dumb-init --single-…" 3 hours ago Up 3 hours kolla_toolbox
fd0fd59af998 nexus.testbed.osism.xyz:8192/osism/fluentd:2023.2 "dumb-init --single-…" 3 hours ago Up 3 hours fluentd
It appears that:
0d2982cd7734 nexus.testbed.osism.xyz:8192/osism/ironic-neutron-agent:2023.2 "dumb-init --single-…" 2 hours ago Up 6 seconds (health: starting) ironic_neutron_agent
is flapping.
You have not deployed Ironic. Because of that, the ironic-neutron-agent is flapping: it cannot reach the Ironic API.
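The restart loop is visible in the docker ps listing itself: the container was created about an hour ago but has only been up for seconds. A quick way to flag such rows (the two sample lines below are shortened copies of rows from the listing above; the real output carries full registry paths):

```shell
# Shortened sample rows from the docker ps output above.
sample='0d2982cd7734  ironic-neutron-agent:2023.2  About an hour ago  Up 3 seconds (health: starting)  ironic_neutron_agent
35b8e0551ffc  neutron-server:2023.2  About an hour ago  Up About an hour (healthy)  neutron_server'

# A row whose CREATED column says "ago" but whose STATUS is "Up N seconds"
# was restarted very recently relative to its age -> likely flapping.
printf '%s\n' "$sample" | grep -E 'ago +Up [0-9]+ seconds'
```

On a live node the same grep can be applied to docker ps directly; running docker ps twice a few seconds apart and watching the "Up" age reset confirms the loop.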
Netbox runs into migration errors after starting on the manager node:
django.db.utils.ProgrammingError: there is no unique constraint matching given keys for referenced table "tenancy_tenant"
⚙️ Applying database migrations
🧬 loaded config '/etc/netbox/config/configuration.py'
🧬 loaded config '/etc/netbox/config/extra.py'
🧬 loaded config '/etc/netbox/config/logging.py'
🧬 loaded config '/etc/netbox/config/plugins.py'
Operations to perform:
  Apply all migrations: account, admin, auth, circuits, contenttypes, core, dcim, django_rq, extras, ipam, netbox_bgp, sessions, social_django, taggit, tenancy, users, virtualization, vpn, wireless
Running migrations:
  Applying ipam.0065_asnrange...
Traceback (most recent call last):
File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/backends/utils.py", line 87, in _execute
return self.cursor.execute(sql)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/netbox/venv/lib/python3.11/site-packages/django_prometheus/db/common.py", line 69, in execute
return super().execute(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/netbox/venv/lib/python3.11/site-packages/psycopg/cursor.py", line 737, in execute
raise ex.with_traceback(None)
psycopg.errors.InvalidForeignKey: there is no unique constraint matching given keys for referenced table "tenancy_tenant"
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/opt/netbox/netbox/./manage.py", line 10, in <module>
execute_from_command_line(sys.argv)
File "/opt/netbox/venv/lib/python3.11/site-packages/django/core/management/__init__.py", line 442, in execute_from_command_line
utility.execute()
File "/opt/netbox/venv/lib/python3.11/site-packages/django/core/management/__init__.py", line 436, in execute
self.fetch_command(subcommand).run_from_argv(self.argv)
File "/opt/netbox/venv/lib/python3.11/site-packages/django/core/management/base.py", line 412, in run_from_argv
self.execute(*args, **cmd_options)
File "/opt/netbox/venv/lib/python3.11/site-packages/django/core/management/base.py", line 458, in execute
output = self.handle(*args, **options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/netbox/venv/lib/python3.11/site-packages/django/core/management/base.py", line 106, in wrapper
res = handle_func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/netbox/venv/lib/python3.11/site-packages/django/core/management/commands/migrate.py", line 356, in handle
post_migrate_state = executor.migrate(
^^^^^^^^^^^^^^^^^
File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/migrations/executor.py", line 135, in migrate
state = self._migrate_all_forwards(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/migrations/executor.py", line 167, in _migrate_all_forwards
state = self.apply_migration(
^^^^^^^^^^^^^^^^^^^^^
File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/migrations/executor.py", line 249, in apply_migration
with self.connection.schema_editor(
File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/backends/base/schema.py", line 166, in __exit__
self.execute(sql)
File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/backends/postgresql/schema.py", line 48, in execute
return super().execute(sql, None)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/backends/base/schema.py", line 201, in execute
cursor.execute(sql, params)
File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/backends/utils.py", line 67, in execute
return self._execute_with_wrappers(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/backends/utils.py", line 80, in _execute_with_wrappers
return executor(sql, params, many, context)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/backends/utils.py", line 84, in _execute
with self.db.wrap_database_errors:
File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/utils.py", line 91, in __exit__
raise dj_exc_value.with_traceback(traceback) from exc_value
File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/backends/utils.py", line 87, in _execute
return self.cursor.execute(sql)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/netbox/venv/lib/python3.11/site-packages/django_prometheus/db/common.py", line 69, in execute
return super().execute(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/netbox/venv/lib/python3.11/site-packages/psycopg/cursor.py", line 737, in execute
raise ex.with_traceback(None)
Netbox runs into migration errors after start on the manager node:

django.db.utils.ProgrammingError: there is no unique constraint matching given keys for referenced table "tenancy_tenant"

⚙️ Applying database migrations
🧬 loaded config '/etc/netbox/config/configuration.py'
🧬 loaded config '/etc/netbox/config/extra.py'
🧬 loaded config '/etc/netbox/config/logging.py'
🧬 loaded config '/etc/netbox/config/plugins.py'
Operations to perform:
  Apply all migrations: account, admin, auth, circuits, contenttypes, core, dcim, django_rq, extras, ipam, netbox_bgp, sessions, social_django, taggit, tenancy, users, virtualization, vpn, wireless
Running migrations:
  Applying ipam.0065_asnrange...
Traceback (most recent call last):
  File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/backends/utils.py", line 87, in _execute
    return self.cursor.execute(sql)
  File "/opt/netbox/venv/lib/python3.11/site-packages/django_prometheus/db/common.py", line 69, in execute
    return super().execute(*args, **kwargs)
  File "/opt/netbox/venv/lib/python3.11/site-packages/psycopg/cursor.py", line 737, in execute
    raise ex.with_traceback(None)
psycopg.errors.InvalidForeignKey: there is no unique constraint matching given keys for referenced table "tenancy_tenant"

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/netbox/netbox/./manage.py", line 10, in <module>
    execute_from_command_line(sys.argv)
  File "/opt/netbox/venv/lib/python3.11/site-packages/django/core/management/__init__.py", line 442, in execute_from_command_line
    utility.execute()
  File "/opt/netbox/venv/lib/python3.11/site-packages/django/core/management/__init__.py", line 436, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/opt/netbox/venv/lib/python3.11/site-packages/django/core/management/base.py", line 412, in run_from_argv
    self.execute(*args, **cmd_options)
  File "/opt/netbox/venv/lib/python3.11/site-packages/django/core/management/base.py", line 458, in execute
    output = self.handle(*args, **options)
  File "/opt/netbox/venv/lib/python3.11/site-packages/django/core/management/base.py", line 106, in wrapper
    res = handle_func(*args, **kwargs)
  File "/opt/netbox/venv/lib/python3.11/site-packages/django/core/management/commands/migrate.py", line 356, in handle
    post_migrate_state = executor.migrate(
  File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/migrations/executor.py", line 135, in migrate
    state = self._migrate_all_forwards(
  File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/migrations/executor.py", line 167, in _migrate_all_forwards
    state = self.apply_migration(
  File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/migrations/executor.py", line 249, in apply_migration
    with self.connection.schema_editor(
  File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/backends/base/schema.py", line 166, in __exit__
    self.execute(sql)
  File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/backends/postgresql/schema.py", line 48, in execute
    return super().execute(sql, None)
  File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/backends/base/schema.py", line 201, in execute
    cursor.execute(sql, params)
  File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/backends/utils.py", line 67, in execute
    return self._execute_with_wrappers(
  File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/backends/utils.py", line 80, in _execute_with_wrappers
    return executor(sql, params, many, context)
  File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/backends/utils.py", line 84, in _execute
    with self.db.wrap_database_errors:
  File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/utils.py", line 91, in __exit__
    raise dj_exc_value.with_traceback(traceback) from exc_value
  File "/opt/netbox/venv/lib/python3.11/site-packages/django/db/backends/utils.py", line 87, in _execute
    return self.cursor.execute(sql)
  File "/opt/netbox/venv/lib/python3.11/site-packages/django_prometheus/db/common.py", line 69, in execute
    return super().execute(*args, **kwargs)
  File "/opt/netbox/venv/lib/python3.11/site-packages/psycopg/cursor.py", line 737, in execute
    raise ex.with_traceback(None)
django.db.utils.ProgrammingError: there is no unique constraint matching given keys for referenced table "tenancy_tenant"
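As an aside on reading the log: the two stacked tracebacks are Django's usual exception wrapping, in which the driver-level psycopg error is re-raised as a django.db.utils.ProgrammingError via `raise ... from ...`. A minimal sketch with generic stand-in exception classes (not the actual Django/psycopg types):

```python
# Sketch of the exception chaining visible in the log: the low-level
# driver error becomes the __cause__ of the higher-level ORM error,
# which is why the log prints two tracebacks joined by
# "The above exception was the direct cause of the following exception".
class DriverError(Exception):   # stand-in for psycopg.errors.InvalidForeignKey
    pass

class ORMError(Exception):      # stand-in for django.db.utils.ProgrammingError
    pass

def run_sql() -> None:
    raise DriverError('there is no unique constraint matching given keys')

try:
    try:
        run_sql()
    except DriverError as exc:
        # Django's wrap_database_errors context manager does the
        # equivalent of this re-raise.
        raise ORMError(str(exc)) from exc
except ORMError as err:
    print(type(err.__cause__).__name__)  # → DriverError
```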
That's not related to Ubuntu 24.04 and should be fixed with https://github.com/osism/testbed/pull/2301.
Appears to be working fine again, thx.
I'm currently running into a recurring execution of:
TASK [k3s_agent : Manage k3s service]
when using the deployment script /opt/configuration/scripts/deploy/005-kubernetes.sh.
These tasks then stack up while nothing appears to happen:
osism task list
+----------------------+--------------------------------------+-------------------------+----------+----------------------------+--------------------------------------+
| Worker | ID | Name | Status | Start time | Arguments |
|----------------------+--------------------------------------+-------------------------+----------+----------------------------+--------------------------------------|
| celery@osism-ansible | eceba375-1fd0-4fad-8e7c-7e59110b44af | osism.tasks.ansible.run | ACTIVE | 2024-07-16 10:12:03.057228 | ['infrastructure', 'kubernetes', []] |
| celery@osism-ansible | 2157cecb-c89f-4a25-bb04-6cd8747eba25 | osism.tasks.ansible.run | ACTIVE | 2024-07-16 10:06:14.506749 | ['infrastructure', 'kubernetes', []] |
+----------------------+--------------------------------------+-------------------------+----------+----------------------------+--------------------------------------+
At the beginning of the script's output, I also get the following error on Ubuntu 24.04:
TASK [k3s_prereq : Set SELinux to disabled state] ******************************
Tuesday 16 July 2024 10:12:14 +0000 (0:00:04.827) 0:00:08.201 **********
skipping: [testbed-manager.testbed.osism.xyz]
fatal: [testbed-node-0.testbed.osism.xyz]: FAILED! =>
msg: |-
The conditional check 'ansible_os_family == "RedHat"' failed. The error was: error while evaluating conditional (ansible_os_family == "RedHat"): 'ansible_os_family' is undefined
The error appears to be in '/ansible/roles/k3s_prereq/tasks/main.yml': line 7, column 3, but may
be elsewhere in the file depending on the exact syntax problem.
The offending line appears to be:
- name: Set SELinux to disabled state
^ here
fatal: [testbed-node-1.testbed.osism.xyz]: FAILED! =>
msg: |-
The conditional check 'ansible_os_family == "RedHat"' failed. The error was: error while evaluating conditional (ansible_os_family == "RedHat"): 'ansible_os_family' is undefined
The error appears to be in '/ansible/roles/k3s_prereq/tasks/main.yml': line 7, column 3, but may
be elsewhere in the file depending on the exact syntax problem.
The offending line appears to be:
- name: Set SELinux to disabled state
^ here
fatal: [testbed-node-2.testbed.osism.xyz]: FAILED! =>
msg: |-
The conditional check 'ansible_os_family == "RedHat"' failed. The error was: error while evaluating conditional (ansible_os_family == "RedHat"): 'ansible_os_family' is undefined
The error appears to be in '/ansible/roles/k3s_prereq/tasks/main.yml': line 7, column 3, but may
be elsewhere in the file depending on the exact syntax problem.
The offending line appears to be:
- name: Set SELinux to disabled state
^ here
To me, it's currently not clear where the issue originates, as I'm also unable to find a task called "Set SELinux to disabled state" anywhere in the osism repos.
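For what it's worth, the failure pattern matches a fact-dependent conditional being evaluated on hosts where no facts were gathered (e.g. `gather_facts: false` without a prior setup run), since `ansible_os_family` only exists after fact gathering. A fact-guarded variant of such a task might look like this; this is only a sketch, not the actual k3s_prereq role code:

```yaml
# Hypothetical fact-guarded version of the failing task (module and
# structure assumed; the real role may differ). The extra "is defined"
# check avoids "'ansible_os_family' is undefined" when facts are missing.
- name: Set SELinux to disabled state
  ansible.posix.selinux:
    state: disabled
  when:
    - ansible_os_family is defined
    - ansible_os_family == "RedHat"
```

Alternatively, gathering facts on the affected hosts before the role runs (e.g. via an explicit `ansible.builtin.setup` task or `gather_facts: true` on the play) would make the original conditional evaluable as-is.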
With Ubuntu 24.04, the deployment now appears to go through without issues.
That's done. Ubuntu 24.04 now works.