
Maximo IoT unable to complete when OpenShift Virtualization is present in cluster

rene-oromtz opened this issue 11 months ago · 2 comments

Collection version

26.0.1

Environment information

quay.io/ibmmas/cli:13.2.0

What happened?

When OSV (OpenShift Virtualization) is present in the cluster (it is required for hosted control planes), an error occurs when attempting to query CRs that do not have a defined api_version. The failure pattern suggests that the discovery code splits each resource name on "/" into exactly two parts, which breaks on the multi-slash subresource names that OpenShift Virtualization registers.

Currently, the installation of "edgeconfig" for the IoT tool fails at the task "Check for existing Job edgeconfig-djangomodel-edgeconfig-1.0.80".

The error is easily identified by ValueError: too many values to unpack (expected 2), raised from module_utils/client/discovery.py.

File where the error resides: /opt/ansible/.ansible/collections/ansible_collections/iot/util/roles/modelsync/tasks/main.yml
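
A minimal sketch of the failing pattern (hypothetical, not taken from the collection; my-namespace and my-job are placeholder names): when api_version is omitted, kubernetes.core has to search its discovery data to resolve the kind, and that search is where the ValueError surfaces; pinning api_version avoids the ambiguous lookup.

# Without api_version, the module searches the full discovery cache to
# resolve the kind, which fails on clusters where OSV is installed:
- name: Query a Job without api_version (triggers the discovery search)
  kubernetes.core.k8s_info:
    kind: Job
    namespace: my-namespace
    name: my-job

# With api_version pinned, the resource is resolved directly:
- name: Query a Job with api_version pinned (works)
  kubernetes.core.k8s_info:
    api_version: batch/v1
    kind: Job
    namespace: my-namespace
    name: my-job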

Relevant log output

Error present when installing edgeconfig:

TASK [iot.util.modelsync : Check for existing Job edgeconfig-djangomodel-edgeconfig-1.0.80] ***
task path: /opt/ansible/.ansible/collections/ansible_collections/iot/util/roles/modelsync/tasks/main.yml:34

-------------------------------------------------------------------------------
{"level":"info","ts":"2025-01-28T22:34:54Z","logger":"logging_event_handler","msg":"[playbook task start]","name":"maximus","namespace":"mas-maximus-iot","gvk":"components.iot.ibm.com/v1, Kind=Edgeconfig","event_type":"playbook_on_task_start","job":"6197677731908092509","EventData.Name":"iot.util.modelsync : Check for existing Job edgeconfig-djangomodel-edgeconfig-1.0.80"}

--------------------------- Ansible Task StdOut -------------------------------

TASK [Check for existing Job edgeconfig-djangomodel-edgeconfig-1.0.80] ********************************
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: ValueError: too many values to unpack (expected 2)
fatal: [localhost]: FAILED! => {"changed": false, "module_stderr": "Traceback (most recent call last):\n  File \"/opt/ansible/.ansible/tmp/ansible-tmp-1738103694.7399697-8446-16219889379122/AnsiballZ_k8s_info.py\", line 107, in <module>\n    _ansiballz_main()\n  File \"/opt/ansible/.ansible/tmp/ansible-tmp-1738103694.7399697-8446-16219889379122/AnsiballZ_k8s_info.py\", line 99, in _ansiballz_main\n    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\n  File \"/opt/ansible/.ansible/tmp/ansible-tmp-1738103694.7399697-8446-16219889379122/AnsiballZ_k8s_info.py\", line 47, in invoke_module\n    runpy.run_module(mod_name='ansible_collections.kubernetes.core.plugins.modules.k8s_info', init_globals=dict(_module_fqn='ansible_collections.kubernetes.core.plugins.modules.k8s_info', _modlib_path=modlib_path),\n  File \"/usr/lib64/python3.9/runpy.py\", line 225, in run_module\n    return _run_module_code(code, init_globals, run_name, mod_spec)\n  File \"/usr/lib64/python3.9/runpy.py\", line 97, in _run_module_code\n    _run_code(code, mod_globals, init_globals,\n  File \"/usr/lib64/python3.9/runpy.py\", line 87, in _run_code\n    exec(code, run_globals)\n  File \"/tmp/ansible_kubernetes.core.k8s_info_payload_ipqz572b/ansible_kubernetes.core.k8s_info_payload.zip/ansible_collections/kubernetes/core/plugins/modules/k8s_info.py\", line 217, in <module>\n  File \"/tmp/ansible_kubernetes.core.k8s_info_payload_ipqz572b/ansible_kubernetes.core.k8s_info_payload.zip/ansible_collections/kubernetes/core/plugins/modules/k8s_info.py\", line 211, in main\n  File \"/tmp/ansible_kubernetes.core.k8s_info_payload_ipqz572b/ansible_kubernetes.core.k8s_info_payload.zip/ansible_collections/kubernetes/core/plugins/modules/k8s_info.py\", line 173, in execute_module\n  File \"/tmp/ansible_kubernetes.core.k8s_info_payload_ipqz572b/ansible_kubernetes.core.k8s_info_payload.zip/ansible_collections/kubernetes/core/plugins/module_utils/k8s/service.py\", line 228, in find\n  File \"/tmp/ansible_kubernetes.core.k8s_info_payload_ipqz572b/ansible_kubernetes.core.k8s_info_payload.zip/ansible_collections/kubernetes/core/plugins/module_utils/k8s/service.py\", line 90, in find_resource\n  File \"/tmp/ansible_kubernetes.core.k8s_info_payload_ipqz572b/ansible_kubernetes.core.k8s_info_payload.zip/ansible_collections/kubernetes/core/plugins/module_utils/k8s/client.py\", line 306, in resource\n  File \"/tmp/ansible_kubernetes.core.k8s_info_payload_ipqz572b/ansible_kubernetes.core.k8s_info_payload.zip/ansible_collections/kubernetes/core/plugins/module_utils/k8s/client.py\", line 286, in _find_resource_with_prefix\n  File \"/tmp/ansible_kubernetes.core.k8s_info_payload_ipqz572b/ansible_kubernetes.core.k8s_info_payload.zip/ansible_collections/kubernetes/core/plugins/module_utils/client/discovery.py\", line 159, in get\n  File \"/usr/local/lib/python3.9/site-packages/kubernetes/dynamic/discovery.py\", line 246, in search\n    results = self.__search(self.__build_search(**kwargs), self.__resources, [])\n  File \"/usr/local/lib/python3.9/site-packages/kubernetes/dynamic/discovery.py\", line 294, in __search\n    matches.extend(self.__search([key] + parts[1:], resources, reqParams))\n  File \"/usr/local/lib/python3.9/site-packages/kubernetes/dynamic/discovery.py\", line 280, in __search\n    return self.__search(parts[1:], resourcePart, reqParams + [part] )\n  File \"/usr/local/lib/python3.9/site-packages/kubernetes/dynamic/discovery.py\", line 294, in __search\n    matches.extend(self.__search([key] + parts[1:], resources, reqParams))\n  File \"/usr/local/lib/python3.9/site-packages/kubernetes/dynamic/discovery.py\", line 280, in __search\n    return self.__search(parts[1:], resourcePart, reqParams + [part] )\n  File \"/usr/local/lib/python3.9/site-packages/kubernetes/dynamic/discovery.py\", line 269, in __search\n    resourcePart.resources = self.get_resources_for_api_version(\n  File \"/tmp/ansible_kubernetes.core.k8s_info_payload_ipqz572b/ansible_kubernetes.core.k8s_info_payload.zip/ansible_collections/kubernetes/core/plugins/module_utils/client/discovery.py\", line 117, in get_resources_for_api_version\nValueError: too many values to unpack (expected 2)\n", "module_stdout": "", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1}

-------------------------------------------------------------------------------


The issue can be easily fixed by editing the aforementioned file:
/opt/ansible/.ansible/collections/ansible_collections/iot/util/roles/modelsync/tasks/main.yml


Sections to be modified to include api_version; for a Job, the api_version should be batch/v1 (shown added in the snippets below):


- name: "Check for existing Job {{ job_name }}"
  kubernetes.core.k8s_info:
    kind: Job
    namespace: "{{ iotNamespace }}"
    name: "{{ job_name }}"
  register: existingJob

...

- name: "Remove existing Job {{ job_name }} if it has failed previously"
  kubernetes.core.k8s:
    state: absent
    kind: Job
    namespace: "{{ iotNamespace }}"
    name: "{{ job_name }}"
    delete_options:
      propagationPolicy: Foreground
  when:
    - existing_job_failed

...

- name: "Find any old {{ modelType }} jobs for {{ componentId }} that do not have modelVersion={{ modelVersion }}"
  kubernetes.core.k8s_info:
    kind: Job
    namespace: "{{ iotNamespace }}"
    label_selectors:
      - "iot.ibm.com/model = {{ modelType }}"
      - "app.kubernetes.io/component = {{ componentId }}"
      - "app.kubernetes.io/version != {{ modelVersion }}"
  register: jobResults


Also, I am not sure whether the "job parameters" section needs to be edited as well to include "cr_api_version"; a hypothetical sketch follows the task below.

- name: job parameters
  debug:
    msg:
      - "job_name: {{ job_name }}"
      - "cr_kind: {{ cr_kind }}"
      - "componentId: {{ componentId }}"
      - "modelType: {{ modelType }}"
      - "modelVersion: {{ modelVersion }}"
      - "job_image_spec: {{ job_image_spec }}"
      - "job_resources_spec: {{ job_resources_spec }}"

rene-oromtz · Jan 29 '25 01:01

This file also needs attention: /opt/ansible/.ansible/collections/ansible_collections/iot/util/roles/check_pods_ready/tasks/check_one.yaml

- name: Check {{ item }} has minimum availability
  kubernetes.core.k8s_info:
    api_version: apps/v1  # was: v1; Deployments live in apps/v1
    kind: Deployment
    namespace: "{{ iotNamespace }}"
    label_selectors:
      - app={{ item }}
  register: deployment_result
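
For completeness, a hypothetical follow-on check (not part of the role), assuming the standard k8s_info result shape (a resources list) and the Deployment status.availableReplicas field:

- name: Assert {{ item }} reports available replicas (hypothetical example)
  assert:
    that:
      - deployment_result.resources | length > 0
      - (deployment_result.resources[0].status.availableReplicas | default(0)) | int > 0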

Also, /opt/ansible/roles/edgeconfig/tasks/mas_certsroutes.yml:

- name: Get host info of server route to pass to later deployments
  kubernetes.core.k8s_info:
    api_version: route.openshift.io/v1  # was: v1; Routes live in route.openshift.io/v1
    name: "{{ instanceId }}-iot-edgeconfigapi"
    namespace: "{{ iotNamespace }}"
    kind: Route
  register: server_route
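
As a usage note, the registered result would typically be consumed like this. A hypothetical example (edgeconfig_api_host is an invented fact name), assuming the standard k8s_info result shape and the Route spec.host field:

- name: Read the route hostname (hypothetical example, not from the role)
  set_fact:
    edgeconfig_api_host: "{{ server_route.resources[0].spec.host }}"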

rene-oromtz · Jan 29 '25 01:01

After editing all of those files inside the container, the edgeconfig operator was able to complete successfully:

----- Ansible Task Status Event StdOut (components.iot.ibm.com/v1, Kind=Edgeconfig, maximus/mas-maximus-iot) -----
PLAY RECAP *********************************************************************
localhost : ok=79 changed=1 unreachable=0 failed=0 skipped=43 rescued=0 ignored=0

rene-oromtz · Jan 29 '25 01:01

Thanks for reporting this - apologies for the long delay; with various staffing changes it seems we lost any active monitoring of issues in this repository. This is a product bug rather than something we can fix in this Ansible collection. I have raised an internal issue with the product team (IBMIOT-835), and we'll work to get this fixed ASAP.

durera · May 14 '25 10:05

Update: This was fixed in late April (ref: IBMIOT-763), just no-one linked it back to this issue here 👍

durera · May 14 '25 11:05