
bigip_pool_member unable to change the state of an existing member.

Open · santizo opened this issue 3 years ago · 4 comments

ISSUE TYPE
  • Bug Report
COMPONENT NAME

bigip_pool_member

ANSIBLE VERSION
ansible --version
ansible 2.10.6
  config file = /synced_folder/mpki-cagw/ansible.cfg
  configured module search path = ['/home/vagrant/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /home/vagrant/.pyenv/versions/3.7.9/envs/f5-py3.7-ansible3/lib/python3.7/site-packages/ansible
  executable location = /home/vagrant/.pyenv/versions/f5-py3.7-ansible3/bin/ansible
  python version = 3.7.9 (default, Mar  5 2021, 16:52:23) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]

PYTHON VERSION
Python 3.7.9
BIGIP VERSION
Sys::Version
Main Package
  Product     BIG-IP
  Version     14.1.4
  Build       0.0.11
  Edition     Final
  Date        Thu Feb 11 19:05:03 PST 2021

CONFIGURATION
[defaults]
retry_files_enabled = False
host_key_checking = false
collections_path = collections/

OS / ENVIRONMENT
CentOS Linux release 7.9.2009 (Core)
SUMMARY

At some point in the last few months, the functionality that allowed bigip_pool_member to change the state of an existing pool member stopped working, and I am unsure of the cause. I have tried multiple Python versions (2.7.18, 3.6.10, 3.7.9, 3.8.6), Ansible 2.9, 2.10, and 3.0, and BIG-IP 14.1.2.8 through 14.1.4. The module creates pool members without issue, and it can also evaluate the state of a pool member and correctly decide that it is already in the intended state: if I try to enable a member that is already enabled, the task reports back 'ok' with no changes (a small example follows). The issue only occurs when attempting to update an existing pool member.
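
A minimal sketch of that idempotent case, reusing the same pool and member as in the reproduction steps below; only the state differs:

- hosts: localhost
  gather_facts: no
  tasks:
    # The member is already enabled, so this task reports 'ok' with no changes.
    - name: "Enabling an already-enabled pool member"
      f5networks.f5_modules.bigip_pool_member:
        state: enabled
        partition: Common
        pool: POOL-ComSec-389
        host: 10.151.16.57
        port: 389
        provider:
          server: 172.27.20.210
          user: "{{ bigip_user }}"
          password: "{{ bigip_password }}"
          validate_certs: no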

STEPS TO REPRODUCE

Try changing the state of an existing pool member.

- hosts: localhost
  gather_facts: yes
  tasks:
    - name: "Disabling a pool member"
      f5networks.f5_modules.bigip_pool_member:
        state: disabled
        partition: Common
        pool: POOL-ComSec-389
        host: 10.151.16.57
        port: 389
        provider:
          server: 172.27.20.210
          user:  "{{ bigip_user }}"
          password: "{{ bigip_password }}"
          validate_certs: no

EXPECTED RESULTS

I expect the task to report a status of changed and the pool member's state to change as requested.

ACTUAL RESULTS
The full traceback is:
  File "/tmp/ansible_f5networks.f5_modules.bigip_pool_member_payload_0t4ut0ca/ansible_f5networks.f5_modules.bigip_pool_member_payload.zip/ansible_collections/f5networks/f5_modules/plugins/modules/bigip_pool_member.py", line 1676, in main
  File "/tmp/ansible_f5networks.f5_modules.bigip_pool_member_payload_0t4ut0ca/ansible_f5networks.f5_modules.bigip_pool_member_payload.zip/ansible_collections/f5networks/f5_modules/plugins/modules/bigip_pool_member.py", line 1062, in exec_module
  File "/tmp/ansible_f5networks.f5_modules.bigip_pool_member_payload_0t4ut0ca/ansible_f5networks.f5_modules.bigip_pool_member_payload.zip/ansible_collections/f5networks/f5_modules/plugins/modules/bigip_pool_member.py", line 1157, in execute
  File "/tmp/ansible_f5networks.f5_modules.bigip_pool_member_payload_0t4ut0ca/ansible_f5networks.f5_modules.bigip_pool_member_payload.zip/ansible_collections/f5networks/f5_modules/plugins/modules/bigip_pool_member.py", line 1170, in present
  File "/tmp/ansible_f5networks.f5_modules.bigip_pool_member_payload_0t4ut0ca/ansible_f5networks.f5_modules.bigip_pool_member_payload.zip/ansible_collections/f5networks/f5_modules/plugins/modules/bigip_pool_member.py", line 1187, in update
  File "/tmp/ansible_f5networks.f5_modules.bigip_pool_member_payload_0t4ut0ca/ansible_f5networks.f5_modules.bigip_pool_member_payload.zip/ansible_collections/f5networks/f5_modules/plugins/modules/bigip_pool_member.py", line 1442, in update_on_device
fatal: [localhost]: FAILED! => {
    "changed": false,
    "invocation": {
        "module_args": {
            "address": "10.151.16.57",
            "aggregate": null,
            "availability_requirements": null,
            "connection_limit": null,
            "description": "Test",
            "fqdn": null,
            "fqdn_auto_populate": null,
            "ip_encapsulation": null,
            "monitors": null,
            "name": null,
            "partition": "Common",
            "pool": "POOL-ComSec-389",
            "port": 389,
            "preserve_node": null,
            "priority_group": null,
            "provider": {
                "auth_provider": null,
                "no_f5_teem": null,
                "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
                "server": "172.27.20.210",
                "server_port": null,
                "timeout": null,
                "transport": "rest",
                "user": "alex",
                "validate_certs": false
            },
            "rate_limit": null,
            "ratio": null,
            "replace_all_with": false,
            "reuse_nodes": true,
            "state": "disabled"
        }
    },
    "msg": "Expecting value: line 1 column 1 (char 0)"
}
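
The "Expecting value: line 1 column 1 (char 0)" message is what Python's JSON parser raises when a response body is empty or not JSON, so one way to narrow this down is to fetch the member directly over iControl REST and look at the raw response body. A minimal sketch, assuming the conventional iControl REST path for a pool member and the same credentials as in the failing task above:

- hosts: localhost
  gather_facts: no
  tasks:
    # Query the failing member directly so the raw response body is visible.
    # The URL follows the usual /mgmt/tm/ltm/pool/~Partition~pool/members/~Partition~ip:port pattern.
    - name: "Fetch the pool member over iControl REST"
      ansible.builtin.uri:
        url: "https://172.27.20.210/mgmt/tm/ltm/pool/~Common~POOL-ComSec-389/members/~Common~10.151.16.57:389"
        method: GET
        user: "{{ bigip_user }}"
        password: "{{ bigip_password }}"
        force_basic_auth: yes
        validate_certs: no
        return_content: yes
      register: member_raw

    - name: "Show the raw body returned by the BIG-IP"
      ansible.builtin.debug:
        var: member_raw.content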

In the meantime, if anyone knows of a specific combination of Ansible, Python, or any other package that has this working with BIG-IP 14.1.4, that would be great.

santizo · Mar 05 '21 17:03

Hi,

I've tested with BIG-IP v14.1.2.7 and f5_modules v1.8 and it works as expected:

Pool member 172.16.2.80:80 is enabled

---

- hosts: default
  gather_facts: false
  vars: 
    provider:
      server: 192.168.143.154
      user: admin
      password: XxXXxXXxX
      validate_certs: no
      server_port: 443

  tasks:
    - name: "Disabling a pool member"
      f5networks.f5_modules.bigip_pool_member:
        state: disabled
        partition: Common
        pool: test-pool
        host: 172.16.2.80
        port: 80
        provider: "{{ provider }}"
      delegate_to: localhost

(Ansible) PAR-ML-00026375:test_pool_member menant$ ansible-playbook -i hosts -vvv site.yml
ansible-playbook 2.10.3
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/Users/menant/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /Users/menant/projects/python-virtualenv/Ansible/lib/python3.9/site-packages/ansible
  executable location = /Users/menant/projects/python-virtualenv/Ansible/bin/ansible-playbook
  python version = 3.9.0 (default, Nov 21 2020, 14:01:50) [Clang 12.0.0 (clang-1200.0.32.27)]
Using /etc/ansible/ansible.cfg as config file
host_list declined parsing /Users/menant/projects/tests/test-ansible/test_pool_member/hosts as it did not pass its verify_file() method
script declined parsing /Users/menant/projects/tests/test-ansible/test_pool_member/hosts as it did not pass its verify_file() method
auto declined parsing /Users/menant/projects/tests/test-ansible/test_pool_member/hosts as it did not pass its verify_file() method
Parsed /Users/menant/projects/tests/test-ansible/test_pool_member/hosts inventory source with ini plugin
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: site.yml **********************************************************************************************************************************************************************************************************************************
1 plays in site.yml

PLAY [default] **************************************************************************************************************************************************************************************************************************************
META: ran handlers

TASK [Disabling a pool member] **********************************************************************************************************************************************************************************************************************
task path: /Users/menant/projects/tests/test-ansible/test_pool_member/site.yml:14
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: menant
<localhost> EXEC /bin/sh -c 'echo ~menant && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /Users/menant/.ansible/tmp `"&& mkdir "` echo /Users/menant/.ansible/tmp/ansible-tmp-1615190346.583864-3384-165605025639722 `" && echo ansible-tmp-1615190346.583864-3384-165605025639722="` echo /Users/menant/.ansible/tmp/ansible-tmp-1615190346.583864-3384-165605025639722 `" ) && sleep 0'
redirecting module_util ansible.module_utils.compat.ipaddress to ansible_collections.ansible.netcommon.plugins.module_utils.compat.ipaddress
redirecting module_util ansible.module_utils.compat.ipaddress to ansible_collections.ansible.netcommon.plugins.module_utils.compat.ipaddress
redirecting module_util ansible.module_utils.network.common.utils to ansible_collections.ansible.netcommon.plugins.module_utils.network.common.utils
redirecting module_util ansible.module_utils.network.common.utils to ansible_collections.ansible.netcommon.plugins.module_utils.network.common.utils
Using module file /Users/menant/.ansible/collections/ansible_collections/f5networks/f5_modules/plugins/modules/bigip_pool_member.py
<localhost> PUT /Users/menant/.ansible/tmp/ansible-local-3381apzitxgp/tmp5upzmjh8 TO /Users/menant/.ansible/tmp/ansible-tmp-1615190346.583864-3384-165605025639722/AnsiballZ_bigip_pool_member.py
<localhost> EXEC /bin/sh -c 'chmod u+x /Users/menant/.ansible/tmp/ansible-tmp-1615190346.583864-3384-165605025639722/ /Users/menant/.ansible/tmp/ansible-tmp-1615190346.583864-3384-165605025639722/AnsiballZ_bigip_pool_member.py && sleep 0'
<localhost> EXEC /bin/sh -c '/Users/menant/projects/python-virtualenv/Ansible/bin/python /Users/menant/.ansible/tmp/ansible-tmp-1615190346.583864-3384-165605025639722/AnsiballZ_bigip_pool_member.py && sleep 0'
<localhost> EXEC /bin/sh -c 'rm -f -r /Users/menant/.ansible/tmp/ansible-tmp-1615190346.583864-3384-165605025639722/ > /dev/null 2>&1 && sleep 0'
changed: [192.168.143.154] => {
    "changed": true,
    "invocation": {
        "module_args": {
            "address": "172.16.2.80",
            "aggregate": null,
            "availability_requirements": null,
            "connection_limit": null,
            "description": null,
            "fqdn": null,
            "fqdn_auto_populate": null,
            "host": "172.16.2.80",
            "ip_encapsulation": null,
            "monitors": null,
            "name": null,
            "partition": "Common",
            "pool": "test-pool",
            "port": 80,
            "preserve_node": null,
            "priority_group": null,
            "provider": {
                "auth_provider": null,
                "no_f5_teem": null,
                "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
                "server": "192.168.143.154",
                "server_port": 443,
                "timeout": null,
                "transport": "rest",
                "user": "admin",
                "validate_certs": false
            },
            "rate_limit": null,
            "ratio": null,
            "replace_all_with": false,
            "reuse_nodes": true,
            "state": "disabled"
        }
    },
    "monitors": [],
    "session": "user-disabled",
    "state": "disabled"
}
META: ran handlers
META: ran handlers

PLAY RECAP ******************************************************************************************************************************************************************************************************************************************
192.168.143.154            : ok=1    changed=1    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0


If I re-run my playbook as-is:

(Ansible) PAR-ML-00026375:test_pool_member menant$ ansible-playbook -i hosts site.yml

PLAY [default] **************************************************************************************************************************************************************************************************************************************

TASK [Disabling a pool member] **********************************************************************************************************************************************************************************************************************
ok: [192.168.143.154]

PLAY RECAP ******************************************************************************************************************************************************************************************************************************************
192.168.143.154            : ok=1    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Now I switch my pool member back to enabled:

  tasks:
    - name: "Disabling a pool member"
      f5networks.f5_modules.bigip_pool_member:
        state: enabled
        partition: Common
        pool: test-pool
        host: 172.16.2.80
        port: 80
        provider: "{{ provider }}"
      delegate_to: localhost
(Ansible) PAR-ML-00026375:test_pool_member menant$ ansible-playbook -i hosts site.yml

PLAY [default] **************************************************************************************************************************************************************************************************************************************

TASK [Disabling a pool member] **********************************************************************************************************************************************************************************************************************
changed: [192.168.143.154]

PLAY RECAP ******************************************************************************************************************************************************************************************************************************************
192.168.143.154            : ok=1    changed=1    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0


Could you try with the latest collection?
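
If it helps, a minimal collections requirements file along these lines (the version pin is only illustrative; pick the newest f5_modules release available) can be installed with ansible-galaxy collection install -r requirements.yml:

# requirements.yml -- illustrative pin, not a recommendation of a specific release
collections:
  - name: f5networks.f5_modules
    version: ">=1.8.0"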

nmenant · Mar 08 '21 08:03

Hi, I have tried with the latest F5 modules collection as of a few days ago, but I'll check whether there are any newer releases. The BIG-IP version you are on is the last one we think this was working on; we haven't been able to go back that far to verify it ourselves yet, though. Thanks for confirming.

santizo · Mar 09 '21 15:03

Downgraded back to 14.1.2.7, can confirm it works there. Thank you.

santizo · Mar 10 '21 22:03

I did some more tests, since you shouldn't have to downgrade your BIG-IP. I tried with one of our latest versions (v16) and it works well there too:

(Ansible) PAR-ML-00026375:test_pool_member menant$ ansible-playbook -i hosts site.yml

PLAY [default] ***********************************************************************************************************************************************

TASK [Disabling a pool member] *******************************************************************************************************************************
changed: [192.168.143.156]

PLAY RECAP ***************************************************************************************************************************************************
192.168.143.156            : ok=1    changed=1    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

I think something was off with your BIG-IP. We didn't change anything in the release you mentioned.
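
If you want to rule out the device itself without going through the module's update path, a sketch like the following (using bigip_command against the pool from your reproduction, and assuming a provider dictionary like the one in my playbook above) would show how the member is configured in tmsh:

  tasks:
    # Ask the BIG-IP itself how the member currently looks in tmsh.
    - name: "List the member as tmsh sees it"
      f5networks.f5_modules.bigip_command:
        commands:
          - list ltm pool POOL-ComSec-389 members
        provider: "{{ provider }}"
      register: tmsh_out
      delegate_to: localhost

    - name: "Show the tmsh output"
      ansible.builtin.debug:
        var: tmsh_out.stdout_lines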

nmenant · Mar 12 '21 10:03

Closing this now. Reopen if you still face the issue. Thanks!

KrithikaChidambaram · Dec 08 '22 18:12