
Node list is empty, but service.log shows facts being gathered from the inventory file

Open · rangsan opened this issue 8 years ago • 7 comments

Rundeck Details

rundeck version : rundeck-2.8.2-1.31.GA.noarch (RPM)
ansible version : 2.3.1.0 (PIP)
ansible plugin : ansible-plugin-2.0.2.jar
OS version : Centos 6.9

The node list is empty in the Rundeck UI. However, when I looked at service.log, it was gathering facts from every node in my inventory.


PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
ok: [mon-8002]
ok: [mon-8003]
ok: [mon-8001]
ok: [mon-6003]
ok: [mon-6001]
ok: [mon-6002]
ok: [mon-3003]
...
drwxr-xr-x. 14 rundeck rundeck 4096 Jun 27 16:14 /var/lib/rundeck/

rundeck is the owner of its home path. I can also ping all servers with this command: su rundeck -s /bin/bash -c "ansible all -m ping"

I created a symlink /etc/ansible/ansible.cfg pointing to /opt/ansible/etc/ansible.cfg for the rundeck user. I also set ANSIBLE_CONFIG=/opt/ansible/etc/ansible.cfg in the rundeck user's environment.
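Note that setting the variable only in an interactive shell profile may not be enough, since the rundeckd service is not started through a login shell. A minimal sketch, assuming an RPM install where the service sources /etc/rundeck/profile on startup (the config path is the one from this setup):

```shell
# Appended to /etc/rundeck/profile (assumption: the RPM launcher sources this
# file when rundeckd starts), so the service itself sees the same Ansible
# config as interactive shells do:
export ANSIBLE_CONFIG=/opt/ansible/etc/ansible.cfg
```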

But still no nodes appear in the Rundeck UI. Could someone please shed some light on this? Thanks so much.

rangsan avatar Jun 27 '17 10:06 rangsan

Could someone please help me solve this issue? I've been stuck for a week.

rangsan avatar Jul 03 '17 04:07 rangsan

TASK [file] ********************************************************************
task path: /tmp/rundeck/ansible-hosts8961719020255916456/gather-hosts.yml:7
Using module file /usr/lib/python2.6/site-packages/ansible/modules/files/file.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: rundeck
<localhost> EXEC /bin/sh -c 'sudo -H -S -n -u root /bin/sh -c '"'"'echo BECOME-SUCCESS-nldjxudmxlfouxlpouacafjukbpdbtcg; /usr/bin/python'"'"' && sleep 0'
fatal: [sg-senmon-6001 -> localhost]: FAILED! => {
    "changed": false,
    "failed": true,
    "module_stderr": "sudo: sorry, you must have a tty to run sudo\n",
    "module_stdout": "",
    "msg": "MODULE FAILURE",
    "rc": 1
}

NO MORE HOSTS LEFT *************************************************************
	to retry, use: --limit @/tmp/rundeck/ansible-hosts8961719020255916456/gather-hosts.retry

I enabled debug mode and see this error logged every time I browse for nodes in the UI.
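The same failure can be reproduced outside the plugin, since Ansible's become step runs sudo non-interactively (the -S -n flags in the EXEC line above). A minimal sketch, assuming the rundeck service account from this setup:

```shell
# Mimics the plugin's local become step: non-interactive sudo as the rundeck
# user. On a host whose /etc/sudoers enforces "Defaults requiretty", this
# prints "sudo: sorry, you must have a tty to run sudo", matching the
# MODULE FAILURE above.
su rundeck -s /bin/bash -c 'sudo -H -S -n -u root true' \
  || echo "non-interactive sudo failed for user rundeck"
```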

rangsan avatar Jul 03 '17 04:07 rangsan

I've been waiting for months on a similar issue (#95); I can't say the person(s) who wrote this thing care to respond to user issues much.

Not sure what I would do if I were you...

damageboy avatar Jul 03 '17 16:07 damageboy

[sg-senmon-6001 -> localhost] indicates the task is being executed on localhost, so you are getting the sudo tty error on the Rundeck server.

Comment out the line "Defaults requiretty" in the /etc/sudoers file on the Rundeck server and see if that helps.
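This can also be applied more narrowly: instead of disabling requiretty globally, exempt only the service account. A sketch, assuming the rundeck user from the logs above (always edit sudoers via visudo so a syntax error cannot break sudo):

```shell
# /etc/sudoers change — apply via "visudo". Either comment out the global
# default:
#     #Defaults    requiretty
# or, more narrowly, exempt just the rundeck account:
#     Defaults:rundeck    !requiretty
# Afterwards, non-interactive sudo should succeed without a tty:
su rundeck -s /bin/bash -c 'sudo -n true' \
  && echo "requiretty no longer blocks rundeck"
```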

Sush002 avatar Jul 06 '17 15:07 Sush002

@Sush002 this is not supposed to run on localhost, as localhost is not in my inventory. I think this is related to the Ansible file copier plugin. Strangely, I can't configure the file copier plugin in the UI; the value always reverts to the default.
(screenshot: project edit page, 2017-07-17)

rangsan avatar Jul 17 '17 04:07 rangsan

This is the log from service.log.

procArgs: [ansible-playbook, gather-hosts.yml, --inventory-file=/var/lib/rundeck/etc/sensu/inventory, -vvvv, --extra-vars=@/tmp/rundeck/ansible-runner700351975359969651extra-vars, --private-key=/tmp/rundeck/ansible-runner1113985833019383943id_rsa, --user=ansible, --timeout=60]
Using /var/lib/rundeck/etc/ansible.cfg as config file
Loading callback plugin default of type stdout, v2.0 from /usr/lib/python2.6/site-packages/ansible/plugins/callback/init.pyc

PLAYBOOK: gather-hosts.yml *****************************************************
1 plays in gather-hosts.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] ********************************************************* Using module file /usr/lib/python2.6/site-packages/ansible/modules/system/setup.py ESTABLISH SSH CONNECTION FOR USER: ansible SSH: EXEC ssh -vvv -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o 'IdentityFile="/tmp/rundeck/ansible-runner1113985833019383943id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=ansible -o ConnectTimeout=60 -o ControlPath=/var/lib/rundeck/.ansible/cp/ansible-ssh-%h-%p-%r sh-senmon-8001 '/bin/sh -c '"'"'sudo -H -S -n -u root /bin/sh -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-qonrnsvjmyazgczoeqlywnuxsujuqaxs; /usr/bin/python'"'"'"'"'"'"'"'"' && sleep 0'"'"'' (0, '\n{"invocation": {"module_args": {"filter": "*", "gather_subset": ["all"], "fact_path": "/etc/ansible/facts.d", "gather_timeout": 10}}, "changed": false, "ansible_facts": {"ansible_product_serial": "5335f0d1-408c-4844-913b-2d0fd4f16491", "ansible_form_factor": "Other", "ansible_product_version": "13.0.0-1.el7", "ansible_fips": false, "ansible_service_mgr": "upstart", "ansible_user_id": "root", "ansible_user_dir": "/root", "ansible_memtotal_mb": 14941, "ansible_system_capabilities": [], "ansible_distribution_version": "6.8", "ansible_domain": "agprod1.agoda.local", "ansible_date_time": {"weekday_number": "1", "iso8601_basic_short": "20170717T121450", "tz": "ICT", "weeknumber": "29", "hour": "12", "year": "2017", "minute": "14", "tz_offset": "+0700", "month": "07", "epoch": "1500268490", "iso8601_micro": "2017-07-17T05:14:50.580046Z", "weekday": "Monday", "time": "12:14:50", "date": "2017-07-17", "iso8601": "2017-07-17T05:14:50Z", "day": "17", "iso8601_basic": "20170717T121450579934", "second": "50"}, "ansible_real_user_id": 0, "ansible_processor_cores": 1, "ansible_virtualization_role": "guest", "ansible_dns": {"nameservers": ["10.113.192.101", "10.113.192.102"], "search": 
["hkg.agoda.local", "ams.agoda.local", "agprod1.agoda.local", "agprod2.agoda.local", "agoda.local"], "options": {"attempts": "1", "rotate": true, "timeout": "1"}}, "ansible_processor_vcpus": 8, "ansible_bios_version": "seabios-1.7.5-11.el7", "ansible_processor": ["GenuineIntel", "Intel Core Processor (Haswell, no TSX)", "GenuineIntel", "Intel Core Processor (Haswell, no TSX)", "GenuineIntel", "Intel Core Processor (Haswell, no TSX)", "GenuineIntel", "Intel Core Processor (Haswell, no TSX)", "GenuineIntel", "Intel Core Processor (Haswell, no TSX)", "GenuineIntel", "Intel Core Processor (Haswell, no TSX)", "GenuineIntel", "Intel Core Processor (Haswell, no TSX)", "GenuineIntel", "Intel Core Processor (Haswell, no TSX)"], "ansible_virtualization_type": "openstack", "ansible_lo": {"features": {"tx_checksum_ipv4": "off [fixed]", "generic_receive_offload": "on", "tx_checksum_ipv6": "off [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tx_checksum_unneeded": "off [fixed]", "highdma": "on [fixed]", "tx_lockless": "on [fixed]", "tx_tcp_ecn_segmentation": "on", "tx_gso_robust": "off [fixed]", "tx_checksumming": "on", "vlan_challenged": "on [fixed]", "loopback": "on [fixed]", "fcoe_mtu": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "large_receive_offload": "off [fixed]", "tx_scatter_gather": "on [fixed]", "rx_checksumming": "on [fixed]", "tx_tcp_segmentation": "on", "netns_local": "on [fixed]", "generic_segmentation_offload": "on", "tx_udp_tnl_segmentation": "off [fixed]", "tcp_segmentation_offload": "on", "rx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "tx_vlan_offload": "off [fixed]", "tx_tcp6_segmentation": "on", "udp_fragmentation_offload": "on", "scatter_gather": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "rx_vlan_filter": "off [fixed]", "receive_hashing": "off [fixed]", "tx_gre_segmentation": "off [fixed]"}, "mtu": 65536, "device": "lo", "promisc": false, 
"ipv4": {"broadcast": "host", "netmask": "255.0.0.0", "network": "127.0.0.0", "address": "127.0.0.1"}, "active": true, "type": "loopback"}, "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_default_ipv4": {"macaddress": "fa:16:3e:e0:62:48", "network": "10.113.195.0", "mtu": 1500, "broadcast": "10.113.195.255", "alias": "eth0", "netmask": "255.255.255.0", "address": "10.113.195.154", "interface": "eth0", "type": "ether", "gateway": "10.113.195.254"}, "ansible_swapfree_mb": 0, "ansible_default_ipv6": {}, "ansible_distribution_release": "Final", "ansible_system_vendor": "Fedora Project", "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"LANG": "en_US.UTF-8", "rd_NO_LUKS": true, "rd_NO_LVM": true, "console": "tty0", "ro": true, "KEYBOARDTYPE": "pc", "rd_NO_MD": true, "quiet": true, "rhgb": true, "KEYTABLE": "us", "crashkernel": "129M@0M", "SYSFONT": "latarcyrheb-sun16", "root": "UUID=e99cbc3d-d5b3-4d0e-9dc0-43da6141ea68", "rd_NO_DM": true}, "ansible_effective_user_id": 0, "ansible_mounts": [{"uuid": "N/A", "size_total": 84413566976, "mount": "/", "size_available": 77227778048, "fstype": "ext4", "device": "/dev/vda1", "options": "rw"}], "ansible_selinux": {"status": "enabled", "policyvers": 24, "type": "targeted", "mode": "enforcing", "config_mode": "enforcing"}, "ansible_os_family": "RedHat", "ansible_userspace_architecture": "x86_64", "ansible_product_uuid": "141BBC24-73A6-AA4D-96DF-3C172FD59268", "ansible_product_name": "OpenStack Nova", "ansible_pkg_mgr": "yum", "ansible_memfree_mb": 13385, "ansible_devices": {"vda": {"scheduler_mode": "deadline", "rotational": "1", "vendor": "6900", "sectors": "167772160", "sas_device_handle": null, "sas_address": null, "host": "SCSI storage controller: Red Hat, Inc Virtio block device", "sectorsize": "512", "removable": "0", "support_discard": "0", "model": null, "partitions": {"vda1": {"sectorsize": 512, "uuid": "e99cbc3d-d5b3-4d0e-9dc0-43da6141ea68", "sectors": "167764747", "start": 
"2048", "holders": [], "size": "80.00 GB"}}, "holders": [], "size": "80.00 GB"}}, "ansible_user_uid": 0, "ansible_memory_mb": {"real": {"total": 14941, "used": 1556, "free": 13385}, "swap": {"cached": 0, "total": 0, "free": 0, "used": 0}, "nocache": {"used": 946, "free": 13995}}, "ansible_distribution": "CentOS", "ansible_env": {"USERNAME": "root", "LANG": "en_US.UTF-8", "TERM": "unknown", "SHELL": "/bin/bash", "SUDO_COMMAND": "/bin/sh -c echo BECOME-SUCCESS-qonrnsvjmyazgczoeqlywnuxsujuqaxs; /usr/bin/python", "SHLVL": "1", "SUDO_UID": "497", "SUDO_GID": "497", "PWD": "/opt/ansible", "LOGNAME": "root", "USER": "root", "PATH": "/sbin:/bin:/usr/sbin:/usr/bin", "MAIL": "/var/mail/ansible", "SUDO_USER": "ansible", "HOME": "/root", "_": "/usr/bin/python"}, "ansible_distribution_major_version": "6", "module_setup": true, "ansible_processor_count": 8, "ansible_hostname": "sh-senmon-8001", "ansible_effective_group_id": 0, "ansible_swaptotal_mb": 0, "ansible_real_group_id": 0, "ansible_bios_date": "04/01/2014", "ansible_all_ipv6_addresses": [], "ansible_interfaces": ["lo", "eth0"], "ansible_uptime_seconds": 6388037, "ansible_machine_id": "b16bf10cfab93e4cd6a8fccd0000000a", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAABIwAAAQEAtuVJ38QGJfPSvjDfJG1JasDcEl8v+r5bXMcAdkkV5lQRAgBnrk7hyeDunnJ8vgEeR2RI+fK29O1aI1j4BfHjqJ9/jFYL+XNi2KNFfBDQhjxymd4U9O0g/tgaNaDo1OKSPyOTjZDs8onLYKhGnzygpNOsMBFqBD/qe5PGN1bjiVCaOwSmgpHf9+c9ZYsNN9tp3udtt/cq4Su9QY2TDYdG1kaRNTS0q60X0rtOUDf1THazJ/eAJc4/Y2qEIfpz/ro6RCmJDljRNYQPYaL9ZUKLeP/8tObyNWwkhYzvyuL8IfrLodhSntYdq2jYVGXk+V5VP5/WmQUkf+ADp8G5kBc1pQ==", "ansible_gather_subset": ["hardware", "network", "virtual"], "ansible_user_gecos": "root", "ansible_system_capabilities_enforced": "False", "ansible_python": {"executable": "/usr/bin/python", "version": {"micro": 6, "major": 2, "releaselevel": "final", "serial": 0, "minor": 6}, "type": "CPython", "has_sslcontext": false, "version_info": [2, 6, 6, "final", 0]}, "ansible_kernel": 
"2.6.32-642.el6.centos.plus.x86_64", "ansible_processor_threads_per_core": 1, "ansible_fqdn": "sensuapi-sha.agprod1.agoda.local", "ansible_user_gid": 0, "ansible_eth0": {"macaddress": "fa:16:3e:e0:62:48", "features": {"tx_checksum_ipv4": "off [fixed]", "generic_receive_offload": "on", "tx_checksum_ipv6": "off [fixed]", "tx_scatter_gather_fraglist": "off [fixed]", "tx_checksum_unneeded": "off [fixed]", "highdma": "on [fixed]", "tx_lockless": "off [fixed]", "tx_tcp_ecn_segmentation": "on", "tx_gso_robust": "off [fixed]", "tx_checksumming": "on", "vlan_challenged": "off [fixed]", "loopback": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "large_receive_offload": "off [fixed]", "tx_scatter_gather": "on", "rx_checksumming": "on [fixed]", "tx_tcp_segmentation": "on", "netns_local": "off [fixed]", "generic_segmentation_offload": "on", "tx_udp_tnl_segmentation": "off [fixed]", "tcp_segmentation_offload": "on", "rx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "tx_vlan_offload": "off [fixed]", "tx_tcp6_segmentation": "on", "udp_fragmentation_offload": "on", "scatter_gather": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_fcoe_segmentation": "off [fixed]", "rx_vlan_filter": "on [fixed]", "receive_hashing": "off [fixed]", "tx_gre_segmentation": "off [fixed]"}, "pciid": "virtio0", "module": "virtio_net", "mtu": 1500, "device": "eth0", "promisc": false, "ipv4": {"broadcast": "10.113.195.255", "netmask": "255.255.255.0", "network": "10.113.195.0", "address": "10.113.195.154"}, "active": true, "type": "ether"}, "ansible_nodename": "sh-senmon-8001.agprod1.agoda.local", "ansible_system": "Linux", "ansible_user_shell": "/bin/bash", "ansible_machine": "x86_64", "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAM2zCXgjEAZOEswIydI4KI+IxvljfUnJiGdiMquj1N5wCUAQDHuVGxfpI5Wt9TRU2VVwxO6dKcu+9wVp3ECIqTahOpnGKJp6gCKvAwy66QU9ZxyDYgWnkuUkEokqglRtBj+4tpzEDfJ3s0teE4lKInJ0MSXivw2aTlzNBxTM4hAlAAAAFQCGR4sm0od/9PPuFaJ7+zLzPuo0WQAAAIEAgI1lz3jAQDGaX9lth5gdBkT/n2CcMbWTT59YS13V+5U5UhktlUIXiVS41wdxN3axTVYkBG43Qg9HmrWCz5J9Jl+YoT1mksvdOE+1n5GxAaLL1kWKwRRY+XDLsSXoxp+l/gpoTCbQEHzVkmYXr5h3SkMl60uv+dniK92QCoJliMUAAACAb64CGpDXWtj4JPRGDDfP4cT3OjckUWKVU1lMwsBOS/Jj/9+tqwln1BwG4Tp5YOwPesNZei1zdLCjsQWy2BMpsMbhNwxDuO7Qx5e33VqS5OyVNHCgRv7+fprtOvDHXTiTWDH0lNXq37yC51gSvylJIvC/MWkCw2LruzBYFDHJgjE=", "ansible_all_ipv4_addresses": ["10.113.195.154"], "ansible_python_version": "2.6.6"}}\n', 'OpenSSH_5.3p1, OpenSSL 1.0.1e-fips 11 Feb 2013\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_request_forwards: requesting forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 30823\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\n/tmp/ansible_nzDH4w/ansible_modlib.zip/ansible/module_utils/facts.py:1019: DeprecationWarning: object.new() takes no parameters\n/tmp/ansible_nzDH4w/ansible_modlib.zip/ansible/module_utils/facts.py:2438: DeprecationWarning: object.new() takes no parameters\n/tmp/ansible_nzDH4w/ansible_modlib.zip/ansible/module_utils/facts.py:3342: DeprecationWarning: object.new() takes no parameters\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n') ok: [sh-senmon-8001] META: ran handlers

TASK [file] ********************************************************************
task path: /tmp/rundeck/ansible-hosts8821188797572841367/gather-hosts.yml:7
Using module file /usr/lib/python2.6/site-packages/ansible/modules/files/file.py
ESTABLISH LOCAL CONNECTION FOR USER: rundeck
EXEC /bin/sh -c 'sudo -H -S -n -u root /bin/sh -c '"'"'echo BECOME-SUCCESS-kuhefpwbysaykkwztqsovxgbwpihmcob; /usr/bin/python'"'"' && sleep 0'
fatal: [sh-senmon-8001 -> localhost]: FAILED! => {
    "changed": false,
    "failed": true,
    "module_stderr": "sudo: sorry, you must have a tty to run sudo\n",
    "module_stdout": "",
    "msg": "MODULE FAILURE",
    "rc": 1
}

NO MORE HOSTS LEFT *************************************************************
	to retry, use: --limit @/tmp/rundeck/ansible-hosts8821188797572841367/gather-hosts.retry

PLAY RECAP *********************************************************************
sh-senmon-8001 : ok=1 changed=0 unreachable=0 failed=1

rangsan avatar Jul 17 '17 05:07 rangsan

Hello! I have the same problem with the Ansible Resource Model: when trying to fetch nodes from Ansible, it runs some Ansible task on localhost with sudo (which, of course, does not have sudo rights on localhost and never will, for security reasons). I found the file /var/lib/rundeck/.ansible/tmp/ansible-tmp-*/file.py, which tries to unzip some embedded zip data... but why on localhost? Is it trying to deploy some kind of wrapper for Rundeck?
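The offending task can be inspected directly: the debug logs earlier in this thread show the plugin writing a temporary playbook under /tmp/rundeck/. A sketch of checking what that playbook actually does on localhost (the path pattern is taken from the logged task paths; the exact directory suffix is randomized):

```shell
# The task path in the error log points at a generated playbook, e.g.
# /tmp/rundeck/ansible-hostsXXXX/gather-hosts.yml. Listing and printing it
# shows which task is delegated to localhost with become enabled.
ls -d /tmp/rundeck/ansible-hosts*/ 2>/dev/null
cat /tmp/rundeck/ansible-hosts*/gather-hosts.yml 2>/dev/null
```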

SergeyBear avatar Jan 11 '18 08:01 SergeyBear