[BUG] Unable to execute cron-style Jobs on CentOS7 / Python3
Description: Currently I intend to let our minions execute a highstate job at 4:15 am, declaring the job with a cron-like syntax in the pillar:
schedule:
  highstate-base:
    function: state.highstate
    cron: '15 4 * * *'
Even though I follow the official configuration example, I cannot get the job executed. In the minion's log I get the following errors:
2020-06-12 09:11:38,708 [salt.utils.schedule:1088][ERROR ][4294] Missing python-croniter. Ignoring job highstate-base.
2020-06-12 09:11:39,677 [salt.utils.schedule:1088][ERROR ][4294] Missing python-croniter. Ignoring job highstate-base.
2020-06-12 09:11:40,677 [salt.utils.schedule:1088][ERROR ][4294] Missing python-croniter. Ignoring job highstate-base.
I have tried to solve the issue by running pip3 install croniter, but this did not help.
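One thing worth checking here is whether the interpreter the packaged salt-minion runs under is the same one pip3 installed croniter into. A minimal diagnostic sketch, assuming the Py3 RPM layout on CentOS 7:

# Which interpreter does the packaged salt-minion script use?
head -1 /usr/bin/salt-minion
# Can that interpreter import croniter?
/usr/bin/python3 -c "import croniter; print(croniter.__version__)"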
Steps to Reproduce the behavior
- install a fresh VM using CentOS 7 (CentOS-7-x86_64-Minimal-1908.iso)
- install the salt-master and the salt-minion from RPMs (2019.2 / Py3)
- configure and accept the minion to contact the salt-master on localhost
- create a new schedule configuration file ("etc/salt/minion.d/basehighstate.conf") with a cron-style schedule time as shown above (see the sketch after these steps)
- restart the minion
- look at the minion's log; you should see ERROR messages indicating that croniter is not available
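For reference, the schedule configuration file mentioned in the steps above would look roughly like this (a sketch; the file name and contents simply mirror the pillar example from the description):

# etc/salt/minion.d/basehighstate.conf (sketch)
schedule:
  highstate-base:
    function: state.highstate
    cron: '15 4 * * *'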
Expected behavior: The scheduled highstate job runs at 04:15 as configured, without errors about a missing croniter.
Versions Report
salt --versions-report
Salt Version:
Salt: 2019.2.5
Dependency Versions:
cffi: Not Installed
cherrypy: Not Installed
dateutil: Not Installed
docker-py: Not Installed
gitdb: Not Installed
gitpython: Not Installed
ioflo: Not Installed
Jinja2: 2.8.1
libgit2: Not Installed
libnacl: Not Installed
M2Crypto: 0.33.0
Mako: Not Installed
msgpack-pure: Not Installed
msgpack-python: 0.5.6
mysql-python: Not Installed
pycparser: Not Installed
pycrypto: Not Installed
pycryptodome: Not Installed
pygit2: Not Installed
Python: 3.6.8 (default, Apr 2 2020, 13:34:55)
python-gnupg: Not Installed
PyYAML: 3.11
PyZMQ: 15.3.0
RAET: Not Installed
smmap: Not Installed
timelib: Not Installed
Tornado: 4.4.2
ZMQ: 4.1.4
System Versions:
dist: centos 7.7.1908 Core
locale: UTF-8
machine: x86_64
release: 3.10.0-1062.el7.x86_64
system: Linux
version: CentOS Linux 7.7.1908 Core
@bit-punk Thank you for reporting this issue; I am seeing the same thing.
Thanks.
@bit-punk Apologies for the delay on this one. I wasn't able to reproduce it. Can you provide the output from pip3 freeze? Thanks!
I have the same issue on my side; python3-croniter is already installed:
# dpkg -l | grep croniter
ii python3-croniter 0.3.29-2ubuntu1
# pip3 install croniter
Requirement already satisfied: croniter in /usr/lib/python3/dist-packages (0.3.29)
And I get the same error:
salt-minion[728]: [ERROR ] Missing python-croniter. Ignoring job os_update.
Any update?
Same here, but on Ubuntu 22.04
Jan 28 12:48:56 host salt-minion[56675]: [ERROR ] Missing python-croniter. Ignoring job highstate_conformity.
It was the same for me too. In my case I fixed it by restarting the salt-minion service after installing croniter with pip.
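In other words, something like the following, assuming the minion runs under the system python3 (the restart is what makes the scheduler re-check for croniter):

pip3 install croniter
systemctl restart salt-minion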
According to this documentation, croniter must be installed on the minion, so I'm providing data from the minion's perspective. However, the master looks very similar, and I can provide that info if needed.
sudo salt-call pip.list
local:
    ----------
    ...
    contextvars:
        2.4
    croniter:
        1.3.15
    cryptography:
        39.0.2
    ...
sudo salt-call --versions-report
Salt Version:
Salt: 3006.1
Python Version:
Python: 3.10.11 (main, May 5 2023, 02:31:54) [GCC 11.2.0]
Dependency Versions:
cffi: 1.14.6
cherrypy: 18.6.1
dateutil: 2.8.1
docker-py: Not Installed
gitdb: Not Installed
gitpython: Not Installed
Jinja2: 3.1.2
libgit2: Not Installed
looseversion: 1.0.2
M2Crypto: Not Installed
Mako: Not Installed
msgpack: 1.0.2
msgpack-pure: Not Installed
mysql-python: Not Installed
packaging: 22.0
pycparser: 2.21
pycrypto: Not Installed
pycryptodome: 3.9.8
pygit2: Not Installed
python-gnupg: 0.4.8
PyYAML: 5.4.1
PyZMQ: 23.2.0
relenv: 0.12.3
smmap: Not Installed
timelib: 0.2.4
Tornado: 4.5.3
ZMQ: 4.3.4
System Versions:
dist: amzn 2
locale: utf-8
machine: x86_64
release: 4.14.314-237.533.amzn2.x86_64
system: Linux
version: Amazon Linux 2
Croniter is installed with this state:
include:
  - apps.pip3

croniter_install:
  pip.installed:
    - name: croniter
    - require:
      - pkg: pip3_install
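A quick way to see which environment that state actually populated, since with 3006.x onedir the minion imports from Salt's bundled Python rather than the system one (a diagnostic sketch using commands already shown above):

# System interpreter: where yum/apt/pip3 put croniter
pip3 show croniter
# Salt's bundled (onedir) environment: what the minion itself can import
sudo salt-call pip.list | grep -i croniter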
The company I work for uses salt in all our environments. We have been experiencing issues with schedules for over a year, with every salt version up to 3006.x, and have tried many times to get salt to detect croniter. I was holding out, waiting for onedir to be fully rolled out, as I was hoping the bundled environment would solve the issues.
The info above comes from a fresh install on a new VM. We have even tried installing croniter using the package manager, pip, and yum. This is happening across our entire infrastructure on every VM.
I prefer not installing duplicate packages on the system with a mixture of pip, yum/apt, and in the salt onedir virtualenv. This makes a mess, causes package conflicts, and is a nightmare to troubleshoot when something goes wrong. All packages should be installed via pip, in onedir, period, full stop.
We have corporate policies that every host must be rebuilt, from the ground up, every 30 days. So, the VM above, which is an application box (minion), and the salt-master, are all brand new VMs with a fresh install of salt, croniter, etc.
It should also be noted that if you install pip packages on the master with sudo salt-pip install foo, the package will be installed as root, not as the salt user. You will get permission issues all over, on all dependencies for that package. You must then fix permissions manually or salt will trip over itself. This can be avoided by modifying the command slightly: sudo -u salt salt-pip install foo.
From the minion, the command sudo salt-call pip.install foo works fine, and the proper permissions are used for the salt user when the package gets installed.
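Putting those two together, the working pattern described here is roughly:

# On the master: install into Salt's onedir environment as the salt user
sudo -u salt salt-pip install croniter
# On a minion: salt-call uses the onedir pip and the correct ownership
sudo salt-call pip.install croniter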
We recently upgraded from 3005.1 to 3006.1. By the way, the milestone for 3006.1 said TBD, and the repo pulled 3005 out from under us. We were not able to properly plan a migration, since the milestone never said anything but TBD on the time frame. We are past all that, but surprises like that don't help us justify our reasons to continue using saltstack.
We also had issues with the bootstrap install script. It doesn't seem to be consistent: despite supplying a version, like bootstrap.sh stable 3005.1, it seems to do whatever it wants. Some boxes gave us 3005.1, and some installed 3006.1, regardless of the version supplied. I had to use a manual install method to obtain consistency with version pinning. Just fyi.
Until we get traction on this and the scheduler becomes less flaky, we have decided to use the anti-pattern of setting up cron jobs on each host. I should also note that, due to the issues we've had with salt, management is having us consider using another product.
Have there been any updates on croniter not being installed, or being improperly installed, on salt-minions? Or is there perhaps an alternative method for executing salt states on minions, apart from setting up native (per-minion) cron solutions?
croniter appears to be spelled wrong in the import: https://github.com/saltstack/salt/blob/e4c1da4323f6334c0a14e9a68392a0c12a37e117/salt/utils/schedule.py#L60 - changing it to cronitor fixed it for me.
@wasabi222 I think the problem is not the spelling, see the following:
https://pypi.org/project/croniter/
https://pypi.org/project/cronitor/
The former is more applicable to schedules.
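For context, the linked import in salt/utils/schedule.py is a guarded import: if it fails, a flag is cleared and the scheduler logs the "Missing python-croniter" error instead of crashing. Roughly (an illustrative sketch, not the exact source; the flag name is assumed):

try:
    import croniter
    HAS_CRONITER = True
except ImportError:
    HAS_CRONITER = False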
The correct fix is to install croniter with salt-pip. There is an error logged when using the scheduler and attempting to use croniter functionality, but I will discuss with the core team about getting croniter added to the packaged install by default when packaging Salt; it is already used when testing in Salt's CI environment.
@bit-punk Closing this as the associated PR is merged into the branch and should appear in the next release of Salt.