Wrong robots.txt file generated
In certain cases, the robots.txt files regenerated every 7 days do not match what was saved via the robotstxt module's settings. The issue is that the settings are stored in the Redis database, while the curl call that regenerates the file in daily.sh specifies "noredis=1". As a result, an old setting from the Drupal SQL database cache is used instead.
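To make the mechanism concrete, here is a minimal sketch of the kind of call involved. This is not the literal line from daily.sh: the URL, variables, and output path are placeholders. The relevant detail is only the noredis=1 query parameter, which makes the request bypass the Redis cache backend and read from Drupal's SQL cache tables instead.

```sh
# Hedged sketch, not the actual daily.sh line; ${Dom} and ${RobotsTxtPath}
# are placeholders. The noredis=1 parameter bypasses the Redis cache backend,
# so the response can come from a stale copy in the SQL cache tables.
curl -s "https://${Dom}/robots.txt?noredis=1" -o "${RobotsTxtPath}"
```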
Related issue: https://www.drupal.org/project/redis/issues/3177375
Hmm... not sure how the linked issue is related, given we purge cache tables hourly, or at least daily.
Furthermore, no settings should ever be stored in cache tables, so I'm not sure why noredis would cause problems.
Which version of the module have you checked to confirm it's saving its settings in cache?
"... given we purge cache tables hourly.. or at least daily." I wasn't aware of that.
So indeed, if the database caches are cleared, it's fine. But for that, the SQL database installed locally by Barracuda must be used. My issue comes from the fact that I'm using an external database server, on which mysql_cleanup.sh doesn't seem to have any effect.
I "fixed" it by clearing manually the cache tables. Removing "noredis" param is also fine because the cached version in Redis is up to date.