Archiving Maximum Memory Incorrect
When archiving:
INFO [2020-12-30 22:00:01] 15 Downloading https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-City&suffix=tar.gz&license_key=<REDACTED> to /var/www/html/tmp/latest/GeoIP2-City.mmdb.tar.gz.download.
INFO [2020-12-30 22:00:05] 15 GeoIP2AutoUpdater: successfully downloaded 'https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-City&suffix=tar.gz&license_key=<REDACTED>'
ERROR [2020-12-30 22:00:06] 15 Fatal error encountered: /var/www/html/vendor/pear/archive_tar/Archive/Tar.php(1903): Allowed memory size of 134217728 bytes exhausted (tried to allocate 65011744 bytes)
on /var/www/html/vendor/pear/archive_tar/Archive/Tar.php(1903)
Matomo encountered an error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 65011744 bytes) (which lead to: Error: array (
'type' => 1,
'message' => 'Allowed memory size of 134217728 bytes exhausted (tried to allocate 65011744 bytes)',
'file' => '/var/www/html/vendor/pear/archive_tar/Archive/Tar.php',
'line' => 1903,
'backtrace' => ' on /var/www/html/vendor/pear/archive_tar/Archive/Tar.php(1903)
',
))
So it's limiting it to 128 MiB, but that doesn't make sense to me since in config/global.ini.php we have:
; Minimum advised memory limit in Mb in php.ini file (see memory_limit value)
; Set to "-1" to always use the configured memory_limit value in php.ini file.
minimum_memory_limit = 128
; Minimum memory limit in Mb enforced when archived via ./console core:archive
; Set to "-1" to always use the configured memory_limit value in php.ini file.
minimum_memory_limit_when_archiving = 768
I do not have anything related to memory in config/config.ini.php. So it should be limiting it to 768 MiB in this scenario, right?
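(For reference: if I did want to override it locally, my understanding is that the override would go under [General] in config/config.ini.php, something like the sketch below, with whatever value makes sense. But I haven't set anything like this, so the defaults above should apply.)
[General]
; sketch only: raise the limit enforced while ./console core:archive runs
minimum_memory_limit_when_archiving = 1024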
My cron command:
/usr/bin/docker exec --user www-data matomo-01 /usr/local/bin/php /var/www/html/console core:archive --url=https://<REDACTED>
Just a minor comment that might explain this: the error doesn't occur during archiving itself, but rather while the scheduled tasks run after archiving, so maybe that memory limit doesn't apply there.
Ah, that would make sense. It seems reasonable to apply the limit there too, since the download is that big? Or is this some other compounding issue? Would that be a pull request to this repo or to the main Matomo repo?
Is there a quick fix or workaround I could implement in my install? This causes a cron error every so often (whenever it updates the geo database).
The Apache container doesn't appear to have a php.ini file for the CLI:
# find / -name \*.ini
/usr/local/etc/php/conf.d/php-matomo.ini
/usr/local/etc/php/conf.d/opcache-recommended.ini
/usr/local/etc/php/conf.d/docker-php-ext-apcu.ini
/usr/local/etc/php/conf.d/docker-php-ext-gd.ini
/usr/local/etc/php/conf.d/docker-php-ext-ldap.ini
/usr/local/etc/php/conf.d/docker-php-ext-pdo_mysql.ini
/usr/local/etc/php/conf.d/docker-php-ext-zip.ini
/usr/local/etc/php/conf.d/docker-php-ext-opcache.ini
/usr/local/etc/php/conf.d/docker-php-ext-redis.ini
/usr/local/etc/php/conf.d/docker-php-ext-mysqli.ini
/usr/local/etc/php/conf.d/docker-php-ext-sodium.ini
root@92985cc5978e:/#
You could try setting it explicitly on the command line:
MAILTO="youremail@example.com"
5 * * * * www-data /usr/local/bin/php -d memory_limit=768M /var/www/html/console core:archive --url=http://example.org/matomo/ > /home/example/matomo-archive.log
See here for more info: https://matomo.org/docs/setup-auto-archiving/#linux-unix-how-to-set-up-a-crontab-to-automatically-archive-the-reports
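Adapted to your docker exec wrapper, I'd assume something along these lines (untested sketch based on your command above):
/usr/bin/docker exec --user www-data matomo-01 /usr/local/bin/php -d memory_limit=768M /var/www/html/console core:archive --url=https://<REDACTED>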
Sorry for the late response, but that definitely did the trick! May I recommend this be added to the documentation somewhere? I originally saw that page but was confused because of the issue you mentioned (no php.ini file for the CLI). Or, maybe a php.ini could be added to the container.
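Something like this Dockerfile sketch is what I had in mind (untested; the file name and the 768M value are just examples, and it assumes the image keeps scanning /usr/local/etc/php/conf.d/ as shown in the find output above):
FROM matomo:apache
# drop an extra ini into the conf.d directory the image already scans
RUN echo 'memory_limit = 768M' > /usr/local/etc/php/conf.d/memory-limit.ini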