
Collabora Online - Built-in CODE Server - log size

amg-web opened this issue 2 years ago • 16 comments

We are using Collabora Online - Built-in CODE Server. After 2 months without a restart, the log file became very big (~2.8 GB) in the /tmp partition, using a large part of the space; some log files also remain after restarts. We need a way to configure log rotation and removal to reduce the log size.
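
As an interim workaround until rotation is handled by the app itself, a logrotate rule can cap the file size. A minimal sketch, assuming the Built-in CODE server writes a log matching /tmp/coolwsd*.log (that path is an assumption; adjust it to wherever your log actually lives):

```
# /etc/logrotate.d/coolwsd -- illustrative only; the log path below is an assumption
/tmp/coolwsd*.log {
    size 100M       # rotate once the file exceeds 100 MB
    rotate 3        # keep at most three rotated copies
    compress        # gzip rotated copies
    missingok       # don't error if no matching file exists
    notifempty      # skip empty files
    copytruncate    # truncate in place so the running server keeps its file handle
}
```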

amg-web • Jun 18 '22 08:06

I didn't realize that before, but we now also have this issue running on NC 29.0.1, with the /tmp/coolwsd.*/jails directory at around 54G. The uptime is 34 days and we are using Collabora / Nextcloud Office only occasionally.

mokkin • Jun 03 '24 09:06

Same problem here. I just realized that /tmp/systemd-private-*-apache2.service-*/tmp/coolwsd.*/jails also reached 54GB. Uptime around 109 days (but apache2 was restarted several times). I just stopped apache2, deleted its /tmp/systemd-private-*-apache2.service-*/ directory and restarted it. Let's see if it comes back...

Configuration: Nextcloud 29.0.2, Collabora Online Development Edition 24.04.2.1 80a6f97, Apache/2.4.59 (Debian)
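
The manual cleanup described above amounts to roughly the following (a sketch only; the systemd PrivateTmp directory name contains a random ID, so the glob is an assumption and the path should be double-checked before deleting anything):

```bash
# Sketch of the manual cleanup described above -- verify the paths on your system first.
systemctl stop apache2
# The systemd-private directory name is randomized; this glob is an assumption.
rm -rf /tmp/systemd-private-*-apache2.service-*
systemctl start apache2
```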

red3333 • Jun 13 '24 12:06

Same here, right after updating to Nextcloud v29.0.2 from Nextcloud 28. It filled up my /tmp disk. I removed part of the offending files from /tmp, but the directory is still growing at an alarming rate. How to fix?

Cris70 • Jun 17 '24 10:06

I have the same problem. It's growing really fast.

djonik1562 • Jul 31 '24 18:07

same problem here ==>

https://github.com/CollaboraOnline/richdocumentscode/issues/176

/tmp/coolwsd.*/jails is growing very fast - 100 GB in 24h!

Without monitoring the system, this would lead to a severe server crash.

Githopp192 • Aug 04 '24 12:08

Yeap, I can confirm this is an issue with Nextcloud 29.0.4 and Collabora Online 24.4.502. The server crashed this morning and I had to hard-reboot it. After 5 hours, it's already at 17 GB (output from du: 17G ./coolwsd.*/jails).

epidemiaf1 • Aug 08 '24 22:08

Hello,

Same here, after updating to Nextcloud 29 from 28 on a docker-compose stack.

collabora/code:latest and nextcloud:stable from Docker Hub.

bastien30 • Aug 09 '24 09:08

Same here, right after updating to Nextcloud v29.0.2 from Nextcloud 28. It filled up my /tmp disk. I removed part of the offending files from /tmp, but the directory is still growing at an alarming rate. How to fix?

@Cris70 - what I did as a workaround: I developed a script which will:

  • check /tmp/php-fpm every 15 minutes

  • compare the usage against a critical threshold and a very critical threshold defined for /tmp

  • if the critical threshold is reached, check whether this is within business hours; if not, then:

    • set cloud maintenance mode on, wait 5 minutes
    • restart apache, php-fpm and redis - this clears the *coolwsd* stuff
    • set cloud maintenance mode off

  • if the very critical threshold is reached, force the same sequence immediately:

    • set cloud maintenance mode on, wait 5 minutes
    • restart apache, php-fpm and redis - this clears the *coolwsd* stuff
    • set cloud maintenance mode off
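
The script itself wasn't posted; a minimal sketch of the idea might look like the following, assuming a systemd-based setup with apache2, php8.2-fpm and redis-server units, a Nextcloud install at /var/www/nextcloud, and thresholds of 50/80 GB (all of these names and numbers are assumptions to adapt):

```bash
#!/bin/bash
# Sketch of the threshold-based workaround described above (illustrative only).
# Unit names, paths and thresholds are assumptions -- adjust them to your setup.
TMP_DIR=/tmp
CRITICAL_GB=50        # act only outside business hours above this
VERY_CRITICAL_GB=80   # act immediately above this
OCC="sudo -u www-data php /var/www/nextcloud/occ"

restart_stack() {
    $OCC maintenance:mode --on
    sleep 300                                            # wait 5 minutes
    systemctl restart apache2 php8.2-fpm redis-server    # clears the coolwsd jails in /tmp
    $OCC maintenance:mode --off
}

used_gb=$(du -s --block-size=1G "$TMP_DIR" | awk '{print $1}')
hour=$(date +%H)

if [ "$used_gb" -ge "$VERY_CRITICAL_GB" ]; then
    restart_stack
elif [ "$used_gb" -ge "$CRITICAL_GB" ] && { [ "$hour" -lt 8 ] || [ "$hour" -ge 18 ]; }; then
    restart_stack   # critical, but only outside the assumed business hours (08-18)
fi
```

Run from cron every 15 minutes, e.g. `*/15 * * * * root /usr/local/sbin/check-tmp.sh`.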

Githopp192 • Aug 09 '24 11:08

We do use Nextcloud Office but it isn't a critical part of the setup. I temporarily disabled it and the Collabora Online server within Nextcloud. I had to restart the apache and php8.2-fpm services as well, since the jail files kept growing regardless.

epidemiaf1 • Aug 09 '24 13:08

Same here after updating NC from 28.0.4 to 29.0.5. I've disabled the CODE Server and Nextcloud Office.

ulfkosack • Aug 16 '24 11:08

Same issue here. The /tmp/ folder filled up with coolwsd/jails stuff, over 700 GB. Restarting nginx, php-fpm and redis did not remove the tmp files; I had to delete the temp stuff manually and then restart the services. Kinda disappointed.

cableTh0rn • Aug 20 '24 07:08

Same issue here. The /tmp/ folder filled up with coolwsd/jails stuff, over 700 GB. Restarting nginx, php-fpm and redis did not remove the tmp files; I had to delete the temp stuff manually and then restart the services. Kinda disappointed.

==> Since I disabled the maintenance-specific run ('maintenance_window_start' => 100, i.e. disabling it), the issue did not re-occur!?

(issue = long-running cron & increasing /tmp/php stuff)

https://github.com/nextcloud/server/issues/47132
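
For reference, that setting lives in Nextcloud's config/config.php; 100 is outside the 0-23 hour range, which is what disables the maintenance window here (excerpt is illustrative):

```php
<?php
// config/config.php (excerpt, illustrative)
$CONFIG = array (
  // ... other settings ...
  // A value outside 0-23 (here 100) effectively disables the daily
  // maintenance window for background jobs.
  'maintenance_window_start' => 100,
);
```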

Githopp192 • Aug 20 '24 07:08

Same issue here. The /tmp/ folder filled up with coolwsd/jails stuff, over 700 GB. Restarting nginx, php-fpm and redis did not remove the tmp files; I had to delete the temp stuff manually and then restart the services. Kinda disappointed.

==> Since I disabled the maintenance-specific run ('maintenance_window_start' => 100, i.e. disabling it), the issue did not re-occur!?

(issue = long-running cron & increasing /tmp/php stuff)

nextcloud/server#47132

I ran a check script to log when the tmp files get bigger. It is exactly the maintenance_window time: 100 GB in just one hour.
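
The check script itself isn't shown; a loop along these lines (the log file name and the 10-second interval are assumptions inferred from the timestamps) would produce output like the log below:

```bash
#!/bin/bash
# Log the size of /tmp every 10 seconds (sketch; log file and interval are assumptions).
while true; do
    echo "$(date) --> $(du -sh /tmp/ | awk '{print $1}') /tmp/" >> /var/log/tmp-size.log
    sleep 10
done
```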

```
Wed Aug 21 04:38:42 +03 2024 --> 1.9G /tmp/
Wed Aug 21 04:38:52 +03 2024 --> 1.9G /tmp/
Wed Aug 21 04:39:02 +03 2024 --> 1.9G /tmp/
Wed Aug 21 04:39:12 +03 2024 --> 1.9G /tmp/
Wed Aug 21 04:39:22 +03 2024 --> 1.9G /tmp/
Wed Aug 21 04:39:32 +03 2024 --> 1.9G /tmp/
Wed Aug 21 04:39:42 +03 2024 --> 1.9G /tmp/
Wed Aug 21 04:39:52 +03 2024 --> 1.9G /tmp/
Wed Aug 21 04:40:02 +03 2024 --> 1.9G /tmp/
Wed Aug 21 04:40:12 +03 2024 --> 2.3G /tmp/
Wed Aug 21 04:40:22 +03 2024 --> 2.9G /tmp/
Wed Aug 21 04:40:33 +03 2024 --> 3.5G /tmp/
Wed Aug 21 04:40:43 +03 2024 --> 3.7G /tmp/
Wed Aug 21 04:40:53 +03 2024 --> 3.7G /tmp/
Wed Aug 21 04:41:03 +03 2024 --> 3.7G /tmp/
Wed Aug 21 04:41:13 +03 2024 --> 3.7G /tmp/
Wed Aug 21 04:41:23 +03 2024 --> 3.8G /tmp/
Wed Aug 21 04:41:33 +03 2024 --> 3.7G /tmp/
Wed Aug 21 04:41:43 +03 2024 --> 3.7G /tmp/
Wed Aug 21 04:41:53 +03 2024 --> 3.7G /tmp/
Wed Aug 21 04:42:03 +03 2024 --> 3.7G /tmp/
Wed Aug 21 04:42:13 +03 2024 --> 3.8G /tmp/
Wed Aug 21 04:42:23 +03 2024 --> 3.8G /tmp/
Wed Aug 21 04:42:33 +03 2024 --> 4.1G /tmp/
Wed Aug 21 04:42:43 +03 2024 --> 4.5G /tmp/
Wed Aug 21 04:42:54 +03 2024 --> 4.8G /tmp/
Wed Aug 21 04:43:04 +03 2024 --> 5.8G /tmp/
Wed Aug 21 04:43:14 +03 2024 --> 7.2G /tmp/
Wed Aug 21 04:43:24 +03 2024 --> 8.0G /tmp/
Wed Aug 21 04:43:34 +03 2024 --> 9.2G /tmp/
...
Wed Aug 21 04:43:34 +03 2024 --> 9.2G /tmp/
Wed Aug 21 04:43:44 +03 2024 --> 11G /tmp/
Wed Aug 21 04:43:54 +03 2024 --> 12G /tmp/
Wed Aug 21 04:44:04 +03 2024 --> 13G /tmp/
Wed Aug 21 04:44:15 +03 2024 --> 14G /tmp/
Wed Aug 21 04:44:25 +03 2024 --> 15G /tmp/
Wed Aug 21 04:44:35 +03 2024 --> 16G /tmp/
Wed Aug 21 04:44:45 +03 2024 --> 17G /tmp/
Wed Aug 21 04:44:55 +03 2024 --> 18G /tmp/
Wed Aug 21 04:45:05 +03 2024 --> 19G /tmp/
Wed Aug 21 04:45:15 +03 2024 --> 20G /tmp/
Wed Aug 21 04:45:25 +03 2024 --> 21G /tmp/
Wed Aug 21 04:45:35 +03 2024 --> 21G /tmp/
Wed Aug 21 04:45:45 +03 2024 --> 22G /tmp/
Wed Aug 21 04:45:55 +03 2024 --> 23G /tmp/
Wed Aug 21 04:46:06 +03 2024 --> 25G /tmp/
Wed Aug 21 04:46:16 +03 2024 --> 26G /tmp/
Wed Aug 21 04:46:26 +03 2024 --> 27G /tmp/
Wed Aug 21 04:46:36 +03 2024 --> 28G /tmp/
Wed Aug 21 04:46:46 +03 2024 --> 28G /tmp/
Wed Aug 21 04:46:56 +03 2024 --> 30G /tmp/
Wed Aug 21 04:47:06 +03 2024 --> 31G /tmp/
Wed Aug 21 04:47:16 +03 2024 --> 32G /tmp/
Wed Aug 21 04:47:26 +03 2024 --> 33G /tmp/
Wed Aug 21 04:47:36 +03 2024 --> 34G /tmp/
Wed Aug 21 04:47:47 +03 2024 --> 35G /tmp/
Wed Aug 21 04:47:57 +03 2024 --> 36G /tmp/
Wed Aug 21 04:48:07 +03 2024 --> 36G /tmp/
Wed Aug 21 04:48:17 +03 2024 --> 37G /tmp/
...
Wed Aug 21 04:58:24 +03 2024 --> 97G /tmp/
Wed Aug 21 04:58:34 +03 2024 --> 98G /tmp/
Wed Aug 21 04:58:45 +03 2024 --> 99G /tmp/
Wed Aug 21 04:58:55 +03 2024 --> 99G /tmp/
Wed Aug 21 04:59:05 +03 2024 --> 99G /tmp/
Wed Aug 21 04:59:15 +03 2024 --> 99G /tmp/
Wed Aug 21 04:59:25 +03 2024 --> 99G /tmp/
Wed Aug 21 04:59:35 +03 2024 --> 99G /tmp/
Wed Aug 21 04:59:45 +03 2024 --> 99G /tmp/
Wed Aug 21 04:59:55 +03 2024 --> 99G /tmp/
Wed Aug 21 05:00:06 +03 2024 --> 99G /tmp/
Wed Aug 21 05:00:16 +03 2024 --> 99G /tmp/
```

cableTh0rn • Aug 21 '24 06:08

Same here, after updating to Nextcloud 29 from 28 on a docker-compose stack. collabora/code:latest and nextcloud:stable from Docker Hub.

@bastien30

If you're experiencing this in collabora/code:latest, then this isn't a richdocumentscode matter (which might be a very useful clue actually; at least if your situation has the same underlying cause as other reporters here).

joshtrichards • Aug 21 '24 13:08

From the looks of it, the logging in Built-in / richdocumentscode uses the same default as for CODE in general: warning level.

https://sdk.collaboraonline.com/docs/installation/Configuration.html?highlight=logging#logging

Since Built-in is mostly for testing and personal use, log rotation was likely not an original consideration.
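
For anyone who wants to quieten it in the meantime, the log level (and optional file logging) is set in the logging section of coolwsd.xml; the excerpt below is illustrative, so check the element names against the coolwsd.xml shipped with your installation:

```xml
<!-- coolwsd.xml (excerpt, illustrative) -->
<logging>
    <!-- none | fatal | critical | error | warning | notice | information | debug | trace -->
    <level>error</level>
    <file enable="false">
        <property name="path">/var/log/coolwsd.log</property>
    </file>
</logging>
```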

Some of these reports here suggest very rapid log file growth. It would be helpful if one of you reporting this can inventory what precisely is showing up in the logs that is generating so much usage. Is it an unusual error/warning situation? Is it specific to certain environments? Do you have many users? etc.

joshtrichards • Aug 21 '24 13:08

Same here, after updating to Nextcloud 29 from 28 on a docker-compose stack. collabora/code:latest and nextcloud:stable from Docker Hub.

@bastien30

If you're experiencing this in collabora/code:latest, then this isn't a richdocumentscode matter (which might be a very useful clue actually; at least if your situation has the same underlying cause as other reporters here).

Thanks, it seems the problem is gone for me after uninstalling/reinstalling the CODE application from within the Nextcloud web interface.

bastien30 • Aug 21 '24 13:08