
Scripts with systemcommand do not work properly

chille opened this issue 4 years ago · 3 comments

Note: This pull request is not yet ready for merge.

Each time a systemcommand in a script is processed, a new QProcess is allocated. This object is never deleted. Besides leaking memory, this also means that, after a while, the "Max open files" limit is reached on Linux systems. Each executed process seems to open five files (pipes) used for communication between Qt and the executed task. The default limit in Ubuntu 20.04 seems to be 1024 open files (it can be checked with cat /proc/`pidof qlcplus`/limits and changed with prlimit -n4096 -p `pidof qlcplus`). At roughly five descriptors per execution, about 200 systemcommand executions will exhaust the limit and further executions will start to fail.

This bug (and the fix) can be verified by checking the number of open file descriptors for the QLC+ process: ls -1 /proc/`pidof qlcplus`/fd | wc -l
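As a quick sanity check, the descriptor count and the limit can be inspected together. A sketch, using the current shell's PID as a stand-in for `pidof qlcplus` so it can be tried anywhere:

```shell
# Stand-in PID: substitute $(pidof qlcplus) for $$ to inspect QLC+ itself.
pid=$$
echo "open fds: $(ls -1 /proc/$pid/fd | wc -l)"
grep 'Max open files' /proc/$pid/limits
```

Running the script repeatedly while a QLC+ script triggers systemcommands should show the count climbing by about five per execution.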

This bug could be patched in three different ways:

1) The blocking way

This is what I have done for now. It blocks the QLC+ engine until the command has finished, which also freezes the UI for up to two seconds. This is nothing more than a proof of concept.

QProcess *newProcess = new QProcess();
newProcess->start(programName, programArgs);
newProcess->waitForFinished(); // blocks the calling thread (default timeout 30 s)
delete newProcess;             // releases both the memory and the pipe descriptors

2) The non-blocking way - Qt event loop

In theory it should be possible to let the Qt event loop take care of destroying the QProcess as soon as it has finished. However, this does not work for some reason. My guess is that the Qt event loop is not inherited correctly, because somewhere a QObject is created without a parent. I have not yet been able to confirm whether this is the case.

I would assume that this is the preferred way of fixing this bug.

QProcess *newProcess = new QProcess(this);
newProcess->start(programName, programArgs);
newProcess->deleteLater(); // schedules deletion on the next event-loop pass,
                           // which may destroy the QProcess before the child exits
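For reference, the usual Qt idiom is to tie the deletion to the finished() signal instead of calling deleteLater() right after start(). A sketch only, assuming the surrounding code lives in a QObject subclass as in scriptrunner.cpp:

```cpp
// Sketch: the QProcess deletes itself once the child has exited, so
// neither the memory nor the five pipe descriptors are leaked. This
// still depends on the object reaching a running Qt event loop.
QProcess *newProcess = new QProcess(this);
QObject::connect(newProcess,
                 QOverload<int, QProcess::ExitStatus>::of(&QProcess::finished),
                 newProcess, &QObject::deleteLater);
newProcess->start(programName, programArgs);
```

If this pattern also fails, it would support the theory above that the engine-side objects are not attached to a thread with a running event loop.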

3) The non-blocking way - Workaround

If number 2 is not feasible, another solution would be to make the engine do the cleanup, or to create an object that destroys itself once the process has finished.

More things to consider

  • scriptrunner.cpp needs to be updated accordingly
  • Maybe it would be a good idea to let the user choose whether a systemcommand should be blocking or not?
  • The process is executed with QProcess::start(), maybe it is better to use QProcess::startDetached() ?

chille · Feb 09 '21 17:02

Another quick and dirty way to fix the "Max open files" problem is to use QProcess::startDetached() instead of QProcess::start(). This will solve the problem with the stuck pipes, but the memory leak will still be there.

chille · Feb 09 '21 17:02

Quick hint about draft-PRs when they're not yet ready to be merged. 😉

Lichtschalter-5000 · Feb 09 '21 21:02

> Quick hint about draft-PRs when they're not yet ready to be merged. 😉

Didn't know about that feature! I have converted it to a draft now. Thanks!

chille · Feb 10 '21 01:02

Fixed by #1358

mcallegari · Aug 25 '22 19:08