Restart worker after a given number of tasks
I understand that the worker implementation is fairly basic at the moment. When using it to produce PDF files, I have noticed that memory usage creeps up over time and is never released.
When Django is running under Gunicorn, this can be mitigated by periodically restarting the workers with `--max-requests` and `--max-requests-jitter`.
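For example (with `myproject.wsgi` as a placeholder for the actual WSGI module):

```sh
# Recycle each Gunicorn worker after ~1000 requests, with jitter so the
# workers don't all restart at the same time.
gunicorn myproject.wsgi --max-requests 1000 --max-requests-jitter 50
```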
Are there any plans to add something like this, or are there other solutions? To deal with it right now, I am running these heavy tasks as separate subprocesses to keep them isolated and free the memory as soon as possible after they finish (sketch below).
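Roughly, the workaround looks like this; `generate_pdf` stands in for the real task body, and the key point is that the child process exits after each task, so its memory goes back to the OS:

```python
import multiprocessing


def generate_pdf(source_path, output_path):
    # Placeholder for the real, memory-hungry PDF rendering.
    ...


def run_isolated(source_path, output_path):
    # Run the heavy task in a short-lived child process; whatever memory
    # it allocates is released when the child exits.
    ctx = multiprocessing.get_context("spawn")
    proc = ctx.Process(target=generate_pdf, args=(source_path, output_path))
    proc.start()
    proc.join()
    if proc.exitcode != 0:
        raise RuntimeError(f"PDF generation failed (exit code {proc.exitcode})")
```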
I think tying this directly to memory usage is unlikely, but a feature to limit the number of tasks could be useful. There's `--batch`, which implements part of this functionality, but something akin to `--max-tasks` might be worth adding.
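In the meantime, assuming `--batch` makes the worker exit once the queue is drained, a supervising loop can approximate `--max-tasks` from the outside:

```sh
# Start a fresh worker process for each batch so memory never
# accumulates across runs; the sleep avoids busy-looping on an
# empty queue.
while true; do
    python manage.py db_worker --batch
    sleep 5
done
```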
I think measuring memory usage should be handled by the outside infrastructure (i.e. whatever is running `db_worker`).
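For example, a process supervisor such as systemd can enforce a memory ceiling and restart the worker when it is exceeded (paths here are hypothetical):

```ini
[Service]
WorkingDirectory=/srv/myproject
ExecStart=/srv/myproject/.venv/bin/python manage.py db_worker
# Kill the worker if it grows past the ceiling, then start a fresh one.
MemoryMax=512M
Restart=always
```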