
Is there a way to run this in Docker daemon mode?

Open · trulow opened this issue 2 years ago · 1 comment

Is there a way to run the Docker version in daemon mode once the script is configured properly?

I've tried the following run command, but the script crashes on launch:

sudo docker run -d \
  --name=plex_debrid \
  --net=host \
  -v ~/.rclone_rd/config:/config \
  itstoggle/plex_debrid
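One possible workaround (an assumption, not confirmed for plex_debrid): the traceback below says the interpreter is already shutting down when the scraper tries to spawn threads, which often means the main loop exited immediately because stdin was closed. Running detached with `-d` alone closes stdin; combining it with `-it` keeps stdin open and allocates a pseudo-TTY while still detaching:

```shell
# Sketch of a workaround, assuming the crash is caused by stdin closing
# when the container runs detached: -i keeps stdin open, -t allocates a
# pseudo-TTY, and -d still runs the container in the background.
sudo docker run -d -it \
  --name=plex_debrid \
  --net=host \
  -v ~/.rclone_rd/config:/config \
  itstoggle/plex_debrid
```

If that holds, `docker attach plex_debrid` reattaches to the running script later (detach again with Ctrl-p Ctrl-q rather than Ctrl-c, which would stop it).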

Logs

[11/30/23 21:58:53] checking new content ... done
[11/30/23 21:58:53] scraping sources [torrentio,jackett] for query "REDACTED_TITLE" ... done
[11/30/23 21:58:53] error starting new thread (perhaps maximum number of threads reached), will retry in 5 seconds and exit if it fails again.
Exception in thread Thread-1 (threaded):
Traceback (most recent call last):
  File "/scraper/__init__.py", line 26, in scrape
    t.start()
  File "/usr/local/lib/python3.12/threading.py", line 971, in start
    _start_new_thread(self._bootstrap, ())
RuntimeError: can't create new thread at interpreter shutdown

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.12/threading.py", line 1052, in _bootstrap_inner
    self.run()
  File "/usr/local/lib/python3.12/threading.py", line 989, in run
    self._target(*self._args, **self._kwargs)
  File "/ui/__init__.py", line 474, in threaded
    element.download(library=library)
  File "/content/classes.py", line 1201, in download
    self.Releases += scraper.scrape(self.query(title).replace(
  File "/scraper/__init__.py", line 30, in scrape
    t.start()
  File "/usr/local/lib/python3.12/threading.py", line 971, in start
    _start_new_thread(self._bootstrap, ())
RuntimeError: can't create new thread at interpreter shutdown

While a client is connected to container's stdio using docker attach, Docker uses a ~1MB memory buffer to maximize the throughput of the application. Once this buffer is full, the speed of the API connection is affected, and so this impacts the output process' writing speed. This is similar to other applications like SSH. Because of this, it is not recommended to run performance critical applications that generate a lot of output in the foreground over a slow client connection. Instead, users should use the docker logs command to get access to the logs.
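Following the recommendation quoted above, the container's output can be read without attaching to its stdio at all:

```shell
# Read the detached container's output without docker attach;
# -f follows the stream, --tail limits the initial backlog.
sudo docker logs -f --tail 100 plex_debrid
```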

Source

trulow · Nov 30 '23 19:11

@trulow were you able to figure this out?

utkarshsethi · May 10 '24 23:05