Paperless and Synology, or even on a Pi
Hi
as I own a Synology NAS which supports the Docker app out of the box, I tried to get Paperless running on it. Unfortunately the container just starts and stops again, nothing more. I also tried it manually over SSH with docker-compose, but without success. Could it be incompatible? Has anyone perhaps had success with such an install?
On another note, I'm thinking about reconfiguring my Raspberry Pi and trying it there, but will it work, given that it is ARM hardware?
Thanks in advance for any feedback.
Here is at least the error I get when running docker-compose manually:
Top level object in docker-compose.yml needs to be an object not '<type 'str'>'
That sounds like your docker-compose file is misconfigured and can't be read by docker-compose. It should look something like this.
version: '3'
services:
# ...
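If in doubt, docker-compose can parse the file and print it back before you start anything, which is a quick way to spot malformed YAML (assuming your docker-compose version already ships the config subcommand):
# Run next to docker-compose.yml: prints the resolved configuration,
# or fails with a parse error if the YAML is malformed
docker-compose config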
Maybe you're using the wrong docker version? I'm not a docker guru myself, so I'm not sure about that. However, I have paperless running via docker on a Raspberry Pi 3, using the OpenMediaVault OS. The main thing you have to change is the base image the container is built from. In ./Dockerfile, change the first line from
FROM alpine:3.8
to
FROM arm32v6/alpine
That should be it.
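For completeness, a rough sketch of the rebuild step on the Pi after editing the Dockerfile (assuming you work from a clone of the paperless repository):
# From the root of the cloned paperless repository, rebuild the image
# against the ARM base image you just set in ./Dockerfile
docker build -t paperless .
# or, if you use the bundled compose setup:
docker-compose build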
Did you try to run paperless directly from the Synology Docker UI? Personally I do, and it's working well :-) on a DS918+.
For a Pi you should probably build an ARM version, as @khrise says :-)
What settings/installation process did you use to set it up on your Synology, @sbrunner? I get the same error as the OP (the container just starts and stops). Thanks.
I'd love to see a more detailed guide to the setup on Synology as well! I haven't been able to work it out.
These ones:
- https://github.com/sbrunner/scan-to-paperless/blob/master/danielquinn-paperless-webserver.syno.json
- https://github.com/sbrunner/scan-to-paperless/blob/master/danielquinn-paperless-consumer.syno.json
Thanks very much for the configs @sbrunner! Would you be able to elaborate a little bit on how to use them? I'm a complete beginner when it comes to Docker.
What I have found out so far:
- the config structure matches what you get when you export the Paperless config from Docker, as below
- surely there is a better way of changing it than importing it again, right?
- the reason that the default image does not do anything is that it just runs "cmd" : "--help", and you can see that in the logs as well. It just prints whatever manage.py --help shows.
Now I'm wondering: are you running two Paperless docker images, one for the consumer and one for the webserver? And how can you update the config files on the fly? I had a look through SSH but couldn't find anything.
Thanks again!
- In the volume_bindings you should change them to folders available on your NAS (see the sketch after this list).
- To import them: on your screenshot you are at the right place, just select Import in the menu.
- Yes, I use two different docker containers (on the same image, to use the Docker terminology :-) ).
- Which config file? In my case the documents will be in the folder /data/sync/Documents Stéphane Laetitia/paperless/media and the database in the folder /data/sync/Documents Stéphane Laetitia/paperless/data, but you will probably change them...
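As a sketch, on a Synology that mostly means creating the host-side folders first and then pointing each volume_bindings entry at them; the share name and paths below are only an example, not taken from the configs above:
# Example only: create the folders on the NAS that the containers will mount
# (in the Docker UI these then appear as /SmartHome/paperless/consume etc.)
mkdir -p /volume1/SmartHome/paperless/consume \
         /volume1/SmartHome/paperless/media \
         /volume1/SmartHome/paperless/data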
I'm trying to set this up with the Docker setup. Thank you @sbrunner for making this available in the UI! I've set up the right config (see my fork), but somehow the consumer is eating the documents (they get deleted), yet they don't show up in the webserver. Am I missing a step?
I'd be interested in the exact same thing. I did the same steps and only changed the mount paths. My containers start, but not a single document is read. Nothing happens at all.
I got this working.
Not sure what the problem was, but it had to do with the mounting setup or folder permissions or something like that. It was a while ago, though.
Here are my exported config files (with scrambled IDs); hope they help. Consumer:
{
"cap_add" : [],
"cap_drop" : [],
"cmd" : "document_consumer",
"cpu_priority" : 50,
"devices" : null,
"enable_publish_all_ports" : false,
"enable_restart_policy" : true,
"enabled" : true,
"entrypoint_default" : "/sbin/docker-entrypoint.sh",
"env_variables" : [
{
"key" : "PAPERLESS_TIME_ZONE",
"value" : "Europe/Amsterdam"
},
{
"key" : "PAPERLESS_FILENAME_DATE_ORDER",
"value" : "DMY"
},
{
"key" : "PATH",
"value" : "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
},
{
"key" : "PAPERLESS_EXPORT_DIR",
"value" : "/export"
},
{
"key" : "PAPERLESS_CONSUMPTION_DIR",
"value" : "/consume"
},
{
"key" : "PAPERLESS_CONVERT_MEMORY_LIMIT",
"value" : "32000000"
},
{
"key" : "USERMAP_UID",
"value" : "114670"
},
{
"key" : "USERMAP_GID",
"value" : "104555"
},
{
"key" : "PAPERLESS_OCR_LANGUAGES",
"value" : "nld eng"
},
{
"key" : "PAPERLESS_OCR_THREADS",
"value" : "4"
},
{
"key" : "PAPERLESS_FORGIVING_OCR",
"value" : "true"
}
],
"exporting" : false,
"id" : "<Some id>",
"image" : "danielquinn/paperless:latest",
"is_ddsm" : false,
"is_package" : false,
"links" : [],
"memory_limit" : 0,
"name" : "danielquinn-paperless-consumer",
"network" : [
{
"driver" : "bridge",
"name" : "bridge"
}
],
"network_mode" : "bridge",
"port_bindings" : [],
"privileged" : false,
"shortcut" : {
"enable_shortcut" : false
},
"ulimits" : null,
"use_host_network" : false,
"volume_bindings" : [
{
"host_volume_file" : "/SmartHome/paperless/consume",
"mount_point" : "/consume",
"type" : "rw"
},
{
"host_volume_file" : "/SmartHome/paperless/media",
"mount_point" : "/usr/src/paperless/media",
"type" : "rw"
},
{
"host_volume_file" : "/SmartHome/paperless/data",
"mount_point" : "/usr/src/paperless/data",
"type" : "rw"
}
],
"volumes_from" : null
}
Webserver:
{
"cap_add" : [],
"cap_drop" : [],
"cmd" : "runserver --insecure --noreload 0.0.0.0:8000",
"cpu_priority" : 50,
"devices" : null,
"enable_publish_all_ports" : false,
"enable_restart_policy" : true,
"enabled" : true,
"entrypoint_default" : "/sbin/docker-entrypoint.sh",
"env_variables" : [
{
"key" : "PAPERLESS_DEBUG",
"value" : "false"
},
{
"key" : "PAPERLESS_TIME_ZONE",
"value" : "Europe/Amsterdam"
},
{
"key" : "PAPERLESS_FILENAME_DATE_ORDER",
"value" : "DMY"
},
{
"key" : "PAPERLESS_ALLOWED_HOSTS",
"value" : "192.168.1.23,localhost"
},
{
"key" : "PATH",
"value" : "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
},
{
"key" : "PAPERLESS_EXPORT_DIR",
"value" : "/export"
},
{
"key" : "PAPERLESS_CONSUMPTION_DIR",
"value" : "/consume"
},
{
"key" : "USERMAP_UID",
"value" : "114670"
},
{
"key" : "USERMAP_GID",
"value" : "104555"
},
{
"key" : "PAPERLESS_DISABLE_LOGIN",
"value" : "false"
},
{
"key" : "PAPERLESS_LIST_PER_PAGE",
"value" : "100"
},
{
"key" : "PAPERLESS_INLINE_DOC",
"value" : "true"
}
],
"exporting" : false,
"id" : "<some id>",
"image" : "danielquinn/paperless:latest",
"is_ddsm" : false,
"is_package" : false,
"links" : [],
"memory_limit" : 0,
"name" : "danielquinn-paperless-webserver",
"network" : [
{
"driver" : "bridge",
"name" : "bridge"
}
],
"network_mode" : "bridge",
"port_bindings" : [
{
"container_port" : 8000,
"host_port" : 8000,
"type" : "tcp"
}
],
"privileged" : false,
"shortcut" : {
"enable_shortcut" : false
},
"ulimits" : null,
"use_host_network" : false,
"volume_bindings" : [
{
"host_volume_file" : "/SmartHome/paperless/data",
"mount_point" : "/usr/src/paperless/data",
"type" : "rw"
},
{
"host_volume_file" : "/SmartHome/paperless/media",
"mount_point" : "/usr/src/paperless/media",
"type" : "rw"
}
],
"volumes_from" : null
}
There were two gotchas I remember from setting it up on Synology Docker:
- All mounted directories have to be encrypted, or none of them; it did not work when, for example, the data directory was encrypted and the consumption directory wasn't. One issue I have seen with Docker is that it nukes the directory permissions when something is wrong here: when you look at the permissions tab under Control Panel -> Shared Folder -> Edit -> Permissions, nothing is allowed access any more after the container is started (a quick SSH check is sketched below).
- The command had to be changed in the container. By default it wasn't actually doing anything; from what I remember you will see that in the container log output.
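If you suspect Docker clobbered the permissions, a rough way to check and repair them over SSH could look like this; the share path is borrowed from the configs above, while the user and group are placeholders:
# Example: inspect who currently owns the mounted folders
ls -ld /volume1/SmartHome/paperless /volume1/SmartHome/paperless/consume
# Example: hand the tree back to your own NAS user so it becomes writable again
sudo chown -R youruser:users /volume1/SmartHome/paperless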
Thank you so much guys for your help, I finally got this working. This is what I needed to do in addition:
- Make sure that the webserver is actually executing runserver --insecure --noreload 0.0.0.0:8000, as shown above by @scubafly. At first my containers kept stopping immediately after I started them because I didn't define a command.
- Set USERMAP_UID and USERMAP_GID according to the user you want to use paperless with. Find those out with id <your-user-account>, e.g. id michael.
- After starting the webserver and making sure that it is running, I had to manually open another bash session (having selected the docker container, click on Detail, Terminal, Create; an SSH session would probably work just as well, of course) and set permissions for the consume directory: sudo chown -R paperless:paperless /consume and sudo chmod -R 777 /consume (a short recap follows after this post).
Then start the consumer container and drop some files into your mounted consume directory. I guess this is what @ddddavidmartin meant; I found everything else in this issue.
It's working like a charm now. Again, thank you for your help.
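For reference, a condensed recap of those last two steps (the user name is only an example; run the first command on the NAS itself, the other two inside the webserver container's terminal):
# On the NAS: look up the numeric IDs to put into USERMAP_UID / USERMAP_GID
id michael
# Inside the container terminal: let the paperless user write to the consume mount
sudo chown -R paperless:paperless /consume
sudo chmod -R 777 /consume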
I still have no clue what is going on with mine; the container just keeps restarting.
I've downloaded (in the Synology Docker GUI) the danielquinn/paperless container and started it with these folders/environment variables:
/docker/paperless/consume = /consume
/docker/paperless/data = /data
/docker/paperless/media = /media
/docker/paperless/data = /export
PAPERLESS_OCR_LANGUAGES = nld eng
I start it, but it keeps rebooting. I find troubleshooting a pain! Almost everything on the web is in config files or terminal commands, but I only use the normal Synology Docker GUI. I try to extract the paths and environment variables from the configs... but I get lost again when I see the consumer and webserver configs from @scubafly. There are two now?
Same problem here. On an Ubuntu-based Docker host it starts with no problem... on Synology it reboots without an error message...
@sbrunner - wondering if you can help me out. I'm trying to do the same as @scubafly and the Docker image keeps crashing. I tried copying your config files, but it doesn't seem to be working. Looking at the logs, it appears to be something to do with how Python is reading the configuration? This is all new to me, so maybe I'm just in over my head. I have a Synology DS412+. Any guidance you can give would be appreciated! This is just the consumer:
Here is the log:
date | stream | content |
---|---|---|
2020-06-01 02:13:02 | stdout | |
2020-06-01 02:13:02 | stdout | connections on Unix domain socket "/tmp/.s.PGSQL.5432"? |
2020-06-01 02:13:02 | stdout | Is the server running locally and accepting |
2020-06-01 02:13:02 | stdout | django.db.utils.OperationalError: could not connect to server: No such file or directory |
2020-06-01 02:13:02 | stdout | conn = _connect(dsn, connection_factory=connection_factory, **kwasync) |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/psycopg2/__init__.py", line 126, in connect |
2020-06-01 02:13:02 | stdout | connection = Database.connect(**conn_params) |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/db/backends/postgresql/base.py", line 168, in get_new_connection |
2020-06-01 02:13:02 | stdout | self.connection = self.get_new_connection(conn_params) |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/db/backends/base/base.py", line 194, in connect |
2020-06-01 02:13:02 | stdout | self.connect() |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/db/backends/base/base.py", line 216, in ensure_connection |
2020-06-01 02:13:02 | stdout | raise dj_exc_value.with_traceback(traceback) from exc_value |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/db/utils.py", line 89, in __exit__ |
2020-06-01 02:13:02 | stdout | self.connect() |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/db/backends/base/base.py", line 216, in ensure_connection |
2020-06-01 02:13:02 | stdout | self.ensure_connection() |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/db/backends/base/base.py", line 232, in _cursor |
2020-06-01 02:13:02 | stdout | return self._cursor() |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/db/backends/base/base.py", line 255, in cursor |
2020-06-01 02:13:02 | stdout | return self.Migration._meta.db_table in self.connection.introspection.table_names(self.connection.cursor()) |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/db/migrations/recorder.py", line 44, in has_table |
2020-06-01 02:13:02 | stdout | if self.has_table(): |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/db/migrations/recorder.py", line 61, in applied_migrations |
2020-06-01 02:13:02 | stdout | self.applied_migrations = recorder.applied_migrations() |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/db/migrations/loader.py", line 207, in build_graph |
2020-06-01 02:13:02 | stdout | self.build_graph() |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/db/migrations/loader.py", line 49, in __init__ |
2020-06-01 02:13:02 | stdout | self.loader = MigrationLoader(self.connection) |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/db/migrations/executor.py", line 18, in __init__ |
2020-06-01 02:13:02 | stdout | executor = MigrationExecutor(connection, self.migration_progress_callback) |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/core/management/commands/migrate.py", line 79, in handle |
2020-06-01 02:13:02 | stdout | output = self.handle(*args, **options) |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/core/management/base.py", line 335, in execute |
2020-06-01 02:13:02 | stdout | self.execute(*args, **cmd_options) |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/core/management/base.py", line 288, in run_from_argv |
2020-06-01 02:13:02 | stdout | self.fetch_command(subcommand).run_from_argv(self.argv) |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/core/management/__init__.py", line 365, in execute |
2020-06-01 02:13:02 | stdout | utility.execute() |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/core/management/__init__.py", line 371, in execute_from_command_line |
2020-06-01 02:13:02 | stdout | execute_from_command_line(sys.argv) |
2020-06-01 02:13:02 | stdout | File "/usr/src/paperless/src/manage.py", line 11, in <module> |
2020-06-01 02:13:02 | stdout | Traceback (most recent call last): |
2020-06-01 02:13:02 | stdout | |
2020-06-01 02:13:02 | stdout | The above exception was the direct cause of the following exception: |
2020-06-01 02:13:02 | stdout | |
2020-06-01 02:13:02 | stdout | |
2020-06-01 02:13:02 | stdout | connections on Unix domain socket "/tmp/.s.PGSQL.5432"? |
2020-06-01 02:13:02 | stdout | Is the server running locally and accepting |
2020-06-01 02:13:02 | stdout | psycopg2.OperationalError: could not connect to server: No such file or directory |
2020-06-01 02:13:02 | stdout | conn = _connect(dsn, connection_factory=connection_factory, **kwasync) |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/psycopg2/__init__.py", line 126, in connect |
2020-06-01 02:13:02 | stdout | connection = Database.connect(**conn_params) |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/db/backends/postgresql/base.py", line 168, in get_new_connection |
2020-06-01 02:13:02 | stdout | self.connection = self.get_new_connection(conn_params) |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/db/backends/base/base.py", line 194, in connect |
2020-06-01 02:13:02 | stdout | self.connect() |
2020-06-01 02:13:02 | stdout | File "/usr/lib/python3.8/site-packages/django/db/backends/base/base.py", line 216, in ensure_connection |
2020-06-01 02:13:02 | stdout | Traceback (most recent call last): |
2020-06-01 02:12:53 | stdout | if delimiter is -1: |
2020-06-01 02:12:53 | stdout | /usr/src/paperless/src/documents/models.py:342: SyntaxWarning: "is" with a literal. Did you mean "=="? |
2020-06-01 02:12:53 | stdout | if delimiter is -1: |
2020-06-01 02:12:53 | stdout | /usr/src/paperless/src/documents/models.py:339: SyntaxWarning: "is" with a literal. Did you mean "=="? |
Here is my config:
{ "cap_add" : null, "cap_drop" : null, "cmd" : "document_consumer", "cpu_priority" : 50, "devices" : null, "enable_publish_all_ports" : false, "enable_restart_policy" : false, "enabled" : true, "entrypoint_default" : "/sbin/docker-entrypoint.sh", "env_variables" : [ { "key" : "PATH", "value" : "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin" }, { "key" : "PAPERLESS_EXPORT_DIR", "value" : "/export" }, { "key" : "PAPERLESS_CONSUMPTION_DIR", "value" : "/consume" }, { "key" : "PAPERLESS_DBUSER", "value" : "paperless" }, { "key" : "PAPERLESS_DBPASS", "value" : "1234" } ], "exporting" : false, "id" : "12757fd5e8a2ea52b8f73c045175d32fd4b702b38a3c7e7c794b373e46d63fbb", "image" : "thepaperlessproject/paperless:latest", "is_ddsm" : false, "is_package" : false, "links" : [], "memory_limit" : 536870912, "memory_limit_slider" : 512, "name" : "Paperless-Consumer", "network" : [ { "driver" : "bridge", "name" : "bridge" } ], "network_mode" : "bridge", "port_bindings" : [], "privileged" : true, "shortcut" : { "enable_shortcut" : false }, "use_host_network" : false, "volume_bindings" : [ { "host_volume_file" : "/Documents/media", "mount_point" : "/usr/src/paperless/media", "type" : "rw" }, { "host_volume_file" : "/Documents/consume", "mount_point" : "/consume", "type" : "rw" }, { "host_volume_file" : "/Documents/export", "mount_point" : "/export", "type" : "rw" }, { "host_volume_file" : "/Documents/dumpdata", "mount_point" : "/dumpdata", "type" : "rw" } ] }
It looks like you should update your volume paths...
Thanks - that made me realize I wasn't handling the permissions properly. The folders were correct, but I hadn't set the UID and GID correctly. Looks like I've got that right now and it seems to be working! Now to play around. Thanks again.