
Migration from 2.0 to 3.4

ciur opened this issue 11 months ago • 3 comments

Is there anybody interested in migrating from 2.0 to 3.4? If yes, now is the moment to raise your hand. If there is no interest by June 2025, I will close this topic.

ciur avatar Feb 02 '25 16:02 ciur

Hey ciur, I'm interested and I'm facing the migration problem. My instance runs in a Docker container. Unfortunately, I no longer have that instance running. I would like to reinstall the old version (2.0) and then migrate from it.

I still have the 2.0 file structure: the /volume1/docker/papermerge/data folder (including papermerge.db) and the /volume1/docker/papermerge/config folder.

I started with the two old directories in place.

First I generated the .env file:

echo 'SECRET_KEY="--MY_SECRET_KEY--"' > /volume1/docker/papermerge/config/.env
echo 'DB_ENGINE=sqlite3' >> /volume1/docker/papermerge/config/.env
echo 'DB_NAME=/data/db.sqlite3' >> /volume1/docker/papermerge/config/.env
echo 'TZ=Europe/Berlin' >> /volume1/docker/papermerge/config/.env
echo 'PUID=1026' >> /volume1/docker/papermerge/config/.env
echo 'PGID=100' >> /volume1/docker/papermerge/config/.env

docker run -d --name=papermerge \
  -p 7423:8000 \
  --env-file /volume1/docker/papermerge/config/.env \
  -v /volume1/docker/papermerge/config:/config \
  -v /volume1/docker/papermerge/data:/data \
  --restart=always \
  papermerge/papermerge:latest
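(For reference, the same container expressed as a docker compose service — just a sketch mirroring the flags of the docker run command above, not an officially documented setup:)

services:
  papermerge:
    image: papermerge/papermerge:latest
    env_file:
      - /volume1/docker/papermerge/config/.env
    ports:
      - "7423:8000"
    volumes:
      - /volume1/docker/papermerge/config:/config
      - /volume1/docker/papermerge/data:/data
    restart: always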

Then I had to add the SECRET_KEY to the Django settings:

echo 'import os' >> /app/config/settings.py
echo 'SECRET_KEY = os.getenv("SECRET_KEY", "--MY_SECRET_KEY--")' >> /app/config/settings.py

Without the SECRET_KEY I get this error at login: SyntaxError: JSON.parse: unexpected character at line 1 column 1 of the JSON data

-- At this point the login works fine in the web GUI, but my data is missing --

I then executed the migrations:

python manage.py makemigrations
python manage.py migrate

In settings.py I found DATABASES = config.get_django_databases(proj_root=PROJ_ROOT)

From here on I need some help :)

Support would be nice here. Thanks in advance.

CT-IT avatar Feb 24 '25 13:02 CT-IT

@CT-IT

Could you give it a try with pmdump v0.1?

At the moment the instructions are only in the README.md file, here: https://github.com/papermerge/pmdump. In short, you need to create a source.yaml file which tells pmdump where to copy the data from. Example of a source.yaml:

app_version: 2.0
media_root: /home/eugen/DockerCompose/linux-server-pmg-2/data/media
database_url: sqlite:///home/eugen/DockerCompose/linux-server-pmg-2/data/papermerge.db

Then you need to:

pmdump -c source.yaml -f pmg.tar.gz export

This will create a pmg.tar.gz file with your data.
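For the 2.0 layout described earlier in this thread, a source.yaml could look roughly like this; the media_root is an assumption (adjust it if your 2.0 media folder is not located under the data directory):

app_version: 2.0
# assumed location of the 2.0 media folder; adjust to your actual layout
media_root: /volume1/docker/papermerge/data/media
# same URL scheme as in the example above, pointing at the old SQLite file
database_url: sqlite:///volume1/docker/papermerge/data/papermerge.db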

Then create a target.yaml:

media_root: /home/eugen/DockerCompose/pm3.4-pg/media/
database_url: postgresql://coco:[email protected]:5432/pmg34?sslmode=disable
app_version: 3.4

And finally run the import command (pmg.tar.gz is the file created by the export command):

pmdump -c target.yaml -f pmg.tar.gz import

Note that only users, documents and folders will be exported.

For testing the import into the target, I used the following docker compose:

services:
  webapp:
    image: papermerge/papermerge:3.4
    environment:
      PAPERMERGE__SECURITY__SECRET_KEY: 12345
      PAPERMERGE__AUTH__USERNAME: admin
      PAPERMERGE__AUTH__PASSWORD: admin
      PAPERMERGE__DATABASE__URL: postgresql://coco:kesha@db:5432/pmg34a4
    ports:
     - "12034:80"
    volumes:
      - ${PWD}/media34a4:/core_app/media
    depends_on:
      - db
  db:
    image: postgres:16.1
    restart: always
    environment:
      - POSTGRES_PASSWORD=kesha
      - POSTGRES_DB=pmg34a4
      - POSTGRES_USER=coco
    volumes:
      - pg_data34a4:/var/lib/postgresql/data   # Store database data
    ports:
      - 5432:5432

volumes:
  pg_data34a4:

Notice that the local media folder (which must be accessible to pmdump) is mounted onto /core_app/media in the papermerge/papermerge:3.4 container.
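For reference, a target.yaml matching that compose file might look roughly like this, assuming pmdump runs on the Docker host, where port 5432 is published and the media34a4 folder is reachable:

app_version: 3.4
# absolute path of the folder mounted at /core_app/media in the compose file
# (the ${PWD}/media34a4 bind mount); replace with your actual path
media_root: /absolute/path/to/media34a4
# the db container publishes 5432 on the host, so it is reachable at 127.0.0.1
database_url: postgresql://coco:[email protected]:5432/pmg34a4?sslmode=disable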

ciur avatar Mar 08 '25 08:03 ciur

Ok, so I've managed to convert my datastore from 2.0 to 3.4... and that worked just fine except for 4 documents for which I had to fix the database by hand.

My problem now is that none of the documents seem to have OCR text included, and the GUI doesn't seem to have a way to mass-queue documents for OCR.

seaeagle1 avatar Apr 02 '25 21:04 seaeagle1