
High CPU and memory usage

Open daniilrozanov opened this issue 2 months ago • 7 comments

Sorry to ask this question here, but I can't even connect (or can, but it's slow) to my remote server over SSH because of this issue.

I'm new to Matrix and especially Synapse, and here's my problem: when Synapse is running, it uses about 100% of the CPU and >80% of the RAM (1.6 GB).

Also, my nginx logs indicate that it receives PUT /_matrix/federation/v1/send/<number> ~30-50 times per minute, but I do not know if this could be related.

A few examples:

<some ip> - - [12/Oct/2025:09:03:22 +0000] "PUT /_matrix/federation/v1/send/<number> HTTP/1.1" 502 157 "-" "Synapse/1.134.0"
<another ip> - - [12/Oct/2025:09:03:39 +0000] "PUT /_matrix/federation/v1/send/<another number> HTTP/1.1" 200 21 "-" "Synapse/1.117.0"
<ip> - - [12/Oct/2025:09:04:47 +0000] "PUT /_matrix/federation/v1/send/<number> HTTP/1.1" 401 197 "-" "Synapse/1.136.0"
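In case it helps anyone measure this: the request rate can be counted by bucketing the access log by minute. This is a sketch run on sample lines in the same format as above; the sample here-string and IPs are placeholders, and the real input would be nginx's access.log.

```shell
# Bucket federation /send requests by minute. Demonstrated on sample
# lines (placeholder IPs); point the same pipeline at the real nginx
# access.log instead of the here-string.
sample='1.2.3.4 - - [12/Oct/2025:09:03:22 +0000] "PUT /_matrix/federation/v1/send/1 HTTP/1.1" 502 157 "-" "Synapse/1.134.0"
5.6.7.8 - - [12/Oct/2025:09:03:39 +0000] "PUT /_matrix/federation/v1/send/2 HTTP/1.1" 200 21 "-" "Synapse/1.117.0"'
printf '%s\n' "$sample" \
  | grep '/_matrix/federation/v1/send/' \
  | awk '{print $4}' \
  | cut -d: -f1-3 \
  | sort | uniq -c | sort -rn
```

The output lists the busiest minutes first, one count per minute bucket.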

I have a pretty weak VPS with 1 core and 2 GB of RAM, but I hope that's enough to run a Matrix node. How can I work out why Synapse consumes so much memory, and can I reduce it?

Homeserver settings:

pid_file: "/var/run/matrix-synapse.pid"
listeners:
  - port: 8008
    tls: false
    type: http
    x_forwarded: true
    bind_addresses: ['127.0.0.1']
    resources:
      - names: [client, federation]
        compress: false
database:
  name: psycopg2
  txn_limit: 10000
  args:
    user: synapse_user
    password: SECRET
    database: synapse
    host: localhost
    port: 5432
    cp_min: 5
    cp_max: 10
log_config: "/etc/matrix-synapse/log.yaml"
media_store_path: /var/lib/matrix-synapse/media
signing_key_path: "/etc/matrix-synapse/homeserver.signing.key"
trusted_key_servers:
  - server_name: MYDOMAIN
suppress_key_server_warning: true
max_upload_size: 10M
enable_registration: false
turn_uris: ["turn:matrix.MYDOMAIN?transport=udp","turn:matrix.MYDOMAIN?transport=tcp"]
turn_shared_secret: SECRET
turn_user_lifetime: 86400000
registration_shared_secret: SECRET
macaroon_secret_key: SECRET
allow_public_rooms_over_federation: true
allow_public_rooms_without_auth: true

Nginx:

# matrix server
server {
    listen 443 ssl;
    listen [::]:443 ssl;
    listen 8448 ssl default_server;
    listen [::]:8448 ssl default_server;

    server_name matrix.MYDOMAIN;

    # ssl settings omitted

    location / {
        proxy_pass http://localhost:8008;
        proxy_set_header X-Forwarded-For $remote_addr;
    }

    location ~ ^(/_matrix|/_synapse/client) {
        proxy_pass http://localhost:8008;
        proxy_set_header X-Forwarded-For $remote_addr;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header Host $host:$server_port;
        client_max_body_size 11M;
        proxy_http_version 1.1;
    }
}
# delegation
server {
    listen [::]:443 ssl ipv6only=on;
    listen 443 ssl;
    server_name MYDOMAIN;
    # ssl settings omitted
    location /.well-known/matrix/client {
        return 200 '{"m.homeserver": {"base_url": "https://matrix.MYDOMAIN"}}';
        default_type application/json;
        add_header Access-Control-Allow-Origin *;
    }
    
    location /.well-known/matrix/server {
        return 200 '{"m.server": "matrix.MYDOMAIN:443"}';
        default_type application/json;
        add_header Access-Control-Allow-Origin *;
    }
}

EDIT: There are suspiciously many records in my Synapse logs, and they are growing rapidly. Within one second, 1,700 records were written, and this happens constantly while Synapse is running. I'm not sure whether I can share my logs, but they are at INFO level, and here are some records:

2025-10-12 09:23:57,584 - synapse.http.matrixfederationclient - 369 - INFO - sync_partial_state_room-0-$YmV3eJuqhITkAkQCMcvkvNyUERkkFyOIbQczHMGF-$mxvTsNoMRXfQNtOUCU1nYZiXFsmw7B3CQ-h2XnJi - {POST-O-6360} [beta2.matrix.org] Completed request: 200 OK in 19.61 secs, got 24 bytes - POST matrix-federation://beta2.matrix.org/_matrix/policy/unstable/org.matrix.msc4284/event/$LyvSnufVzWjM3ClafRPctwIXxfv4ZeS7uOXjGG5fTWE/check
Thousands of these within a few seconds, tens of thousands across the whole log...
2025-10-12 09:40:58,524 - synapse.federation.federation_client - 834 - WARNING - sync_partial_state_room-0-$YmV3eJuqhITkAkQCMcvkvNyUERkkFyOIbQczHMGF-$mxvTsNoMRXfQNtOUCU1nYZiXFsmw7B3CQ-h2XnJi - Signature on retrieved event $m_01OyOe0-BV3OSTHlZG6YC4bIFOrqhJ8KdCjAt was invalid (unable to verify signature for sender domain erb.pw: 401: Failed to find any key to satisfy: _FetchKeyRequest(server_name='erb.pw', minimum_valid_until_ts=1698939776467, key_ids=['ed25519:a_WpbR'])). Checking local store/origin server
This kind of record appears very frequently as well.
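A rough way to quantify the rate, since every Synapse log line starts with a "YYYY-MM-DD HH:MM:SS,mmm" timestamp: everything before the first comma is a per-second bucket. This is a sketch on sample lines; the real input would be homeserver.log.

```shell
# Count Synapse log lines per second: everything before the first
# comma is a per-second timestamp. Demonstrated on shortened sample
# lines; run the same pipeline over the real homeserver.log.
sample='2025-10-12 09:23:57,584 - synapse.http.matrixfederationclient - INFO - ...
2025-10-12 09:23:57,601 - synapse.http.matrixfederationclient - INFO - ...
2025-10-12 09:23:58,100 - synapse.federation.federation_client - WARNING - ...'
printf '%s\n' "$sample" \
  | cut -d, -f1 \
  | sort | uniq -c | sort -rn | head
```

The busiest seconds come out first, each with a line count.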

daniilrozanov avatar Oct 12 '25 08:10 daniilrozanov

My instance is currently using 3.5 GB of RAM and I had to limit the CPU usage in Compose. Seems unnecessarily high.

adamef93 avatar Oct 13 '25 16:10 adamef93

@adamef93 Is it acceptable to use >2 GB of RAM for my small, cozy Matrix server, where I am the only user? Also, have you checked your Synapse logs? Do you have tons of strange identical records like the ones I showed above, after the EDIT section?

daniilrozanov avatar Oct 14 '25 18:10 daniilrozanov

I'm set up the same way. It's just me, and I only use Matrix for Messenger with a plugin. No clue why it's using so much.

adamef93 avatar Oct 14 '25 20:10 adamef93

Seeing similar issues on another homeserver, where 2 particular workers (federation request workers) are eating up a comfortable 10 GB. I cannot reproduce this on my own personal homeserver, as all of my federation readers (the equivalent in my setup) are among the lowest memory users, using only 5.6 GB, which is considered a tiny amount for my Synapse configuration.

EDIT: I should note that this is a recent issue and has been causing OOMKills on the homeserver in question.

TheArcaneBrony avatar Oct 25 '25 14:10 TheArcaneBrony

Observing the same thing here, with beta2.matrix.org generating literal gigabytes of traffic towards my server. Confirmed this through a PCAP.

altf4arnold avatar Nov 02 '25 19:11 altf4arnold

Observing the same thing here, with beta2.matrix.org generating literal gigabytes of traffic towards my server. Confirmed this through a PCAP.

Can you tell if the traffic is being sent in /send/ requests, or if it's something else?

We should be limiting the amount of data sent in any given /send/ request; however, on receipt Synapse may need to go and fetch state (via /state/) from other servers and then process it. For very large rooms this can, in pathological cases, take many GB of memory.

In the logs you may see lines like "/state returned N events"; those would indicate whether you are hitting the above problem.
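A quick way to check for that is to pull those lines out and sort by the event count. This is a sketch on sample lines (placeholder logger names); the real input would be your homeserver.log.

```shell
# Scan log lines for "/state returned N events" and sort by N,
# largest first; big numbers suggest the state-fetch problem above.
# Demonstrated on shortened sample lines; run the same pipeline over
# the real homeserver.log.
sample='... - synapse.handlers.federation - INFO - ... /state returned 128 events
... - synapse.handlers.federation - INFO - ... /state returned 54321 events'
printf '%s\n' "$sample" \
  | grep -o '/state returned [0-9]* events' \
  | sort -k3 -rn \
  | head
```

Rooms producing counts in the tens of thousands would be the ones worth investigating.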

We also track how long we spend doing state res, via the following lines:

2025-11-03 08:36:31,624 - synapse.state.metrics - 884 - DEBUG - looping_call - 1 biggest rooms for state-res by CPU time: ['!NasysSDfxKxZBzJJoE:matrix.org (0.377183s)']
2025-11-03 08:36:31,624 - synapse.state.metrics - 884 - DEBUG - looping_call - 1 biggest rooms for state-res by DB time: ['!NasysSDfxKxZBzJJoE:matrix.org (23.1567s)']

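Note that those lines come from the synapse.state.metrics logger at DEBUG level, so to see them you may need a logger override in your log config. A sketch of the relevant fragment, to be merged into the existing loggers section of log.yaml rather than added as a duplicate key:

```yaml
# Sketch for log.yaml: raise only the state-res metrics logger to
# DEBUG while leaving the root log level alone.
loggers:
  synapse.state.metrics:
    level: DEBUG
```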
erikjohnston avatar Nov 03 '25 09:11 erikjohnston

Here are my logs. I deleted all previous log files and ran Synapse for 2 minutes: a 19 MB homeserver.log!!

Hope there are no credentials in here :)

Attachments: Nginx's error.log, Nginx's access.log, Synapse's homeserver.log

daniilrozanov avatar Nov 03 '25 14:11 daniilrozanov