
Sidekiq crashed during large user data export

Open Freika opened this issue 2 months ago • 0 comments

Discussed in https://github.com/Freika/dawarich/discussions/1838

Originally posted by wunderbaum on October 8, 2025:

I am using version 0.33.1.

I try to do a user data export, but it always fails. With docker logs dawarich_sidekiq -f I can see the export starting, for example the "points" section, and that seems to be where the trouble starts:

  Points Export (36815.7ms)  SELECT
        p.id, p.battery_status, p.battery, p.timestamp, p.altitude, p.velocity, p.accuracy,
        p.ping, p.tracker_id, p.topic, p.trigger, p.bssid, p.ssid, p.connection,
        p.vertical_accuracy, p.mode, p.inrids, p.in_regions, p.raw_data,
        p.city, p.country, p.geodata, p.reverse_geocoded_at, p.course,
        p.course_accuracy, p.external_track_id, p.created_at, p.updated_at,
        p.lonlat, p.longitude, p.latitude,
        -- Extract coordinates from lonlat if individual fields are missing
        COALESCE(p.longitude, ST_X(p.lonlat::geometry)) as computed_longitude,
        COALESCE(p.latitude, ST_Y(p.lonlat::geometry)) as computed_latitude,
        -- Import reference
        i.name as import_name,
        i.source as import_source,
        i.created_at as import_created_at,
        -- Country info
        c.name as country_name,
        c.iso_a2 as country_iso_a2,
        c.iso_a3 as country_iso_a3,
        -- Visit reference
        v.name as visit_name,
        v.started_at as visit_started_at,
        v.ended_at as visit_ended_at
      FROM points p
      LEFT JOIN imports i ON p.import_id = i.id
      LEFT JOIN countries c ON p.country_id = c.id
      LEFT JOIN visits v ON p.visit_id = v.id
      WHERE p.user_id = $1
      ORDER BY p.id
  [[nil, 2]]
  ↳ app/services/users/export_data/points.rb:40:in 'Users::ExportData::Points#call'
/var/app/vendor/bundle/ruby/3.4.0/gems/activerecord-8.0.2.1/lib/active_record/result.rb:236: [BUG] Segmentation fault at 0x0000000000000000
ruby 3.4.6 (2025-09-16 revision dbd83256b1) +YJIT +PRISM [x86_64-linux]
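For context, the crash happens inside ActiveRecord::Result while the job materializes all ~530k joined rows in memory at once. A common way to keep memory flat during a large export is to stream rows to the output file in batches instead of building one giant result. Below is a minimal plain-Ruby sketch of that idea only; the enumerator stand-in, batch size, and column names are hypothetical, not the app's actual code (in the real app this would be a cursor-based fetch such as find_each / in_batches):

```ruby
require "csv"
require "tempfile"

# Hypothetical stand-in for a large query result: a lazy enumerator of rows.
# A real export would fetch from the database cursor in chunks instead.
rows = Enumerator.new do |y|
  1.upto(10_000) { |i| y << [i, "point_#{i}", 52.5 + i * 1e-6, 13.4 + i * 1e-6] }
end

BATCH_SIZE = 1_000

out = Tempfile.new(["points", ".csv"])
CSV.open(out.path, "w") do |csv|
  csv << %w[id name latitude longitude]      # header row
  rows.each_slice(BATCH_SIZE) do |batch|     # at most BATCH_SIZE rows in memory
    batch.each { |row| csv << row }
  end
end

puts File.foreach(out.path).count            # header + 10_000 data rows
```

The point of the sketch is that memory use is bounded by the batch size, not by the total row count, which is what a 1.8 GB export needs.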

I have 530688 points in the database. When I export them manually, with

COPY( SELECT
        p.id, p.battery_status, p.battery, p.timestamp, p.altitude, p.velocity, p.accuracy,
        p.ping, p.tracker_id, p.topic, p.trigger, p.bssid, p.ssid, p.connection,
        p.vertical_accuracy, p.mode, p.inrids, p.in_regions, p.raw_data,
        p.city, p.country, p.geodata, p.reverse_geocoded_at, p.course,
        p.course_accuracy, p.external_track_id, p.created_at, p.updated_at,
        p.lonlat, p.longitude, p.latitude,
        -- Extract coordinates from lonlat if individual fields are missing
        COALESCE(p.longitude, ST_X(p.lonlat::geometry)) as computed_longitude,
        COALESCE(p.latitude, ST_Y(p.lonlat::geometry)) as computed_latitude,
        -- Import reference
        i.name as import_name,
        i.source as import_source,
        i.created_at as import_created_at,
        -- Country info
        c.name as country_name,
        c.iso_a2 as country_iso_a2,
        c.iso_a3 as country_iso_a3,
        -- Visit reference
        v.name as visit_name,
        v.started_at as visit_started_at,
        v.ended_at as visit_ended_at
      FROM points p
      LEFT JOIN imports i ON p.import_id = i.id
      LEFT JOIN countries c ON p.country_id = c.id
      LEFT JOIN visits v ON p.visit_id = v.id
      WHERE p.user_id = 2
      ORDER BY p.id) TO '/tmp/points.sql' WITH CSV Delimiter ',' HEADER;

This way I get a good export of the points, about 1.8 GB in size.

My system has 16GB of RAM.

Do I need to give Sidekiq more memory, or maybe limit its memory?
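On the question of limiting memory: Docker Compose can put a hard cap on the Sidekiq container. This would not fix the export code itself, but it makes memory-pressure failures more predictable (the container gets OOM-killed instead of the process misbehaving). A sketch, assuming a Compose file with a service named dawarich_sidekiq; the 4g value is an arbitrary example, not a recommendation from the project:

```yaml
services:
  dawarich_sidekiq:
    # ... existing image / environment config ...
    deploy:
      resources:
        limits:
          memory: 4g   # hard cap; hypothetical value, tune to your workload
```

Whether the segfault is actually memory-related is not confirmed here; a null-pointer crash inside activerecord's result.rb could also point at a Ruby/extension bug, so the cap is a mitigation, not a diagnosis.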

Freika avatar Oct 25 '25 20:10 Freika