timeout on logidze_compact_history
I just added `--limit 10` to a model, and now we're getting:
```
ActiveRecord::QueryCanceled PG::QueryCanceled: ERROR: canceling statement due to statement timeout
CONTEXT: PL/pgSQL function logidze_compact_history(jsonb,integer) line 29 at assignment
PL/pgSQL function logidze_logger() line 172 at assignment
```
It's not obvious to me from the code why this could be so slow.
It's running in an `upsert_all` in groups of 50 🤷
Opening an issue to start a discussion.
- logidze 1.3.1
- Rails 7.0.8.7
- Ruby 3.3.6p108
Interesting.
Do you know how big the log_data was? The compaction happens in a loop, merging log entries one by one, starting from the oldest, until we reach the max size (10). So, if you had hundreds of entries, then it could take a lot of time, I guess (and, in case of backfilling, it probably makes sense to increase the statement timeout, or compact at the application level).
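To illustrate why a long history makes this loop expensive, here's a pure-Ruby sketch of the compaction idea described above (an assumption based on this comment, not logidze's actual PL/pgSQL code; the `"h"`/`"c"` keys follow logidze's log_data layout): each iteration merges the oldest entry into the next one, so hundreds of entries mean hundreds of jsonb-style merges per row.

```ruby
# Sketch of the compaction loop: fold the oldest history entries into
# their successors, one per iteration, until only `limit` remain.
def compact_history(log_data, limit)
  history = log_data["h"].map(&:dup)
  while history.size > limit
    oldest = history.shift
    # Merge the oldest entry's changes into the next-oldest; newer values win.
    history[0] = history[0].merge("c" => oldest["c"].merge(history[0]["c"]))
  end
  log_data.merge("h" => history)
end

log = {
  "v" => 3,
  "h" => [
    { "v" => 1, "c" => { "a" => 1, "b" => 1 } },
    { "v" => 2, "c" => { "a" => 2 } },
    { "v" => 3, "c" => { "b" => 3 } }
  ]
}
compact_history(log, 2)
# Two entries remain; the first keeps "b" => 1 but takes the newer "a" => 2.
```

The loop runs `history.size - limit` times, so a record whose history grew to hundreds of entries before the limit was added pays for every one of those merges on its next write.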
@palkan Here's an example collection of items being upserted. `field_5` is jsonb:
```json
[
  {
    "field_1": "value_1",
    "field_2": "value_2",
    "field_3": "value_3",
    "field_4": "value_4",
    "field_5": {
      "field_5_1": "value_5_1",
      "field_5_2": "value_5_2",
      "field_5_3": {
        "field_5_3_1": "value_5_3_1",
        "field_5_3_2": "value_5_3_2",
        "field_5_3_3": "value_5_3_3",
        "field_5_3_4": "value_5_3_4",
        "field_5_3_5": "value_5_3_5",
        "field_5_3_6": "value_5_3_6",
        "field_5_3_7": "value_5_3_7",
        "field_5_3_8": "value_5_3_8",
        "field_5_3_9": "value_5_3_9",
        "field_5_3_10": "value_5_3_10",
        "field_5_3_11": ["value_5_3_11_1", "value_5_3_11_2", "value_5_3_11_3"]
      },
      "field_5_4": "value_5_4",
      "field_5_5": "value_5_5"
    },
    "field_6": "value_6",
    "field_7": "value_7",
    "field_8": "value_8",
    "created_at": "2025-05-14T16:33:46.557Z",
    "updated_at": "2025-05-14T16:33:46.557Z"
  }
]
```
I mean, what's the size of the log_data for the affected records? Something like:

```sql
SELECT jsonb_array_length(log_data->'h') -- or MAX(jsonb_array_length(log_data->'h'))
FROM <scope>;
```
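The same check can also be done at the application level on an already-loaded log_data payload (a sketch, assuming logidze's standard shape with the history array under the `"h"` key):

```ruby
require "json"

# Count history entries in a log_data payload, mirroring
# jsonb_array_length(log_data->'h') in Ruby.
def history_length(log_data)
  data = log_data.is_a?(String) ? JSON.parse(log_data) : log_data
  (data["h"] || []).length
end

history_length('{"v":3,"h":[{"v":1},{"v":2},{"v":3}]}')
# => 3
```

If this number is in the hundreds for the affected rows, that would explain the per-row cost of the compaction loop.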