
"max active snapshots limit reached"

Open vespialis opened this issue 2 years ago • 4 comments

What happened

I am using immudb over the pgsql wire protocol to store very simple data structures. So far I only have one table with a few records. After the server has been running for some time, trying to insert into or query the table fails with the max active snapshots limit reached error. After running immuadmin database compact or immuadmin database flush, the error usually stops popping up in one environment (my app), but inserts and queries then return no response, while in another environment (psql) the error persists. I'm not sure, but it seems like immudb has a problem closing active connections.

What you expected to happen

I expect no error 😅.

How to reproduce it (as minimally and precisely as possible)

Create the table:

```sql
CREATE TABLE event (
  id INTEGER AUTO_INCREMENT,
  created_at TIMESTAMP,
  pipeline_id INTEGER,
  action VARCHAR,
  pipeline_hash VARCHAR,
  url VARCHAR,
  file_hash VARCHAR,
  metadata VARCHAR,
  PRIMARY KEY id
);
```

Insert a few records in a short period of time, then try to query the table; the error should appear.

Environment

immudb 1.3.0
Commit  : ce57d20a946c5a1ac5ec4b0be8c262f41527026d
Built at: Mon, 23 May 2022 13:01:40 UTC

Additional info (any other context about the problem)

I would like to know whether there is a way to raise the maximum number of snapshots, or what else I can do, as the database is unusable in its current state.

vespialis · Jun 09 '22

Hi @vespialis, thank you for your report. This issue looks like the one we fixed recently (https://github.com/codenotary/immudb/pull/1239); the fix will be part of the 1.3.1 release (should be out soon).

Are you able to check the issue with the recent master code?

byo · Jun 09 '22

@byo thank you for the quick response. Following your advice, I ran some tests on the latest master code and yup, the error still pops up. What I also found out is that it appears after 99 SELECT queries, specifically when trying to execute the 100th one. There is no such problem when inserting data (I tried ~10000 inserts and gave up since everything seemed OK). Restarting the database service helps.
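For illustration, the failure pattern described here (99 successful SELECTs, then the error on the 100th) is consistent with a fixed snapshot budget that each query consumes but never releases. A toy sketch of that leak (the cap of 99, and all names, are assumptions for this example, not immudb internals):

```javascript
// Toy model of a per-database snapshot budget. The cap of 99 is
// inferred from the report above; real immudb internals differ.
class SnapshotPool {
  constructor(max) {
    this.max = max;
    this.active = 0;
  }
  open() {
    if (this.active >= this.max) {
      throw new Error("max active snapshots limit reached");
    }
    this.active++;
    return { close: () => { this.active--; } };
  }
}

const pool = new SnapshotPool(99);

// A connector that never closes its reader leaks one snapshot per
// SELECT: queries 1-99 succeed, query 100 fails.
for (let i = 1; i <= 100; i++) {
  try {
    pool.open(); // reader is never closed
  } catch (err) {
    console.log(`query ${i}: ${err.message}`);
  }
}
// prints: query 100: max active snapshots limit reached
```

If each reader were closed after use, `active` would return to zero and the cap would never be hit, which is why restarting the service (discarding all leaked snapshots) helps.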

vespialis · Jun 10 '22

It might be the case that the pgsql connector is not closing the reader once the query result is sent.
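If the connector is indeed leaking readers, the usual remedy is to release the reader in a `finally` block once the result set has been streamed, so the snapshot is returned even when sending a row fails mid-stream. A self-contained sketch of that pattern (all names are hypothetical, not immudb's actual connector code):

```javascript
// Hypothetical names: a sketch of the "always close the reader"
// pattern, not immudb's actual pgsql connector code.
let activeSnapshots = 0;

function openReader(rows) {
  activeSnapshots++; // acquiring a reader pins one snapshot
  return {
    rows: () => rows,
    close: () => { activeSnapshots--; },
  };
}

// Streams a result set and releases the snapshot in `finally`,
// so the active count stays bounded even if `send` throws.
function runQuery(rows, send) {
  const reader = openReader(rows);
  try {
    for (const row of reader.rows()) send(row);
  } finally {
    reader.close();
  }
}

const out = [];
runQuery([1, 2, 3], (r) => out.push(r));
console.log(activeSnapshots); // prints: 0
```

The `finally` is the important part: an early `return` or a throw while streaming would otherwise skip the `close()` and leak one snapshot per query.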

jeroiraz · Jun 10 '22

@jeroiraz yup. By the way, the problem doesn't seem to occur when querying with psql; it does when using the node-postgres library (Node.js, obviously).

vespialis · Jun 10 '22