
`max_connections` threshold strategies for concurrent multi-database access

Open adamhamlin opened this issue 1 year ago • 8 comments

Obviously the pool is great when you're trying to limit concurrent connections to a single database. But in the scenario where your app is simultaneously reaching out to multiple databases on the same postgres server, it's not too hard to blow past the server's max_connections limit.

Any interest in a PR that would create a mechanism to throttle total clients across pools for a given host? Usage might be something like the following:

```js
const serverPools = new pg.ServerPools();
const pool = serverPools.getOrCreate(connInfoForSpecificDb);
await pool.query('SELECT NOW()');
```

Obviously, there are lots of questions/details surrounding things like how you might throttle fairly, but I'm just curious what you think of the idea, @brianc.

Thanks.

adamhamlin avatar Oct 17 '23 22:10 adamhamlin

@adamhamlin Is creating separate pg pool per used database not an option for you?

alpharder avatar Oct 22 '23 22:10 alpharder

> @adamhamlin Is creating separate pg pool per used database not an option for you?

@alpharder unfortunately, not.

You could imagine a scenario where I need to reach out to 50 different databases. If I set the max connections per pool at 10, I could theoretically have 500 server connections. If my server's max_connections setting is 100 (the default), I can have a problem.

Now sure, I could configure each pool to only allow 2 connections per pool, but then I'm potentially limiting concurrency. Or I could increase max_connections, but maybe I can't, or can't afford to, beef up my server.
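
(For reference, the per-pool cap I'm talking about is just the existing `max` pool option; `connInfo` here is a placeholder:)

```js
const { Pool } = require('pg');

// Capping every pool at 2 keeps 50 databases under a 100-connection server
// limit, but a busy database can never use more than 2 connections, even
// while the other 49 pools sit idle.
const pool = new Pool({ ...connInfo, max: 2 });
```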

My problem is worse if I don't know ahead of time how many databases I need to connect to (like a DB crawler, perhaps).

adamhamlin avatar Oct 23 '23 13:10 adamhamlin

@adamhamlin yeah, I see...

The only thing that comes to my mind is PGBouncer

alpharder avatar Oct 23 '23 15:10 alpharder

Yeah, potentially. But that's a much bigger lift and involves other parts of the stack.

I'm mostly interested in an application-code solution. Of course, these limits would only be enforced within a single pg.ServerPools instance, but that's no different from how pg.Pool works today.
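
Roughly what I'm picturing, as a sketch built only on pg.Pool's public options (`max`, `idleTimeoutMillis`). All the names here (`ServerPools`, `getOrCreate`, the internal semaphore) are placeholders, not node-postgres APIs, and it only gates concurrent queries rather than true server connections:

```js
const { Pool } = require('pg');

// Sketch only: the budget gates concurrent checkouts across all pools for
// one server; a short idleTimeoutMillis keeps idle connections from piling
// up across many rarely-used pools.
class ServerPools {
  constructor({ maxTotalConnections = 90, maxPerPool = 10 } = {}) {
    this.maxPerPool = maxPerPool;
    this.available = maxTotalConnections; // server-wide budget
    this.waiters = [];                    // resolvers waiting for a slot
    this.pools = new Map();               // "host:port/db" -> pg.Pool
  }

  async _acquire() {
    if (this.available > 0) {
      this.available -= 1;
      return;
    }
    await new Promise((resolve) => this.waiters.push(resolve));
  }

  _release() {
    const next = this.waiters.shift();
    if (next) next();            // hand the slot straight to a waiter
    else this.available += 1;
  }

  getOrCreate(connInfo) {
    const key = `${connInfo.host}:${connInfo.port}/${connInfo.database}`;
    if (!this.pools.has(key)) {
      this.pools.set(
        key,
        new Pool({ ...connInfo, max: this.maxPerPool, idleTimeoutMillis: 1000 })
      );
    }
    const pool = this.pools.get(key);
    return {
      query: async (text, params) => {
        await this._acquire();
        try {
          return await pool.query(text, params);
        } finally {
          this._release();
        }
      },
      end: () => pool.end(),
    };
  }
}
```

The obvious limitation is that idle connections in rarely-used pools still count against the server until the idle timeout fires, which is part of why doing this inside pg.Pool, with access to the real connection lifecycle, would be cleaner.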

adamhamlin avatar Oct 23 '23 18:10 adamhamlin

I think you should create a package for this!

charmander avatar Oct 25 '23 00:10 charmander

@charmander thanks, yeah I think it could be useful. Though I don't know if it can be done "right" as a standalone package vs. as an addition to pg.Pool with access to all the pool internals.

adamhamlin avatar Oct 25 '23 14:10 adamhamlin

It can.

charmander avatar Oct 25 '23 19:10 charmander

I think a "fair" paradigm when approaching max_connections might involve temporarily lowering the number of connections that a single database pool can have to allow for more pools to be created/utilized. That's the kind of thing I think one might want (need?) access to the internals for

adamhamlin avatar Oct 25 '23 19:10 adamhamlin