Strange results with connection pool and max connections
The setup
I create the pool with the max option set to 50 connections.
I am running a Promise.all() over 20 items, each of which calls the code below, where db is the pool. This should handle the connection acquire and release implicitly.
const queryResults = await db.query(sql, [...])
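As I understand it, pool.query is shorthand for the explicit checkout pattern below. This is just an illustrative sketch using pg's documented connect/release API, not our actual code:

const { Pool } = require('pg')
const db = new Pool({ max: 50 }) // connection details come from env vars here

const runQuery = async (sql, params) => {
  const client = await db.connect() // explicit acquire
  try {
    return await client.query(sql, params)
  } finally {
    client.release() // explicit release, back to the pool
  }
}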
The first time I start up the server I get good results. The logs below show about what I expect to see: 20 creates, 20 acquires, and then 20 queries/results in parallel.
created new client
acquired new client
acquired new client
acquired new client
fetching title 12131051: 136.794ms
created new client
acquired new client
created new client
acquired new client
created new client
acquired new client
created new client
acquired new client
created new client
acquired new client
created new client
acquired new client
created new client
acquired new client
created new client
acquired new client
created new client
acquired new client
created new client
acquired new client
created new client
acquired new client
created new client
acquired new client
created new client
acquired new client
created new client
acquired new client
created new client
acquired new client
created new client
acquired new client
created new client
acquired new client
created new client
acquired new client
created new client
acquired new client
fetching title 12185243: 286.826ms
fetching title 12192954: 289.766ms
fetching title 12309333: 290.212ms
fetching title 12334913: 296.210ms
fetching title 12272334: 296.751ms
fetching title 12325258: 297.172ms
fetching title 11816738: 297.913ms
fetching title 12243728: 298.523ms
fetching title 12314504: 299.291ms
fetching title 12334688: 299.719ms
fetching title 12023688: 300.237ms
fetching title 12224403: 300.751ms
fetching title 11783691: 301.228ms
fetching title 12288266: 301.652ms
fetching title 12309200: 302.096ms
fetching title 12273426: 302.641ms
fetching title 12243141: 303.026ms
fetching title 11973682: 303.338ms
fetching title 12291043: 304.839ms
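(For context, a sketch of the kind of listeners that produce the "created new client" / "acquired new client" lines above, using pg's documented 'connect' and 'acquire' events; the exact logging code is assumed:)

// 'connect' fires when the pool opens a brand-new client connection;
// 'acquire' fires every time a client (new or idle) is checked out.
pool.on('connect', () => console.log('created new client'))
pool.on('acquire', () => console.log('acquired new client'))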
Now for the bad part. Any subsequent queries always seem to get batched in groups of 10. Always 10! In this case you can see that we only acquire connections, which is a good thing, but it appears that the second 10 queries are being held up by the first 10, and their fetch times show it.
acquired new client
acquired new client
acquired new client
acquired new client
acquired new client
acquired new client
acquired new client
acquired new client
acquired new client
acquired new client
fetching title 12307964: 226.735ms
fetching title 12036267: 228.414ms
fetching title 11327462: 236.994ms
fetching title 12317528: 239.021ms
fetching title 12158784: 245.043ms
fetching title 12325244: 247.267ms
fetching title 12064670: 272.069ms
fetching title 11483117: 276.309ms
fetching title 11995063: 278.712ms
fetching title 11836988: 284.605ms
acquired new client
acquired new client
acquired new client
acquired new client
acquired new client
acquired new client
acquired new client
acquired new client
acquired new client
acquired new client
fetching title 12044935: 1212.355ms
fetching title 12117588: 1214.133ms
fetching title 12276631: 1216.341ms
fetching title 12124557: 1220.240ms
fetching title 12130505: 1222.686ms
fetching title 12234280: 1225.586ms
fetching title 12324841: 1228.692ms
fetching title 11832806: 1231.378ms
For a bit more debugging info, I started logging the pool stats before each query.
Here is one such snapshot. Why would we ever have 8 clients waiting while there are 19 idle? Is this just an event loop issue?
pool.totalCount 20 pool.idleCount 19 pool.waitingCount 8
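A snapshot like that can be captured with the pool's documented counters; a minimal sketch:

// totalCount, idleCount, and waitingCount are documented pg.Pool properties.
const logPoolState = (pool) => {
  console.log(
    'pool.totalCount', pool.totalCount,
    'pool.idleCount', pool.idleCount,
    'pool.waitingCount', pool.waitingCount
  )
}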
Let me know if there is anything more anyone would like to see from our code.
Nathan
Is there any way that the max connection property is getting reset to the default of 10 after the first request?
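One quick way to rule that out is to log the pool's effective setting before and after the first request. (Caveat: pool.options is pg-pool's internal config object rather than documented API, so this is a debugging assumption.)

// Should print 50 both times if the option isn't being reset.
console.log('effective max:', pool.options.max)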
Can you share the code for the example? Also, what's the max_connections on the server?
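You can read the server-side limit from any client, for example:

// max_connections is a standard Postgres server setting.
pool.query('SHOW max_connections')
  .then(res => console.log('server max_connections:', res.rows[0].max_connections))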
Pool configuration
const pool = new Pool({
  user,
  host,
  password,
  database,
  port,
  max: 50, // max active connections
  idleTimeoutMillis: 60000,
})
GraphQL resolver
SearchResult: {
  hits: async ({ hits }, params, context) => {
    return await Promise.all(hits.map(titleId => Titles.fetch(titleId, context)))
  },
},
Call inside of Titles.fetch()
const queryResults = await db.query(sql, [languageId, titleId])
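For context, the surrounding function is roughly shaped like this. (A simplified sketch; the exact shape of Titles.fetch is assumed. The console.time labels are what produce the "fetching title ..." log lines.)

// Simplified sketch, not the exact code. db is the shared pool and
// sql holds the title-lookup query text (elided here).
const fetch = async (titleId, { db, languageId }) => {
  console.time(`fetching title ${titleId}`)
  const queryResults = await db.query(sql, [languageId, titleId])
  console.timeEnd(`fetching title ${titleId}`)
  return queryResults.rows
}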
I tried out various combinations of task counts and pool sizes and I'm not able to reproduce this issue. Here's the code I used:
const POOL_SIZE = 10;
const TASKS_PER_ITER = 50;
const NUM_ITERS = 10;

const pg = require('pg');

const pool = new pg.Pool({
  connectionString: process.env.DATABASE_URL,
  max: POOL_SIZE,
});

pool.on('error', (err) => {
  console.error('Pool error: %s', err);
});

const log = (message, ...args) => {
  console.error('%s - ' + message, new Date().toISOString(), ...args);
};

const doStuff = async (i) => {
  //log('Started i=%s', i);
  const started = Date.now();
  await pool.query('SELECT pg_sleep(.1)');
  const elapsed = Date.now() - started;
  log('Finished i=%s elapsed=%s', i, elapsed);
};

const doManyStuff = async (iter) => {
  const started = Date.now();
  //log('Called doManyStuff(iter=%s)', iter);
  const tasks = [];
  for (let i = 0; i < TASKS_PER_ITER; i++) {
    tasks.push(doStuff(i));
  }
  await Promise.all(tasks);
  const elapsed = Date.now() - started;
  log('Finished doManyStuff(iter=%s) elapsed=%s', iter, elapsed);
};

const main = async () => {
  for (let i = 0; i < NUM_ITERS; i++) {
    await doManyStuff(i);
  }
};

main()
  .then(() => process.exit(0))
  .catch(err => {
    console.error('%s', err.stack);
    process.exit(1);
  });
You can tweak the pool size and the number of concurrent tasks at the top. Depending on the values, they either all execute concurrently (pool size >= number of concurrent tasks) or they wait in chunks equal to the pool size.
See if you can reproduce the issue with a full example that does not involve any other modules.