pino
Logs are not flushed, missing log entries after process.exit()
This affects version 10.0.0 (latest) as well as earlier versions (9.4.0).
import { pino } from 'pino';

const logger = pino({
  transport: {
    target: 'pino-pretty',
  },
});

const obj = { lorem: "ipsum" };
// create an array with obj clones
const arr = Array.from({ length: 999999 }, () => obj);

logger.info('start', pino.version);
logger.info(arr);
logger.info('end');
process.exit(0);
I would expect end to be logged, but it isn't. The last lines shown in the console are:
999998: {
  "lorem": "ipsum"
}
Repro with Node v18.16.0 and v20.0.0 under macOS 13.
What version of Node.js are you using?
Why are you mixing a destination with a transport?
This is Node 18 LTS.
I was previously using only the transport; I added the destination because various sources recommend it as a way to ensure logs are written synchronously.
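For context, that combination presumably looked something like the sketch below. The exact snippet has since been removed from the issue description, so the second argument and its options are an assumption, not the reporter's actual code:

import { pino } from 'pino';

// Hypothetical reconstruction of the removed variant: a worker-thread
// transport in the options object plus an explicit destination passed as
// the second argument, i.e. the combination questioned above.
const logger = pino(
  {
    transport: {
      target: 'pino-pretty',
    },
  },
  pino.destination({ sync: true }),
);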
The above should really throw an error, as it's a weird combination that can result in unpredictable behavior.
Fair enough, but please ignore that part for now. The problem is reproducible without adding destination. See the edited code in the issue description.
Remove the process.exit() (which is doing what it is supposed to do; see the docs) and you'll see that this script actually just hangs after printing the massive array. I suspect some sort of garbage collection issue.
Sure, the array is a bit large (just to make this example fail consistently), but I'm unsure where the cutoff point is in practice. I was able to reproduce this on Node 20.0.0 as well. So if I understand correctly, tweaking the GC settings may make this work, and it's not the library that's at fault? (In other words, there's nothing pino can do to fix it other than warn users not to log egregiously large objects?)
Edit: That said, pino without pino-pretty works fine for the above example.
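For reference, the no-transport variant presumably looks like the sketch below (the exact edited code isn't shown in the thread):

import { pino } from 'pino';

// Same repro, but logging plain NDJSON to stdout with no transport and no
// pino-pretty; per the comment above, 'end' does get printed before
// process.exit(0) in this setup.
const logger = pino();

const obj = { lorem: "ipsum" };
// create an array with obj clones
const arr = Array.from({ length: 999999 }, () => obj);

logger.info('start', pino.version);
logger.info(arr);
logger.info('end');
process.exit(0);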
I put forth a WAG (a wild guess). Someone will need to put in some actual research to determine the cause of the problem.
The following would work:
import { pino, transport } from 'pino';

const stream = transport({
  target: 'pino-pretty',
  options: {
    sync: true,
  },
});

const logger = pino(stream);

const obj = { lorem: "ipsum" };
// create an array with obj clones
const arr = Array.from({ length: 999999 }, () => obj);

logger.info('start', pino.version);
logger.info(arr);
logger.info('end');
process.exit(0);
I can see there is a race condition in pino itself at https://github.com/pinojs/pino/blob/master/lib/transport.js#L53. What we should be doing there instead is to synchronously wait for the worker to have processed all the logs. At this time I'm not affected by that bug and I won't be able to fix it anytime soon.
I see, thanks! Should I file this issue in pino instead? I can work around it with sync or sleep as well, of course, so it's fine for now.
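A minimal sketch of the sleep-style workaround mentioned above, assuming a fixed delay gives the transport's worker thread enough time to drain; the two-second value is an arbitrary guess, not a guarantee:

import { pino } from 'pino';
import { setTimeout as sleep } from 'node:timers/promises';

const logger = pino({ transport: { target: 'pino-pretty' } });

logger.info('start');
logger.info('end');

// Hypothetical workaround: pause before exiting so the worker thread running
// pino-pretty has a chance to process the queued log entries.
await sleep(2000);
process.exit(0);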
I've moved the issue.