Create depth limit option for the log line
As titled.
Previous related issues: #990, #1120.
I am working on this. If I read it correctly, this code
const pino = require("pino")({ maxDepth: 3 });
const o = { a: { b: { c: { d: { e: { f: 'foobar' } } } } } };
pino.info(o);
would produce something like this: {"a":{"b":{"c":"[...]"}}}
I am currently testing with this code:
const pino = require("pino")();
const nested = {};
const MAX_DEPTH = 10 * 784;
let currentNestedObject = null;
// Build an object nested MAX_DEPTH levels deep: { nest_0: { nest_1: { ... } } }
for (let i = 0; i < MAX_DEPTH; i++) {
  const k = 'nest_' + i;
  if (!currentNestedObject) {
    currentNestedObject = nested;
  }
  currentNestedObject[k] = {};
  currentNestedObject = currentNestedObject[k];
}
pino.info(nested);
This is very much an edge case: a deeply nested object shaped like this: { nest_0: { nest_1: { nest_2: {...} } } }
We immediately reach this line, which calls JSON.stringify, and JSON.stringify does not support a max-depth feature.
In fact, with a MAX_DEPTH that large, it raises RangeError: Maximum call stack size exceeded.
The same error is raised by the next catch block, which tries json-stringify-safe.
At this point the only way I see is to walk the object and remove all keys at nesting level > MAX_DEPTH before passing it to the write function; a sketch of what I mean follows below.
Is this the correct approach?
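Roughly what I have in mind (truncateDepth is a hypothetical helper I am sketching here, not existing pino code):

// Hypothetical helper: copy `value`, replacing anything nested deeper
// than `maxDepth` with a '[...]' placeholder. Recursion is bounded by
// maxDepth, so it cannot blow the stack the way JSON.stringify does.
function truncateDepth (value, maxDepth, depth = 0) {
  if (value === null || typeof value !== 'object') return value;
  if (depth >= maxDepth) return '[...]';
  const out = Array.isArray(value) ? [] : {};
  for (const key of Object.keys(value)) {
    out[key] = truncateDepth(value[key], maxDepth, depth + 1);
  }
  return out;
}

// truncateDepth({ a: { b: { c: { d: 1 } } } }, 3)
// => { a: { b: { c: '[...]' } } }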
Not really. Essentially we should embed json-stringify-safe and implement this feature there.
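Something along these lines, perhaps: a minimal sketch that combines json-stringify-safe-style circular-reference handling with a depth cap (illustrative only, not the actual implementation):

function safeStringify (obj, maxDepth = 5) {
  const seen = new WeakSet();
  function walk (value, depth) {
    if (value === null || typeof value !== 'object') return value;
    if (seen.has(value)) return '[Circular]'; // what json-stringify-safe already does
    if (depth >= maxDepth) return '[...]';    // the new depth-limit behaviour
    seen.add(value);
    const out = Array.isArray(value) ? [] : {};
    for (const key of Object.keys(value)) {
      out[key] = walk(value[key], depth + 1);
    }
    seen.delete(value);
    return out;
  }
  return JSON.stringify(walk(obj, 0));
}

// safeStringify({ a: { b: { c: { d: 1 } } } }, 3)
// => '{"a":{"b":{"c":"[...]"}}}'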
I've had the same issue, though mostly around the size of the log and how much money it translates into. I've been using https://github.com/runk/dtrim for quite some time; it covers all the scenarios where you'd expect huge amounts of data to be logged: arrays, nested objects, buffers, big strings. It would be nice to see it used in pino, or to have similar functionality for truncating lengthy structures.
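For illustration, a trimmer like that could be wired into pino through its formatters.log hook (a real pino option); the dtrim usage below follows its README's trimmer factory, but treat the exact export and option names as assumptions:

const { trimmer } = require('dtrim'); // factory export, per dtrim's README (assumption)
const trim = trimmer({ depth: 4 });   // cap nesting at 4 levels (option name assumed)

const pino = require('pino')({
  formatters: {
    // pino passes the merged log object here before serialization
    log (obj) {
      return trim(obj);
    }
  }
});

pino.info({ a: { b: { c: { d: { e: 'trimmed beyond depth 4' } } } } });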
@mcollina did #1169 resolve this?
Yes, good catch!
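For reference, pino's API docs now list depthLimit and edgeLimit options (backed by safe-stable-stringify), which appears to be the shape this feature took:

// Assuming the depthLimit/edgeLimit options documented in pino's API:
const pino = require('pino')({
  depthLimit: 5,  // stop serializing below 5 levels of nesting
  edgeLimit: 100  // cap the number of properties/elements per level
});

const o = { a: { b: { c: { d: { e: { f: 'foobar' } } } } } };
pino.info(o); // nesting deeper than depthLimit is truncated in the output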
This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.