bunyan-cloudwatch
Chunk logs to avoid hitting the 1 MB limit in PutLogEvents
From the docs, http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/CloudWatchLogs.html#putLogEvents-property
The maximum batch size is 1,048,576 bytes, and this size is calculated as the sum of all event messages in UTF-8, plus 26 bytes for each log event.
It would be nice to limit/chunk the amount of data uploaded, since a request over the limit returns an error and crashes Node.js.
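Something along these lines could work for the chunking (just a sketch with made-up helper names; the 1,048,576-byte and 10,000-event maximums are the documented PutLogEvents limits):

const MAX_BATCH_BYTES = 1048576;  // documented PutLogEvents byte limit
const MAX_BATCH_EVENTS = 10000;   // documented PutLogEvents event-count limit
const PER_EVENT_OVERHEAD = 26;    // bytes counted per log event on top of the message

// Hypothetical helper: split an array of { message, timestamp } log events
// into batches that stay under both limits.
function chunkLogEvents(logEvents) {
  const batches = [];
  let batch = [];
  let batchBytes = 0;

  for (const event of logEvents) {
    const size = Buffer.byteLength(event.message, 'utf8') + PER_EVENT_OVERHEAD;
    const tooBig = batchBytes + size > MAX_BATCH_BYTES;
    const tooMany = batch.length >= MAX_BATCH_EVENTS;
    if (batch.length > 0 && (tooBig || tooMany)) {
      batches.push(batch);
      batch = [];
      batchBytes = 0;
    }
    batch.push(event);
    batchBytes += size;
  }
  if (batch.length > 0) batches.push(batch);
  return batches;
}

Each batch could then be sent as its own putLogEvents call, in order.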
For reference, here is how I handle it in cwlogs-writable, which additionally handles the maximum of 10,000 log events per PutLogEvents call: https://github.com/amekkawi/cwlogs-writable/blob/38ed37ab16aca9c249fa5747e574363508afc5d7/lib/index.js#L420
The AWS docs are accurate but could be a little more specific about what they mean by "sum of all event messages". I tested this out and it means just the "message" property of each log event, plus 26 bytes.
So you can ignore all other JSON that makes up the PutLogEvents call, including the "timestamp".
Note: I notice that I'm simply using the string length in getMessageSize(). I'll probably change that to multiply the length by 4 (the maximum bytes per UTF-8 character), since measuring the exact bytes is likely to be expensive.
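For illustration, the two approaches look roughly like this (sketch only; the function names are made up):

// Exact measurement: counts the actual UTF-8 bytes of the message.
function exactEventSize(message) {
  return Buffer.byteLength(message, 'utf8') + 26;
}

// Cheap upper bound: assume the worst case of 4 bytes per character, so the
// estimate can overshoot but a batch built from it never exceeds the limit.
function approxEventSize(message) {
  return message.length * 4 + 26;
}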
I can confirm this bug is there.
I sometimes reach the 10,000-message limit, which leads to an error:
at 'logEvents' failed to satisfy constraint: Member must have length less than or equal to 10000
Just ran into this bug in a production environment 🤕 as well. Does anybody have a workaround or a pull request?
at 'logEvents' failed to satisfy constraint: Member must have length less than or equal to 10000
Just in case anybody else runs into the process crashing when this error happens, I was able to prevent it by properly defining an onError function on the CloudWatchStream after it's created.
const stream = createCWStream(opts);
// Override the default onError, which throws and crashes the process.
stream.onError = function (err) {
  console.log(err);
};
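For anyone who wants the fuller picture, here is a sketch of how that workaround fits into a bunyan logger. The option names (logGroupName, logStreamName, cloudWatchLogsOptions) and the type: 'raw' stream type follow my reading of the README, so double-check them against the version you have installed:

const bunyan = require('bunyan');
const createCWStream = require('bunyan-cloudwatch');

const stream = createCWStream({
  // Option names assumed from the README; adjust for your environment.
  logGroupName: 'my-log-group',
  logStreamName: 'my-log-stream',
  cloudWatchLogsOptions: { region: 'us-east-1' }
});

// Replace the default error handler so a failed PutLogEvents call is logged
// instead of crashing the process.
stream.onError = function (err) {
  console.error(err);
};

const log = bunyan.createLogger({
  name: 'my-app',
  streams: [{ stream: stream, type: 'raw' }]
});

log.info('logging to CloudWatch without crashing on batch errors');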