s3-streamlogger
Suggestion to add a potential memory leak warning to the readme
S3StreamLogger keeps logs in this.buffers until it needs to create a new file, so setting a large rotate_every and max_file_size can lead to a nasty memory leak.
A warning should be added to the README, especially since the following two things suggest this is not a possibility unless you dig into the package's code:
- The package says that it implements a stream, which usually implies data does not stay in memory after it has been sent.
- The options rotate_every and max_file_size appear to refer only to how files are rotated in S3.
In order to update an existing file in S3, this module re-uploads the entire file. Due to the way S3 works, it is not practical to do anything else. For this reason you should use a relatively small maximum file size.
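To illustrate, a minimal sketch of a conservative configuration is below. The bucket name and winston wiring are placeholders, the option names are as I recall them from the README, and the specific values are only examples of keeping the in-memory buffer small, not recommendations:

```js
// Sketch: keep rotate_every and max_file_size small so the in-memory
// buffer backing the current file stays bounded between rotations.
const winston = require('winston');
const { S3StreamLogger } = require('s3-streamlogger');

const s3stream = new S3StreamLogger({
  bucket: 'my-log-bucket',                          // placeholder bucket name
  access_key_id: process.env.AWS_ACCESS_KEY_ID,     // or rely on the default AWS credential chain
  secret_access_key: process.env.AWS_SECRET_ACCESS_KEY,
  rotate_every: 60 * 60 * 1000,  // rotate at least every hour (ms)
  max_file_size: 5 * 1024 * 1024 // ...or once the file reaches ~5 MB (bytes)
});

// s3-streamlogger is a writable stream, so it can back winston's Stream transport.
const logger = winston.createLogger({
  transports: [new winston.transports.Stream({ stream: s3stream })]
});

logger.info('hello from s3-streamlogger');
```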
Once a file reaches its max_file_size or rotates in S3, are the logs cleared from this.buffer?
In other words, once a file rotates or reaches its maximum size, do the logs held by the application get cleared?
Yes, buffers are cleared when a file is rotated due to reaching maximum age or size.
Thank you!
Is there a possibility of losing logs with parallel execution?
@vks-dbb Yes, different servers/processes should write their logs to different files. The default name_format tries to ensure this.
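If you prefer to be explicit rather than relying on the default, a sketch like the one below should work, assuming name_format accepts strftime-style tokens as described in the README; the hostname/pid suffix is only an illustration of one way to keep parallel writers on separate files:

```js
// Sketch: give each host/process its own file name so parallel writers
// never re-upload over each other's logs.
const os = require('os');
const { S3StreamLogger } = require('s3-streamlogger');

const s3stream = new S3StreamLogger({
  bucket: 'my-log-bucket', // placeholder bucket name
  // The strftime tokens (%Y, %m, ...) are expanded by s3-streamlogger;
  // the hostname and pid are baked in when this string is built.
  name_format: `%Y-%m-%d-%H-%M-%S-${os.hostname()}-${process.pid}.log`
});
```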