amazon-kinesis-client-python
Running in containers causes problems due to stdout conflicts
Hi there,
I'm running this KCL in a Docker container. Because my code writes its log entries to stdout (to make Docker log aggregation easier), they conflict with the KCL's own entries, producing spurious errors and, in some cases, crashing the KCL.
I'll get output like this coming through:
SEVERE: Received error line from subprocess [[2017-09-13 02:16:10.104][scheduler][INFO]: Deduplicated stream items - original length: 1, filtered length: 1] for shard shardId-000000000000
Is there any way around this other than resorting to logging to separate files? Ideally I could configure the KCL to use a custom Unix socket (similar to how php-fpm or nginx-uwsgi work) so that it doesn't conflict with services that log to stdout / stderr.
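For anyone hitting this in the meantime, the separate-file workaround mentioned above can be sketched as follows. This is a minimal example, not KCL-specific API: the logger name, format string, and log path are all illustrative.

```python
import logging
import os
import tempfile

# Route application logs to a file instead of stdout, so they cannot
# collide with the KCL daemon's use of the subprocess's stdout/stderr.
# The log path and logger name here are illustrative choices.
log_path = os.path.join(tempfile.gettempdir(), "consumer.log")

logger = logging.getLogger("scheduler")
logger.setLevel(logging.INFO)

handler = logging.FileHandler(log_path)
handler.setFormatter(logging.Formatter(
    "[%(asctime)s][%(name)s][%(levelname)s]: %(message)s"))
logger.addHandler(handler)

logger.info("Deduplicated stream items - original length: 1, filtered length: 1")
handler.flush()
```

A sidecar or log shipper can then tail that file, at the cost of losing the plain `docker logs` aggregation the stdout approach was meant to provide.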
I'm encountering a different but related issue caused by the way stdout/stderr is used. In my case I want to use Datadog to ingest logs, but it watches docker logs, which is mostly "Sleeping..." messages.
Is there any update on this? We're running into a similar issue in that we want our Node.js logs to be sent to Sumo Logic.
I'm wondering if there are any updates regarding this. I'm running into it myself as well. @ababushkin did you find a way around it?
Using sys.stdout.write instead of logger.info solved the above problem for me.
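A minimal sketch of what that comment describes, for anyone trying it: bypass the logging module and write the line directly. The `log_line` helper and the message are illustrative, not part of the KCL API, and whether this avoids the conflict will depend on how your KCL daemon consumes the subprocess's stdout.

```python
import sys

def log_line(message: str) -> None:
    # Write directly to stdout instead of going through logging.Logger,
    # as the comment above suggests. Illustrative helper, not KCL API.
    sys.stdout.write(message + "\n")
    sys.stdout.flush()

log_line("Deduplicated stream items - original length: 1, filtered length: 1")
```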