amazon-kinesis-connectors
Support write to multiple S3 files/Kinesis table
It seems very logical to allow a connector application to emit to multiple files.
I think the only way to implement this currently is to run multiple connectors that read from the same stream and each emit to a different file/table. A Stack Overflow question for the same problem has been open for the past 6 months without any answers: https://stackoverflow.com/questions/26108368/can-i-use-amazon-kinesis-connectors-to-send-a-stream-to-two-destinations-two-em
+1 for this question. Just like Flume does it, we need to write to a destination based on headers or some condition.
+1 for this question too. For now I need to launch multiple instances to work around this.
@sfai05: the way I handled this problem was to write a CompositeS3Emitter that initializes multiple S3Emitters and takes a custom object holding the data (i.e. byte[]) plus metadata about the data, then emits to the respective S3Emitter based on that metadata.
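For reference, here is a minimal sketch of that routing idea. It builds on the `IEmitter` and `UnmodifiableBuffer` types from this library, but the `RoutedRecord` wrapper, the constructor parameters, and the prefix-based key scheme are illustrative assumptions; unlike the CompositeS3Emitter described above, this sketch writes each group directly with the AmazonS3 client rather than delegating to pre-built S3Emitter instances.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import com.amazonaws.services.kinesis.connectors.UnmodifiableBuffer;
import com.amazonaws.services.kinesis.connectors.interfaces.IEmitter;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.ObjectMetadata;

/** Hypothetical record wrapper: the payload plus metadata used for routing. */
class RoutedRecord {
    final byte[] data;
    final String destination; // e.g. a key prefix derived from the record's metadata

    RoutedRecord(byte[] data, String destination) {
        this.data = data;
        this.destination = destination;
    }
}

/** Emits each group of buffered records to the S3 location named by its metadata. */
public class CompositeS3Emitter implements IEmitter<RoutedRecord> {
    private final AmazonS3 s3;
    private final String bucket;

    public CompositeS3Emitter(AmazonS3 s3, String bucket) {
        this.s3 = s3;
        this.bucket = bucket;
    }

    @Override
    public List<RoutedRecord> emit(UnmodifiableBuffer<RoutedRecord> buffer) throws IOException {
        // Group the buffered records by their routing metadata.
        Map<String, ByteArrayOutputStream> groups = new HashMap<String, ByteArrayOutputStream>();
        for (RoutedRecord record : buffer.getRecords()) {
            ByteArrayOutputStream out = groups.get(record.destination);
            if (out == null) {
                out = new ByteArrayOutputStream();
                groups.put(record.destination, out);
            }
            out.write(record.data);
        }
        // Write one S3 object per destination, named by the buffer's sequence-number range.
        for (Map.Entry<String, ByteArrayOutputStream> entry : groups.entrySet()) {
            byte[] body = entry.getValue().toByteArray();
            String key = entry.getKey() + "/"
                    + buffer.getFirstSequenceNumber() + "-" + buffer.getLastSequenceNumber();
            ObjectMetadata meta = new ObjectMetadata();
            meta.setContentLength(body.length);
            s3.putObject(bucket, key, new ByteArrayInputStream(body), meta);
        }
        // An empty return list tells the connector every record was emitted successfully.
        return new ArrayList<RoutedRecord>();
    }

    @Override
    public void fail(List<RoutedRecord>  records) {
        // Log or otherwise handle records that could not be emitted.
    }

    @Override
    public void shutdown() {
        // Nothing to clean up in this sketch.
    }
}
```

A transformer in the pipeline would be responsible for turning each Kinesis record into a RoutedRecord, i.e. extracting the payload and deciding the destination there, so the emitter itself stays a pure fan-out step.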