[RFE] Support Copying and Splitting the log stream
Copying: the ability to copy logs to multiple destinations. For example, I want to send a copy of my records both to the viaq elasticsearch and to some other destination such as splunk/kafka.
Splitting: the ability to split my log stream into subsets and send each subset to a different destination. For example, I want to send logs from the audit subsystem to a super-secret elasticsearch, send the other, non-security-related logs to the regular elasticsearch, and also send a copy to some other destination such as splunk/kafka.
That is, I should be able to Copy and Split at the same time.
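For illustration, here is a minimal rsyslog sketch of what copying and splitting at the same time could look like. The hostnames, index names, and the authpriv filter are placeholders rather than the real viaq configuration, and the inputs (imjournal, imfile, ...) are assumed to be configured elsewhere:

```
module(load="omelasticsearch")

if $syslogfacility-text == "authpriv" then {
    # "split": security-related records go only to the secret cluster
    action(type="omelasticsearch" server="secret-es.example.com" searchIndex="audit")
} else {
    # everything else goes to the regular cluster ...
    action(type="omelasticsearch" server="logging-es.example.com" searchIndex="logs")
    # ... and a "copy" of the same records is forwarded to another destination
    action(type="omfwd" target="other-destination.example.com" port="514" protocol="tcp")
}
```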
We are missing a separation between inputs such as viaq, viaq-k8s, ovirt, example, etc. and the outputs for them: viaq-elasticsearch, elasticsearch, local, remote-rsyslog, etc.
We currently assume that viaq and viaq-k8s should be sent to elasticsearch, and that example should be sent to local files or to remote forward (I am not sure whether that is limited to rsyslog or can be any destination reachable over tcp/udp), but that is a lock-in.
We should structure the packages and configuration files around the inputs and the outputs.
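One possible shape for that separation at the rsyslog level (not how the role is laid out today): every output becomes a named ruleset, every input is bound to an input ruleset, and the only coupling between them is an explicit call. All names below (output_viaq_es, output_remote_rsyslog, input_example) are made up for illustration:

```
module(load="imfile")
module(load="omelasticsearch")

ruleset(name="output_viaq_es") {
    action(type="omelasticsearch" server="logging-es.example.com")
}
ruleset(name="output_remote_rsyslog") {
    action(type="omfwd" target="remote-rsyslog.example.com" port="514" protocol="tcp")
}

ruleset(name="input_example") {
    # the only place where an input is tied to its outputs
    call output_viaq_es
    call output_remote_rsyslog
}

input(type="imfile" File="/var/log/example.log" Tag="example:" ruleset="input_example")
```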
The changes should be to the following variables: rsyslog_example, rsyslog_viaq, rsyslog_capabilities.
@nhosoi @richm
For example, how would you configure these variables if you wanted to send viaq-collected data to a remote rsyslog?
Support for splitting the logs was partially added in PR #27. We can now use output_name to tag the collected logs and filter them to the different outputs. We need to implement this for the viaq use case. I am not sure whether setting a new "output_name" variable right after the "imjournal" action is the correct way to add the tag to records collected from the journal. @richm @nhosoi, I have also asked about this in issue #42.
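To make the question concrete, here is a rough sketch of the tagging idea, assuming the tag is set as a message property for everything that arrives through the default ruleset right after imjournal collects it. The variable name and value are illustrative, not what PR #27 actually generates:

```
module(load="imjournal" StateFile="imjournal.state")
module(load="omelasticsearch")

# runs for every message in the default ruleset, i.e. right after collection
set $!output_name = "viaq-elasticsearch";

# later, route on the tag
if $!output_name == "viaq-elasticsearch" then {
    action(type="omelasticsearch" server="logging-es.example.com")
}
```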
We still need to handle multiplying the data to multiple destinations when needed.
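Continuing the sketch above, copying would simply be multiple actions inside the same branch, since each action gets its own copy of every record that reaches the block (the destinations are placeholders):

```
if $!output_name == "viaq-elasticsearch" then {
    action(type="omelasticsearch" server="logging-es.example.com")
    # a second copy forwarded to a remote collector (splunk/kafka bridge, etc.)
    action(type="omfwd" target="remote-collector.example.com" port="514" protocol="tcp")
}
```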