data-prepper
TCP source plugin
I am in charge of collecting application logs. I have Java applications whose logging config is written in logback.xml. The part of logback.xml that sends logs to Logstash looks like this:
<appender name="STASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
  <destination>node1:port</destination>
  <destination>node2:port</destination>
  <destination>node3:port</destination>
  <ssl>
    <trustStore>
      <location>file:/xxx/logstash.truststore</location>
      <password>pw</password>
    </trustStore>
  </ssl>
  <!-- encoder is required -->
  <encoder class="net.logstash.logback.encoder.LogstashEncoder">
    <customFields>{"...."}</customFields>
  </encoder>
</appender>
With these settings on the Java app server, the application sends data to Logstash, which runs as the server with the TCP input plugin.
Could you add this feature to Data Prepper, so I can use Data Prepper instead of Logstash? I don't know of another way to transmit logs from my app machine.
I am looking for an alternative to the config I currently have in Logstash OSS with the OpenSearch output plugin:
input {
  tcp {
    mode => "server"
    host => "IP"
    port => "port"
    ssl_enable => "true"
    ssl_cert => "crt"
    ssl_key => "key"
    ssl_key_passphrase => "PW"
    ssl_verify => "false"
    ssl_cipher_suites => ['TLS_AES_256_GCM_SHA384', 'TLS_AES_128_GCM_SHA256', 'TLS_CHACHA20_POLY1305_SHA256', 'TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384', 'TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384', 'TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256', 'TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256', 'TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256', 'TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256', 'TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384', 'TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384', 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256', 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256']
    ssl_supported_protocols => ['TLSv1.2', 'TLSv1.3']
    codec => "json_lines"
    tags => "ssl_TCPinput"
  }
}
filter {
  if [LogType] == "TrxPersist" {
    mutate { add_tag => "trx_log" }
  }
  else if [LogType] == "TrxPostProc" {
    mutate { add_tag => "trx_time" }
  }
  if [appname] == "INT_EDDIE" {
    mutate { add_field => { "[@metadata][target_index]" => "eddie-int" } }
  }
}
output {
  if [enviroment] == "integration" {
    if [appname] == "INT_EDDIE" {
      opensearch {
        hosts => ["IP:9200"]
        ssl => true
        ssl_certificate_verification => false
        user => "user"
        password => "pw"
        index => "%{[@metadata][target_index]}-temporary-%{+YYYY-MM-dd}"
        manage_template => false
      }
    }
    else {
      opensearch {
        hosts => ["IP:9200"]
        ssl => true
        ssl_certificate_verification => false
        user => "user"
        password => "pw"
        index => "trash-int-%{+YYYY.MM.dd}"
        manage_template => false
      }
    }
  }
  else {
    opensearch {
      hosts => ["IP:9200"]
      ssl => true
      ssl_certificate_verification => false
      user => "user"
      password => "pw"
      index => "trash"
      manage_template => false
    }
  }
}
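For reference, my rough guess at how the filter/output logic above could be expressed in Data Prepper once a tcp source exists: the add_tag/add_field mutates would become add_entries processors, and the if/else output blocks would become conditional routes on opensearch sinks. The sketch below follows the documented pipeline YAML except for the tcp source block, which is hypothetical; hosts, ports, index names, and credentials are placeholders.

log-pipeline:
  source:
    tcp:                                  # hypothetical - this is the source being requested
      port: 5514
      ssl: true
      ssl_certificate_file: "/path/to/crt"
      ssl_key_file: "/path/to/key"
  route:
    - eddie-int: '/enviroment == "integration" and /appname == "INT_EDDIE"'
    - trash-int: '/enviroment == "integration" and /appname != "INT_EDDIE"'
    - trash: '/enviroment != "integration"'
  sink:
    - opensearch:
        hosts: ["https://IP:9200"]
        insecure: true                    # equivalent of ssl_certificate_verification => false
        username: "user"
        password: "pw"
        index: "eddie-int-temporary-%{yyyy-MM-dd}"
        routes: [eddie-int]
    - opensearch:
        hosts: ["https://IP:9200"]
        insecure: true
        username: "user"
        password: "pw"
        index: "trash-int-%{yyyy.MM.dd}"
        routes: [trash-int]
    - opensearch:
        hosts: ["https://IP:9200"]
        insecure: true
        username: "user"
        password: "pw"
        index: "trash"
        routes: [trash]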
I had a conversation on the forum before filing this report: https://forum.opensearch.org/t/logstash-conf-converter-to-data-prepper/12082/3
Hopping over here from the forum. Could you use the LogstashUDPSocketAppender instead?
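For context, the UDP appender in logstash-logback-encoder is configured roughly like the snippet below; note that it has no SSL/trustStore options, and depending on the library version the appender class is named LogstashUdpSocketAppender or LogstashSocketAppender (host and port are placeholders):

<appender name="STASH_UDP" class="net.logstash.logback.appender.LogstashUdpSocketAppender">
  <!-- JSON events are sent as UDP datagrams; there is no SSL support on this appender -->
  <host>dataprepper-host</host>
  <port>5514</port>
</appender>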
I will ask the application team. Can Data Prepper work with a UDP source?
https://github.com/opensearch-project/data-prepper/issues/2074 suggests UDP isn't ready yet, I think. What I need is:
source:
  UDP/TCP:
    port:
    codec:
      json_lines
This isn't even on the roadmap after half a year.
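Spelled out as a full pipeline, what I am asking for would look something like this; the tcp source is hypothetical, while parse_json, the stdout sink, and the codec/newline shape (borrowed from the existing s3 source) are already in Data Prepper:

raw-log-pipeline:
  source:
    tcp:                 # hypothetical - does not exist in Data Prepper today
      port: 5514
      codec:
        newline:         # one event per line, like Logstash's json_lines
  processor:
    - parse_json:        # existing processor: parses each line into structured fields
  sink:
    - stdout: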
If there were TCP/UDP source plugins, would they support syslog protocol as well?
+1 for Syslog Source
+1 for Syslog Source