logstash-filter-csv
Issue with Logstash Processing Updated CSV File
I am using Logstash to process logs from a CSV file and send them to Elasticsearch. Here is my Logstash configuration:

input {
  file {
    path => "E:/AuditTrail/AuditTrail0.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
    codec => plain { charset => "UTF-8" }
    mode => "tail"   # Use tail mode to read only new data
  }
}
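One detail I am unsure about: with sincedb_path => "NUL" the tracked byte offsets are discarded whenever Logstash restarts, so the whole file is re-read from the beginning each time. A variant I am considering uses a persistent sincedb so offsets survive restarts (the sincedb file path below is a hypothetical placeholder, not something I currently run):

input {
  file {
    path => "E:/AuditTrail/AuditTrail0.csv"
    start_position => "beginning"
    mode => "tail"
    codec => plain { charset => "UTF-8" }
    # Persist read offsets across restarts instead of discarding them ("NUL")
    sincedb_path => "E:/AuditTrail/sincedb_audit"   # hypothetical path
  }
}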
filter {
  csv {
    separator => ","
    columns => ["RecordID", "TimeStamp", "DeltaToUTC", "UserID", "ObjectID", "Description", "Comment", "Checksum"]
    skip_empty_columns => true
    skip_header => true
  }
  date {
    match => ["TimeStamp", "MM/dd/yyyy hh:mm:ss a"]
    target => "@timestamp"
  }
}

output {
  elasticsearch {
    hosts => ["http://10.6.1.114:9200"]
    index => "audit_trail_logs"
    document_type => "_doc"
  }
  stdout { codec => rubydebug }
}
The sample output looks like this:

{
        "message" => "80,\"7/8/2024 2:02:54 PM\",\"-7:00\",\"System\",\"User administration\",\"User logged off.\",,ikay85\r",
       "@version" => "1",
       "ObjectID" => "User administration",
           "host" => {
        "name" => "DESKTOP-KE828UP"
    },
     "@timestamp" => 2024-07-08T07:02:54.000Z,
       "RecordID" => "80",
       "Checksum" => "ikay85",
     "DeltaToUTC" => "-7:00",
            "log" => {
        "file" => {
            "path" => "E:/AuditTrail/AuditTrail0.csv"
        }
    },
         "UserID" => "System",
          "event" => {
        "original" => "80,\"7/8/2024 2:02:54 PM\",\"-7:00\",\"System\",\"User administration\",\"User logged off.\",,ikay85\r"
    },
    "Description" => "User logged off.",
      "TimeStamp" => "7/8/2024 2:02:54 PM"
}
However, when I update the CSV file and add new lines, the resulting events are tagged _csvparsefailure, and the message field begins mid-record ("ystem" instead of a full line):

{
       "message" => "ystem\",\"User administration\",\"User 'Admin' has failed to log on successfully 2 times.\",,DtOX3A\r",
           "log" => {
        "file" => {
            "path" => "E:/AuditTrail/AuditTrail0.csv"
        }
    },
         "event" => {
        "original" => "ystem\",\"User administration\",\"User 'Admin' has failed to log on successfully 2 times.\",,DtOX3A\r"
    },
          "tags" => [
        "_csvparsefailure"
    ],
          "host" => {
        "name" => "DESKTOP-KE828UP"
    },
      "@version" => "1",
    "@timestamp" => 2024-07-08T09:28:48.433441600Z
}
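Since the failed rows carry the _csvparsefailure tag, one way to keep them out of the main index while I investigate (a sketch, not my current config; the failure index name is a made-up placeholder) would be to branch on the tag in the output:

output {
  if "_csvparsefailure" in [tags] {
    # Route unparsed partial lines to a separate index for inspection
    elasticsearch {
      hosts => ["http://10.6.1.114:9200"]
      index => "audit_trail_failures"   # hypothetical index name
    }
  } else {
    elasticsearch {
      hosts => ["http://10.6.1.114:9200"]
      index => "audit_trail_logs"
      document_type => "_doc"
    }
  }
  stdout { codec => rubydebug }
}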
What could be causing this issue and how should I fix it?